Jan 21 22:43:49 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 21 22:43:49 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 21 22:43:49 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 22:43:49 localhost kernel: BIOS-provided physical RAM map:
Jan 21 22:43:49 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 21 22:43:49 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 21 22:43:49 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 21 22:43:49 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 21 22:43:49 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 21 22:43:49 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 21 22:43:49 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 21 22:43:49 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 21 22:43:49 localhost kernel: NX (Execute Disable) protection: active
Jan 21 22:43:49 localhost kernel: APIC: Static calls initialized
Jan 21 22:43:49 localhost kernel: SMBIOS 2.8 present.
Jan 21 22:43:49 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 21 22:43:49 localhost kernel: Hypervisor detected: KVM
Jan 21 22:43:49 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 21 22:43:49 localhost kernel: kvm-clock: using sched offset of 3193536893 cycles
Jan 21 22:43:49 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 21 22:43:49 localhost kernel: tsc: Detected 2800.000 MHz processor
Jan 21 22:43:49 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 21 22:43:49 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 21 22:43:49 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 21 22:43:49 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 21 22:43:49 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 21 22:43:49 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 21 22:43:49 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 21 22:43:49 localhost kernel: Using GB pages for direct mapping
Jan 21 22:43:49 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 21 22:43:49 localhost kernel: ACPI: Early table checksum verification disabled
Jan 21 22:43:49 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 21 22:43:49 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 22:43:49 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 22:43:49 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 22:43:49 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 21 22:43:49 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 22:43:49 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 22:43:49 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 21 22:43:49 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 21 22:43:49 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 21 22:43:49 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 21 22:43:49 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 21 22:43:49 localhost kernel: No NUMA configuration found
Jan 21 22:43:49 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 21 22:43:49 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 21 22:43:49 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 21 22:43:49 localhost kernel: Zone ranges:
Jan 21 22:43:49 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 21 22:43:49 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 21 22:43:49 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 21 22:43:49 localhost kernel:   Device   empty
Jan 21 22:43:49 localhost kernel: Movable zone start for each node
Jan 21 22:43:49 localhost kernel: Early memory node ranges
Jan 21 22:43:49 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 21 22:43:49 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 21 22:43:49 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 21 22:43:49 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 21 22:43:49 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 21 22:43:49 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 21 22:43:49 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 21 22:43:49 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 21 22:43:49 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 21 22:43:49 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 21 22:43:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 21 22:43:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 21 22:43:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 21 22:43:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 21 22:43:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 21 22:43:49 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 21 22:43:49 localhost kernel: TSC deadline timer available
Jan 21 22:43:49 localhost kernel: CPU topo: Max. logical packages:   8
Jan 21 22:43:49 localhost kernel: CPU topo: Max. logical dies:       8
Jan 21 22:43:49 localhost kernel: CPU topo: Max. dies per package:   1
Jan 21 22:43:49 localhost kernel: CPU topo: Max. threads per core:   1
Jan 21 22:43:49 localhost kernel: CPU topo: Num. cores per package:     1
Jan 21 22:43:49 localhost kernel: CPU topo: Num. threads per package:   1
Jan 21 22:43:49 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 21 22:43:49 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 21 22:43:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 21 22:43:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 21 22:43:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 21 22:43:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 21 22:43:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 21 22:43:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 21 22:43:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 21 22:43:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 21 22:43:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 21 22:43:49 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 21 22:43:49 localhost kernel: Booting paravirtualized kernel on KVM
Jan 21 22:43:49 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 21 22:43:49 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 21 22:43:49 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 21 22:43:49 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 21 22:43:49 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 21 22:43:49 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 21 22:43:49 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 22:43:49 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 21 22:43:49 localhost kernel: random: crng init done
Jan 21 22:43:49 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 21 22:43:49 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 21 22:43:49 localhost kernel: Fallback order for Node 0: 0 
Jan 21 22:43:49 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 21 22:43:49 localhost kernel: Policy zone: Normal
Jan 21 22:43:49 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 21 22:43:49 localhost kernel: software IO TLB: area num 8.
Jan 21 22:43:49 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 21 22:43:49 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 21 22:43:49 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 21 22:43:49 localhost kernel: Dynamic Preempt: voluntary
Jan 21 22:43:49 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 21 22:43:49 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 21 22:43:49 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 21 22:43:49 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 21 22:43:49 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 21 22:43:49 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 21 22:43:49 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 21 22:43:49 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 21 22:43:49 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 22:43:49 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 22:43:49 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 22:43:49 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 21 22:43:49 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 21 22:43:49 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 21 22:43:49 localhost kernel: Console: colour VGA+ 80x25
Jan 21 22:43:49 localhost kernel: printk: console [ttyS0] enabled
Jan 21 22:43:49 localhost kernel: ACPI: Core revision 20230331
Jan 21 22:43:49 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 21 22:43:49 localhost kernel: x2apic enabled
Jan 21 22:43:49 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 21 22:43:49 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 21 22:43:49 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 21 22:43:49 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 21 22:43:49 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 21 22:43:49 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 21 22:43:49 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 21 22:43:49 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 21 22:43:49 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 21 22:43:49 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 21 22:43:49 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 21 22:43:49 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 21 22:43:49 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 21 22:43:49 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 21 22:43:49 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 21 22:43:49 localhost kernel: x86/bugs: return thunk changed
Jan 21 22:43:49 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 21 22:43:49 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 21 22:43:49 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 21 22:43:49 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 21 22:43:49 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 21 22:43:49 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 21 22:43:49 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 21 22:43:49 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 21 22:43:49 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 21 22:43:49 localhost kernel: landlock: Up and running.
Jan 21 22:43:49 localhost kernel: Yama: becoming mindful.
Jan 21 22:43:49 localhost kernel: SELinux:  Initializing.
Jan 21 22:43:49 localhost kernel: LSM support for eBPF active
Jan 21 22:43:49 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 21 22:43:49 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 21 22:43:49 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 21 22:43:49 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 21 22:43:49 localhost kernel: ... version:                0
Jan 21 22:43:49 localhost kernel: ... bit width:              48
Jan 21 22:43:49 localhost kernel: ... generic registers:      6
Jan 21 22:43:49 localhost kernel: ... value mask:             0000ffffffffffff
Jan 21 22:43:49 localhost kernel: ... max period:             00007fffffffffff
Jan 21 22:43:49 localhost kernel: ... fixed-purpose events:   0
Jan 21 22:43:49 localhost kernel: ... event mask:             000000000000003f
Jan 21 22:43:49 localhost kernel: signal: max sigframe size: 1776
Jan 21 22:43:49 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 21 22:43:49 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 21 22:43:49 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 21 22:43:49 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 21 22:43:49 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 21 22:43:49 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 21 22:43:49 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 21 22:43:49 localhost kernel: node 0 deferred pages initialised in 10ms
Jan 21 22:43:49 localhost kernel: Memory: 7763980K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618356K reserved, 0K cma-reserved)
Jan 21 22:43:49 localhost kernel: devtmpfs: initialized
Jan 21 22:43:49 localhost kernel: x86/mm: Memory block size: 128MB
Jan 21 22:43:49 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 21 22:43:49 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 21 22:43:49 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 21 22:43:49 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 21 22:43:49 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 21 22:43:49 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 21 22:43:49 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 21 22:43:49 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 21 22:43:49 localhost kernel: audit: type=2000 audit(1769035427.336:1): state=initialized audit_enabled=0 res=1
Jan 21 22:43:49 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 21 22:43:49 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 21 22:43:49 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 21 22:43:49 localhost kernel: cpuidle: using governor menu
Jan 21 22:43:49 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 21 22:43:49 localhost kernel: PCI: Using configuration type 1 for base access
Jan 21 22:43:49 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 21 22:43:49 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 21 22:43:49 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 21 22:43:49 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 21 22:43:49 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 21 22:43:49 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 21 22:43:49 localhost kernel: Demotion targets for Node 0: null
Jan 21 22:43:49 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 21 22:43:49 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 21 22:43:49 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 21 22:43:49 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 21 22:43:49 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 21 22:43:49 localhost kernel: ACPI: Interpreter enabled
Jan 21 22:43:49 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 21 22:43:49 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 21 22:43:49 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 21 22:43:49 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 21 22:43:49 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 21 22:43:49 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 21 22:43:49 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [3] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [4] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [5] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [6] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [7] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [8] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [9] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [10] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [11] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [12] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [13] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [14] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [15] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [16] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [17] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [18] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [19] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [20] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [21] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [22] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [23] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [24] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [25] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [26] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [27] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [28] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [29] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [30] registered
Jan 21 22:43:49 localhost kernel: acpiphp: Slot [31] registered
Jan 21 22:43:49 localhost kernel: PCI host bridge to bus 0000:00
Jan 21 22:43:49 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 21 22:43:49 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 21 22:43:49 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 21 22:43:49 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 21 22:43:49 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 21 22:43:49 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 21 22:43:49 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 21 22:43:49 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 21 22:43:49 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 21 22:43:49 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 21 22:43:49 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 21 22:43:49 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 21 22:43:49 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 21 22:43:49 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 21 22:43:49 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 21 22:43:49 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 21 22:43:49 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 21 22:43:49 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 21 22:43:49 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 21 22:43:49 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 21 22:43:49 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 21 22:43:49 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 21 22:43:49 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 21 22:43:49 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 21 22:43:49 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 21 22:43:49 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 21 22:43:49 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 21 22:43:49 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 21 22:43:49 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 21 22:43:49 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 21 22:43:49 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 21 22:43:49 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 21 22:43:49 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 21 22:43:49 localhost kernel: iommu: Default domain type: Translated
Jan 21 22:43:49 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 21 22:43:49 localhost kernel: SCSI subsystem initialized
Jan 21 22:43:49 localhost kernel: ACPI: bus type USB registered
Jan 21 22:43:49 localhost kernel: usbcore: registered new interface driver usbfs
Jan 21 22:43:49 localhost kernel: usbcore: registered new interface driver hub
Jan 21 22:43:49 localhost kernel: usbcore: registered new device driver usb
Jan 21 22:43:49 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 21 22:43:49 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 21 22:43:49 localhost kernel: PTP clock support registered
Jan 21 22:43:49 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 21 22:43:49 localhost kernel: NetLabel: Initializing
Jan 21 22:43:49 localhost kernel: NetLabel:  domain hash size = 128
Jan 21 22:43:49 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 21 22:43:49 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 21 22:43:49 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 21 22:43:49 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 21 22:43:49 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 21 22:43:49 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 21 22:43:49 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 21 22:43:49 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 21 22:43:49 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 21 22:43:49 localhost kernel: vgaarb: loaded
Jan 21 22:43:49 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 21 22:43:49 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 21 22:43:49 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 21 22:43:49 localhost kernel: pnp: PnP ACPI init
Jan 21 22:43:49 localhost kernel: pnp 00:03: [dma 2]
Jan 21 22:43:49 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 21 22:43:49 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 21 22:43:49 localhost kernel: NET: Registered PF_INET protocol family
Jan 21 22:43:49 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 21 22:43:49 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 21 22:43:49 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 21 22:43:49 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 21 22:43:49 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 21 22:43:49 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 21 22:43:49 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 21 22:43:49 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 21 22:43:49 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 21 22:43:49 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 21 22:43:49 localhost kernel: NET: Registered PF_XDP protocol family
Jan 21 22:43:49 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 21 22:43:49 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 21 22:43:49 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 21 22:43:49 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 21 22:43:49 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 21 22:43:49 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 21 22:43:49 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 21 22:43:49 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 112914 usecs
Jan 21 22:43:49 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 21 22:43:49 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 21 22:43:49 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 21 22:43:49 localhost kernel: ACPI: bus type thunderbolt registered
Jan 21 22:43:49 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 21 22:43:49 localhost kernel: Initialise system trusted keyrings
Jan 21 22:43:49 localhost kernel: Key type blacklist registered
Jan 21 22:43:49 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 21 22:43:49 localhost kernel: zbud: loaded
Jan 21 22:43:49 localhost kernel: integrity: Platform Keyring initialized
Jan 21 22:43:49 localhost kernel: integrity: Machine keyring initialized
Jan 21 22:43:49 localhost kernel: Freeing initrd memory: 87956K
Jan 21 22:43:49 localhost kernel: NET: Registered PF_ALG protocol family
Jan 21 22:43:49 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 21 22:43:49 localhost kernel: Key type asymmetric registered
Jan 21 22:43:49 localhost kernel: Asymmetric key parser 'x509' registered
Jan 21 22:43:49 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 21 22:43:49 localhost kernel: io scheduler mq-deadline registered
Jan 21 22:43:49 localhost kernel: io scheduler kyber registered
Jan 21 22:43:49 localhost kernel: io scheduler bfq registered
Jan 21 22:43:49 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 21 22:43:49 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 21 22:43:49 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 21 22:43:49 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 21 22:43:49 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 21 22:43:49 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 21 22:43:49 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 21 22:43:49 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 21 22:43:49 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 21 22:43:49 localhost kernel: Non-volatile memory driver v1.3
Jan 21 22:43:49 localhost kernel: rdac: device handler registered
Jan 21 22:43:49 localhost kernel: hp_sw: device handler registered
Jan 21 22:43:49 localhost kernel: emc: device handler registered
Jan 21 22:43:49 localhost kernel: alua: device handler registered
Jan 21 22:43:49 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 21 22:43:49 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 21 22:43:49 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 21 22:43:49 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 21 22:43:49 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 21 22:43:49 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 21 22:43:49 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 21 22:43:49 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 21 22:43:49 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 21 22:43:49 localhost kernel: hub 1-0:1.0: USB hub found
Jan 21 22:43:49 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 21 22:43:49 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 21 22:43:49 localhost kernel: usbserial: USB Serial support registered for generic
Jan 21 22:43:49 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 21 22:43:49 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 21 22:43:49 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 21 22:43:49 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 21 22:43:49 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 21 22:43:49 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 21 22:43:49 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 21 22:43:49 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 21 22:43:49 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 21 22:43:49 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-21T22:43:48 UTC (1769035428)
Jan 21 22:43:49 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 21 22:43:49 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 21 22:43:49 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 21 22:43:49 localhost kernel: usbcore: registered new interface driver usbhid
Jan 21 22:43:49 localhost kernel: usbhid: USB HID core driver
Jan 21 22:43:49 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 21 22:43:49 localhost kernel: Initializing XFRM netlink socket
Jan 21 22:43:49 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 21 22:43:49 localhost kernel: Segment Routing with IPv6
Jan 21 22:43:49 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 21 22:43:49 localhost kernel: mpls_gso: MPLS GSO support
Jan 21 22:43:49 localhost kernel: IPI shorthand broadcast: enabled
Jan 21 22:43:49 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 21 22:43:49 localhost kernel: AES CTR mode by8 optimization enabled
Jan 21 22:43:49 localhost kernel: sched_clock: Marking stable (1264048946, 145932568)->(1537293710, -127312196)
Jan 21 22:43:49 localhost kernel: registered taskstats version 1
Jan 21 22:43:49 localhost kernel: Loading compiled-in X.509 certificates
Jan 21 22:43:49 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 21 22:43:49 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 21 22:43:49 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 21 22:43:49 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 21 22:43:49 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 21 22:43:49 localhost kernel: Demotion targets for Node 0: null
Jan 21 22:43:49 localhost kernel: page_owner is disabled
Jan 21 22:43:49 localhost kernel: Key type .fscrypt registered
Jan 21 22:43:49 localhost kernel: Key type fscrypt-provisioning registered
Jan 21 22:43:49 localhost kernel: Key type big_key registered
Jan 21 22:43:49 localhost kernel: Key type encrypted registered
Jan 21 22:43:49 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 21 22:43:49 localhost kernel: Loading compiled-in module X.509 certificates
Jan 21 22:43:49 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 21 22:43:49 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 21 22:43:49 localhost kernel: ima: No architecture policies found
Jan 21 22:43:49 localhost kernel: evm: Initialising EVM extended attributes:
Jan 21 22:43:49 localhost kernel: evm: security.selinux
Jan 21 22:43:49 localhost kernel: evm: security.SMACK64 (disabled)
Jan 21 22:43:49 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 21 22:43:49 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 21 22:43:49 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 21 22:43:49 localhost kernel: evm: security.apparmor (disabled)
Jan 21 22:43:49 localhost kernel: evm: security.ima
Jan 21 22:43:49 localhost kernel: evm: security.capability
Jan 21 22:43:49 localhost kernel: evm: HMAC attrs: 0x1
Jan 21 22:43:49 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 21 22:43:49 localhost kernel: Running certificate verification RSA selftest
Jan 21 22:43:49 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 21 22:43:49 localhost kernel: Running certificate verification ECDSA selftest
Jan 21 22:43:49 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 21 22:43:49 localhost kernel: clk: Disabling unused clocks
Jan 21 22:43:49 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 21 22:43:49 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 21 22:43:49 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 21 22:43:49 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 21 22:43:49 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 21 22:43:49 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 21 22:43:49 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 21 22:43:49 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 21 22:43:49 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 21 22:43:49 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 21 22:43:49 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 21 22:43:49 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 21 22:43:49 localhost kernel: Run /init as init process
Jan 21 22:43:49 localhost kernel:   with arguments:
Jan 21 22:43:49 localhost kernel:     /init
Jan 21 22:43:49 localhost kernel:   with environment:
Jan 21 22:43:49 localhost kernel:     HOME=/
Jan 21 22:43:49 localhost kernel:     TERM=linux
Jan 21 22:43:49 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 21 22:43:49 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 21 22:43:49 localhost systemd[1]: Detected virtualization kvm.
Jan 21 22:43:49 localhost systemd[1]: Detected architecture x86-64.
Jan 21 22:43:49 localhost systemd[1]: Running in initrd.
Jan 21 22:43:49 localhost systemd[1]: No hostname configured, using default hostname.
Jan 21 22:43:49 localhost systemd[1]: Hostname set to <localhost>.
Jan 21 22:43:49 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 21 22:43:49 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 21 22:43:49 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 21 22:43:49 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 21 22:43:49 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 21 22:43:49 localhost systemd[1]: Reached target Local File Systems.
Jan 21 22:43:49 localhost systemd[1]: Reached target Path Units.
Jan 21 22:43:49 localhost systemd[1]: Reached target Slice Units.
Jan 21 22:43:49 localhost systemd[1]: Reached target Swaps.
Jan 21 22:43:49 localhost systemd[1]: Reached target Timer Units.
Jan 21 22:43:49 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 21 22:43:49 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 21 22:43:49 localhost systemd[1]: Listening on Journal Socket.
Jan 21 22:43:49 localhost systemd[1]: Listening on udev Control Socket.
Jan 21 22:43:49 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 21 22:43:49 localhost systemd[1]: Reached target Socket Units.
Jan 21 22:43:49 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 21 22:43:49 localhost systemd[1]: Starting Journal Service...
Jan 21 22:43:49 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 21 22:43:49 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 21 22:43:49 localhost systemd[1]: Starting Create System Users...
Jan 21 22:43:49 localhost systemd[1]: Starting Setup Virtual Console...
Jan 21 22:43:49 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 21 22:43:49 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 21 22:43:49 localhost systemd[1]: Finished Create System Users.
Jan 21 22:43:49 localhost systemd-journald[303]: Journal started
Jan 21 22:43:49 localhost systemd-journald[303]: Runtime Journal (/run/log/journal/d7c2924b8ca54f7593761023950dbf90) is 8.0M, max 153.6M, 145.6M free.
Jan 21 22:43:49 localhost systemd-sysusers[308]: Creating group 'users' with GID 100.
Jan 21 22:43:49 localhost systemd-sysusers[308]: Creating group 'dbus' with GID 81.
Jan 21 22:43:49 localhost systemd-sysusers[308]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 21 22:43:49 localhost systemd[1]: Started Journal Service.
Jan 21 22:43:49 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 21 22:43:49 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 21 22:43:49 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 21 22:43:49 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 21 22:43:49 localhost systemd[1]: Finished Setup Virtual Console.
Jan 21 22:43:49 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 21 22:43:49 localhost systemd[1]: Starting dracut cmdline hook...
Jan 21 22:43:49 localhost dracut-cmdline[323]: dracut-9 dracut-057-102.git20250818.el9
Jan 21 22:43:49 localhost dracut-cmdline[323]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 22:43:49 localhost systemd[1]: Finished dracut cmdline hook.
Jan 21 22:43:49 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 21 22:43:49 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 21 22:43:49 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 21 22:43:49 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 21 22:43:49 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 21 22:43:49 localhost kernel: RPC: Registered udp transport module.
Jan 21 22:43:49 localhost kernel: RPC: Registered tcp transport module.
Jan 21 22:43:49 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 21 22:43:49 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 21 22:43:49 localhost rpc.statd[439]: Version 2.5.4 starting
Jan 21 22:43:49 localhost rpc.statd[439]: Initializing NSM state
Jan 21 22:43:49 localhost rpc.idmapd[444]: Setting log level to 0
Jan 21 22:43:49 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 21 22:43:49 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 21 22:43:50 localhost systemd-udevd[457]: Using default interface naming scheme 'rhel-9.0'.
Jan 21 22:43:50 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 21 22:43:50 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 21 22:43:50 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 21 22:43:50 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 21 22:43:50 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 21 22:43:50 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 21 22:43:50 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 21 22:43:50 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 21 22:43:50 localhost systemd[1]: Reached target Network.
Jan 21 22:43:50 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 21 22:43:50 localhost systemd[1]: Starting dracut initqueue hook...
Jan 21 22:43:50 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 22:43:50 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 21 22:43:50 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 21 22:43:50 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 21 22:43:50 localhost systemd[1]: Reached target System Initialization.
Jan 21 22:43:50 localhost systemd[1]: Reached target Basic System.
Jan 21 22:43:50 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 21 22:43:50 localhost kernel: libata version 3.00 loaded.
Jan 21 22:43:50 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 21 22:43:50 localhost kernel: scsi host0: ata_piix
Jan 21 22:43:50 localhost kernel: scsi host1: ata_piix
Jan 21 22:43:50 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 21 22:43:50 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 21 22:43:50 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 21 22:43:50 localhost kernel:  vda: vda1
Jan 21 22:43:50 localhost kernel: ata1: found unknown device (class 0)
Jan 21 22:43:50 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 21 22:43:50 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 21 22:43:50 localhost systemd-udevd[471]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 22:43:50 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 21 22:43:50 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 21 22:43:50 localhost systemd[1]: Reached target Initrd Root Device.
Jan 21 22:43:50 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 21 22:43:50 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 21 22:43:50 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 21 22:43:50 localhost systemd[1]: Finished dracut initqueue hook.
Jan 21 22:43:50 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 21 22:43:50 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 21 22:43:50 localhost systemd[1]: Reached target Remote File Systems.
Jan 21 22:43:50 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 21 22:43:50 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 21 22:43:50 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 21 22:43:50 localhost systemd-fsck[551]: /usr/sbin/fsck.xfs: XFS file system.
Jan 21 22:43:50 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 21 22:43:50 localhost systemd[1]: Mounting /sysroot...
Jan 21 22:43:51 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 21 22:43:51 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 21 22:43:51 localhost kernel: XFS (vda1): Ending clean mount
Jan 21 22:43:51 localhost systemd[1]: Mounted /sysroot.
Jan 21 22:43:51 localhost systemd[1]: Reached target Initrd Root File System.
Jan 21 22:43:51 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 21 22:43:51 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 21 22:43:51 localhost systemd[1]: Reached target Initrd File Systems.
Jan 21 22:43:51 localhost systemd[1]: Reached target Initrd Default Target.
Jan 21 22:43:51 localhost systemd[1]: Starting dracut mount hook...
Jan 21 22:43:51 localhost systemd[1]: Finished dracut mount hook.
Jan 21 22:43:51 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 21 22:43:51 localhost rpc.idmapd[444]: exiting on signal 15
Jan 21 22:43:51 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 21 22:43:51 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 21 22:43:51 localhost systemd[1]: Stopped target Network.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Timer Units.
Jan 21 22:43:51 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 21 22:43:51 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Basic System.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Path Units.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Remote File Systems.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Slice Units.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Socket Units.
Jan 21 22:43:51 localhost systemd[1]: Stopped target System Initialization.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Local File Systems.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Swaps.
Jan 21 22:43:51 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped dracut mount hook.
Jan 21 22:43:51 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 21 22:43:51 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 21 22:43:51 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 21 22:43:51 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 21 22:43:51 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 21 22:43:51 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 21 22:43:51 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 21 22:43:51 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 21 22:43:51 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 21 22:43:51 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 21 22:43:51 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 21 22:43:51 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 21 22:43:51 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Closed udev Control Socket.
Jan 21 22:43:51 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Closed udev Kernel Socket.
Jan 21 22:43:51 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 21 22:43:51 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 21 22:43:51 localhost systemd[1]: Starting Cleanup udev Database...
Jan 21 22:43:51 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 21 22:43:51 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 21 22:43:51 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Stopped Create System Users.
Jan 21 22:43:51 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 21 22:43:51 localhost systemd[1]: Finished Cleanup udev Database.
Jan 21 22:43:51 localhost systemd[1]: Reached target Switch Root.
Jan 21 22:43:51 localhost systemd[1]: Starting Switch Root...
Jan 21 22:43:51 localhost systemd[1]: Switching root.
Jan 21 22:43:51 localhost systemd-journald[303]: Journal stopped
Jan 21 22:43:52 localhost systemd-journald[303]: Received SIGTERM from PID 1 (systemd).
Jan 21 22:43:52 localhost kernel: audit: type=1404 audit(1769035431.694:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 21 22:43:52 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 22:43:52 localhost kernel: SELinux:  policy capability open_perms=1
Jan 21 22:43:52 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 22:43:52 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 21 22:43:52 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 22:43:52 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 22:43:52 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 22:43:52 localhost kernel: audit: type=1403 audit(1769035431.824:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 21 22:43:52 localhost systemd[1]: Successfully loaded SELinux policy in 133.932ms.
Jan 21 22:43:52 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.786ms.
Jan 21 22:43:52 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 21 22:43:52 localhost systemd[1]: Detected virtualization kvm.
Jan 21 22:43:52 localhost systemd[1]: Detected architecture x86-64.
Jan 21 22:43:52 localhost systemd-rc-local-generator[634]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 22:43:52 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 21 22:43:52 localhost systemd[1]: Stopped Switch Root.
Jan 21 22:43:52 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 21 22:43:52 localhost systemd[1]: Created slice Slice /system/getty.
Jan 21 22:43:52 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 21 22:43:52 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 21 22:43:52 localhost systemd[1]: Created slice User and Session Slice.
Jan 21 22:43:52 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 21 22:43:52 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 21 22:43:52 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 21 22:43:52 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 21 22:43:52 localhost systemd[1]: Stopped target Switch Root.
Jan 21 22:43:52 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 21 22:43:52 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 21 22:43:52 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 21 22:43:52 localhost systemd[1]: Reached target Path Units.
Jan 21 22:43:52 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 21 22:43:52 localhost systemd[1]: Reached target Slice Units.
Jan 21 22:43:52 localhost systemd[1]: Reached target Swaps.
Jan 21 22:43:52 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 21 22:43:52 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 21 22:43:52 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 21 22:43:52 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 21 22:43:52 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 21 22:43:52 localhost systemd[1]: Listening on udev Control Socket.
Jan 21 22:43:52 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 21 22:43:52 localhost systemd[1]: Mounting Huge Pages File System...
Jan 21 22:43:52 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 21 22:43:52 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 21 22:43:52 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 21 22:43:52 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 21 22:43:52 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 21 22:43:52 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 21 22:43:52 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 21 22:43:52 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 21 22:43:52 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 21 22:43:52 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 21 22:43:52 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 21 22:43:52 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 21 22:43:52 localhost systemd[1]: Stopped Journal Service.
Jan 21 22:43:52 localhost kernel: fuse: init (API version 7.37)
Jan 21 22:43:52 localhost systemd[1]: Starting Journal Service...
Jan 21 22:43:52 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 21 22:43:52 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 21 22:43:52 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 22:43:52 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 21 22:43:52 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 21 22:43:52 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 21 22:43:52 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 21 22:43:52 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 21 22:43:52 localhost systemd-journald[676]: Journal started
Jan 21 22:43:52 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 21 22:43:52 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 21 22:43:52 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 21 22:43:52 localhost systemd[1]: Started Journal Service.
Jan 21 22:43:52 localhost systemd[1]: Mounted Huge Pages File System.
Jan 21 22:43:52 localhost kernel: ACPI: bus type drm_connector registered
Jan 21 22:43:52 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 21 22:43:52 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 21 22:43:52 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 21 22:43:52 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 21 22:43:52 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 22:43:52 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 21 22:43:52 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 21 22:43:52 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 21 22:43:52 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 21 22:43:52 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 21 22:43:52 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 21 22:43:52 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 21 22:43:52 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 21 22:43:52 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 21 22:43:52 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 21 22:43:52 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 21 22:43:52 localhost systemd[1]: Mounting FUSE Control File System...
Jan 21 22:43:52 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 21 22:43:52 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 21 22:43:52 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 21 22:43:52 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 21 22:43:52 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 21 22:43:52 localhost systemd[1]: Starting Create System Users...
Jan 21 22:43:52 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 21 22:43:52 localhost systemd-journald[676]: Received client request to flush runtime journal.
Jan 21 22:43:52 localhost systemd[1]: Mounted FUSE Control File System.
Jan 21 22:43:52 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 21 22:43:52 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 21 22:43:52 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 21 22:43:52 localhost systemd[1]: Finished Create System Users.
Jan 21 22:43:52 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 21 22:43:52 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 21 22:43:52 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 21 22:43:52 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 21 22:43:52 localhost systemd[1]: Reached target Local File Systems.
Jan 21 22:43:52 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 21 22:43:52 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 21 22:43:52 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 21 22:43:52 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 21 22:43:52 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 21 22:43:52 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 21 22:43:52 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 21 22:43:52 localhost bootctl[694]: Couldn't find EFI system partition, skipping.
Jan 21 22:43:52 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 21 22:43:52 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 21 22:43:52 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 21 22:43:52 localhost systemd[1]: Starting Security Auditing Service...
Jan 21 22:43:52 localhost systemd[1]: Starting RPC Bind...
Jan 21 22:43:52 localhost auditd[699]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 21 22:43:52 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 21 22:43:52 localhost auditd[699]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 21 22:43:52 localhost systemd[1]: Started RPC Bind.
Jan 21 22:43:52 localhost augenrules[705]: /sbin/augenrules: No change
Jan 21 22:43:52 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 21 22:43:52 localhost augenrules[720]: No rules
Jan 21 22:43:52 localhost augenrules[720]: enabled 1
Jan 21 22:43:52 localhost augenrules[720]: failure 1
Jan 21 22:43:52 localhost augenrules[720]: pid 699
Jan 21 22:43:52 localhost augenrules[720]: rate_limit 0
Jan 21 22:43:52 localhost augenrules[720]: backlog_limit 8192
Jan 21 22:43:52 localhost augenrules[720]: lost 0
Jan 21 22:43:52 localhost augenrules[720]: backlog 0
Jan 21 22:43:52 localhost augenrules[720]: backlog_wait_time 60000
Jan 21 22:43:52 localhost augenrules[720]: backlog_wait_time_actual 0
Jan 21 22:43:52 localhost augenrules[720]: enabled 1
Jan 21 22:43:52 localhost augenrules[720]: failure 1
Jan 21 22:43:52 localhost augenrules[720]: pid 699
Jan 21 22:43:52 localhost augenrules[720]: rate_limit 0
Jan 21 22:43:52 localhost augenrules[720]: backlog_limit 8192
Jan 21 22:43:52 localhost augenrules[720]: lost 0
Jan 21 22:43:52 localhost augenrules[720]: backlog 1
Jan 21 22:43:52 localhost augenrules[720]: backlog_wait_time 60000
Jan 21 22:43:52 localhost augenrules[720]: backlog_wait_time_actual 0
Jan 21 22:43:52 localhost systemd[1]: Started Security Auditing Service.
Jan 21 22:43:52 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 21 22:43:52 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 21 22:43:53 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 21 22:43:53 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 21 22:43:53 localhost systemd[1]: Starting Update is Completed...
Jan 21 22:43:53 localhost systemd[1]: Finished Update is Completed.
Jan 21 22:43:53 localhost systemd-udevd[728]: Using default interface naming scheme 'rhel-9.0'.
Jan 21 22:43:53 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 21 22:43:53 localhost systemd[1]: Reached target System Initialization.
Jan 21 22:43:53 localhost systemd[1]: Started dnf makecache --timer.
Jan 21 22:43:53 localhost systemd[1]: Started Daily rotation of log files.
Jan 21 22:43:53 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 21 22:43:53 localhost systemd[1]: Reached target Timer Units.
Jan 21 22:43:53 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 21 22:43:53 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 21 22:43:53 localhost systemd[1]: Reached target Socket Units.
Jan 21 22:43:53 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 21 22:43:53 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 22:43:53 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 21 22:43:53 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 21 22:43:53 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 22:43:53 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 21 22:43:53 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 21 22:43:53 localhost systemd-udevd[734]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 22:43:53 localhost systemd[1]: Reached target Basic System.
Jan 21 22:43:53 localhost dbus-broker-lau[754]: Ready
Jan 21 22:43:53 localhost systemd[1]: Starting NTP client/server...
Jan 21 22:43:53 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 21 22:43:53 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 21 22:43:53 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 21 22:43:53 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 21 22:43:53 localhost chronyd[784]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 21 22:43:53 localhost chronyd[784]: Loaded 0 symmetric keys
Jan 21 22:43:53 localhost chronyd[784]: Using right/UTC timezone to obtain leap second data
Jan 21 22:43:53 localhost chronyd[784]: Loaded seccomp filter (level 2)
Jan 21 22:43:53 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 21 22:43:53 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 21 22:43:53 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 21 22:43:53 localhost systemd[1]: Started irqbalance daemon.
Jan 21 22:43:53 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 21 22:43:53 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 22:43:53 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 22:43:53 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 22:43:53 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 21 22:43:53 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 21 22:43:53 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 21 22:43:53 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 21 22:43:53 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 21 22:43:53 localhost systemd[1]: Starting User Login Management...
Jan 21 22:43:53 localhost systemd[1]: Started NTP client/server.
Jan 21 22:43:53 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 21 22:43:53 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 21 22:43:53 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 21 22:43:53 localhost kernel: Console: switching to colour dummy device 80x25
Jan 21 22:43:53 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 21 22:43:53 localhost kernel: [drm] features: -context_init
Jan 21 22:43:53 localhost kernel: [drm] number of scanouts: 1
Jan 21 22:43:53 localhost kernel: [drm] number of cap sets: 0
Jan 21 22:43:53 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 21 22:43:53 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 21 22:43:53 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 21 22:43:53 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 21 22:43:53 localhost systemd-logind[796]: New seat seat0.
Jan 21 22:43:53 localhost systemd-logind[796]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 21 22:43:53 localhost systemd-logind[796]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 21 22:43:53 localhost systemd[1]: Started User Login Management.
Jan 21 22:43:53 localhost kernel: kvm_amd: TSC scaling supported
Jan 21 22:43:53 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 21 22:43:53 localhost kernel: kvm_amd: Nested Paging enabled
Jan 21 22:43:53 localhost kernel: kvm_amd: LBR virtualization supported
Jan 21 22:43:53 localhost iptables.init[778]: iptables: Applying firewall rules: [  OK  ]
Jan 21 22:43:53 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 21 22:43:53 localhost cloud-init[837]: Cloud-init v. 24.4-8.el9 running 'init-local' at Wed, 21 Jan 2026 22:43:53 +0000. Up 6.37 seconds.
Jan 21 22:43:53 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 21 22:43:53 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 21 22:43:53 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpjzy3emqj.mount: Deactivated successfully.
Jan 21 22:43:53 localhost systemd[1]: Starting Hostname Service...
Jan 21 22:43:54 localhost systemd[1]: Started Hostname Service.
Jan 21 22:43:54 np0005591284.novalocal systemd-hostnamed[851]: Hostname set to <np0005591284.novalocal> (static)
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Reached target Preparation for Network.
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Starting Network Manager...
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.2701] NetworkManager (version 1.54.3-2.el9) is starting... (boot:eb0f01be-82e2-4e7f-8f82-f8e2d1cf7324)
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.2708] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.2799] manager[0x5609db615000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.2839] hostname: hostname: using hostnamed
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.2840] hostname: static hostname changed from (none) to "np0005591284.novalocal"
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.2844] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.2970] manager[0x5609db615000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.2971] manager[0x5609db615000]: rfkill: WWAN hardware radio set enabled
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3029] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3030] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3030] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3031] manager: Networking is enabled by state file
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3034] settings: Loaded settings plugin: keyfile (internal)
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3052] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3089] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3111] dhcp: init: Using DHCP client 'internal'
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3115] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3139] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3151] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3167] device (lo): Activation: starting connection 'lo' (4662a9d4-1184-4934-9979-d04ebf8a1fd8)
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3184] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3191] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3230] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3236] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3240] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3243] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3247] device (eth0): carrier: link connected
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3250] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3255] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3260] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3263] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3264] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3266] manager: NetworkManager state is now CONNECTING
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3267] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3273] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3276] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Started Network Manager.
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Reached target Network.
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3496] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3499] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 22:43:54 np0005591284.novalocal NetworkManager[855]: <info>  [1769035434.3504] device (lo): Activation: successful, device activated.
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Reached target NFS client services.
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: Reached target Remote File Systems.
Jan 21 22:43:54 np0005591284.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 22:43:56 np0005591284.novalocal NetworkManager[855]: <info>  [1769035436.0202] dhcp4 (eth0): state changed new lease, address=38.102.83.173
Jan 21 22:43:56 np0005591284.novalocal NetworkManager[855]: <info>  [1769035436.0222] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 22:43:56 np0005591284.novalocal NetworkManager[855]: <info>  [1769035436.0242] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 22:43:56 np0005591284.novalocal NetworkManager[855]: <info>  [1769035436.0275] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 22:43:56 np0005591284.novalocal NetworkManager[855]: <info>  [1769035436.0276] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 22:43:56 np0005591284.novalocal NetworkManager[855]: <info>  [1769035436.0279] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 22:43:56 np0005591284.novalocal NetworkManager[855]: <info>  [1769035436.0281] device (eth0): Activation: successful, device activated.
Jan 21 22:43:56 np0005591284.novalocal NetworkManager[855]: <info>  [1769035436.0285] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 22:43:56 np0005591284.novalocal NetworkManager[855]: <info>  [1769035436.0287] manager: startup complete
Jan 21 22:43:56 np0005591284.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 21 22:43:56 np0005591284.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: Cloud-init v. 24.4-8.el9 running 'init' at Wed, 21 Jan 2026 22:43:56 +0000. Up 9.02 seconds.
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: |  eth0  | True |        38.102.83.173         | 255.255.255.0 | global | fa:16:3e:93:48:44 |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: |  eth0  | True | fe80::f816:3eff:fe93:4844/64 |       .       |  link  | fa:16:3e:93:48:44 |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Jan 21 22:43:56 np0005591284.novalocal cloud-init[918]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 22:43:57 np0005591284.novalocal useradd[986]: new group: name=cloud-user, GID=1001
Jan 21 22:43:57 np0005591284.novalocal useradd[986]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 21 22:43:57 np0005591284.novalocal useradd[986]: add 'cloud-user' to group 'adm'
Jan 21 22:43:57 np0005591284.novalocal useradd[986]: add 'cloud-user' to group 'systemd-journal'
Jan 21 22:43:57 np0005591284.novalocal useradd[986]: add 'cloud-user' to shadow group 'adm'
Jan 21 22:43:57 np0005591284.novalocal useradd[986]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: Generating public/private rsa key pair.
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: The key fingerprint is:
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: SHA256:mw3P9szXiY+GBws7mLoRLCwdOQEFMywW6+/d+tCkPWg root@np0005591284.novalocal
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: The key's randomart image is:
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: +---[RSA 3072]----+
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |.*=o             |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |.o+ o            |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |o. +             |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |. o +            |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: | o + o .S        |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |  o . B  O .     |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |   . E += B + ...|
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |  . o +o.+ * +o..|
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |   . ==o  . *o.. |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: +----[SHA256]-----+
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: Generating public/private ecdsa key pair.
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: The key fingerprint is:
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: SHA256:6QW/YcUyUZ3oomJrg7rtObwrFWk7TWKenZp/CFPcPEo root@np0005591284.novalocal
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: The key's randomart image is:
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: +---[ECDSA 256]---+
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |          ...o . |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |           o. o  |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |     o o. o.o    |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |    * E ++.+.    |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |   + @ oSo=.     |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |    O B..o o     |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |   o O +. .      |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |  ..*.= .        |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |  o=**.o         |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: +----[SHA256]-----+
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: Generating public/private ed25519 key pair.
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: The key fingerprint is:
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: SHA256:ZIF5zP/G0rtB+lY5CWTrCd3jpqTrbMC2uqRtOxStkFg root@np0005591284.novalocal
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: The key's randomart image is:
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: +--[ED25519 256]--+
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |       =.        |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |    E o +.  o    |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |   o . oo. + o   |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |  . o .o. o + o  |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |     . +S  *.+ + |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |      o + .oO B  |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |     ... o.=.= . |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |     +o ..o.+.   |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: |    ..== o+oo.   |
Jan 21 22:43:57 np0005591284.novalocal cloud-init[918]: +----[SHA256]-----+
Jan 21 22:43:58 np0005591284.novalocal sm-notify[1002]: Version 2.5.4 starting
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Reached target Network is Online.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Starting System Logging Service...
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Starting Permit User Sessions...
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 21 22:43:58 np0005591284.novalocal sshd[1004]: Server listening on 0.0.0.0 port 22.
Jan 21 22:43:58 np0005591284.novalocal sshd[1004]: Server listening on :: port 22.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Finished Permit User Sessions.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Started Command Scheduler.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Started Getty on tty1.
Jan 21 22:43:58 np0005591284.novalocal crond[1007]: (CRON) STARTUP (1.5.7)
Jan 21 22:43:58 np0005591284.novalocal crond[1007]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 21 22:43:58 np0005591284.novalocal crond[1007]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 3% if used.)
Jan 21 22:43:58 np0005591284.novalocal crond[1007]: (CRON) INFO (running with inotify support)
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Reached target Login Prompts.
Jan 21 22:43:58 np0005591284.novalocal rsyslogd[1003]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1003" x-info="https://www.rsyslog.com"] start
Jan 21 22:43:58 np0005591284.novalocal rsyslogd[1003]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Started System Logging Service.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Reached target Multi-User System.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 21 22:43:58 np0005591284.novalocal rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 22:43:58 np0005591284.novalocal kdumpctl[1012]: kdump: No kdump initial ramdisk found.
Jan 21 22:43:58 np0005591284.novalocal kdumpctl[1012]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 21 22:43:58 np0005591284.novalocal cloud-init[1100]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Wed, 21 Jan 2026 22:43:58 +0000. Up 11.03 seconds.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 21 22:43:58 np0005591284.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 21 22:43:58 np0005591284.novalocal cloud-init[1264]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Wed, 21 Jan 2026 22:43:58 +0000. Up 11.45 seconds.
Jan 21 22:43:58 np0005591284.novalocal dracut[1268]: dracut-057-102.git20250818.el9
Jan 21 22:43:58 np0005591284.novalocal cloud-init[1284]: #############################################################
Jan 21 22:43:58 np0005591284.novalocal cloud-init[1286]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 21 22:43:58 np0005591284.novalocal cloud-init[1288]: 256 SHA256:6QW/YcUyUZ3oomJrg7rtObwrFWk7TWKenZp/CFPcPEo root@np0005591284.novalocal (ECDSA)
Jan 21 22:43:58 np0005591284.novalocal cloud-init[1290]: 256 SHA256:ZIF5zP/G0rtB+lY5CWTrCd3jpqTrbMC2uqRtOxStkFg root@np0005591284.novalocal (ED25519)
Jan 21 22:43:58 np0005591284.novalocal cloud-init[1292]: 3072 SHA256:mw3P9szXiY+GBws7mLoRLCwdOQEFMywW6+/d+tCkPWg root@np0005591284.novalocal (RSA)
Jan 21 22:43:58 np0005591284.novalocal cloud-init[1293]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 21 22:43:58 np0005591284.novalocal cloud-init[1294]: #############################################################
Jan 21 22:43:58 np0005591284.novalocal cloud-init[1264]: Cloud-init v. 24.4-8.el9 finished at Wed, 21 Jan 2026 22:43:58 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.67 seconds
Jan 21 22:43:58 np0005591284.novalocal dracut[1270]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 21 22:43:59 np0005591284.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 21 22:43:59 np0005591284.novalocal systemd[1]: Reached target Cloud-init target.
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 21 22:43:59 np0005591284.novalocal dracut[1270]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 21 22:44:00 np0005591284.novalocal sshd-session[1667]: Connection closed by 38.102.83.114 port 45654 [preauth]
Jan 21 22:44:00 np0005591284.novalocal sshd-session[1682]: Unable to negotiate with 38.102.83.114 port 45666: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 21 22:44:00 np0005591284.novalocal sshd-session[1712]: Unable to negotiate with 38.102.83.114 port 45680: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 21 22:44:00 np0005591284.novalocal sshd-session[1721]: Unable to negotiate with 38.102.83.114 port 45686: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: memstrack is not available
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 21 22:44:00 np0005591284.novalocal sshd-session[1731]: Connection reset by 38.102.83.114 port 45692 [preauth]
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 21 22:44:00 np0005591284.novalocal sshd-session[1692]: Connection closed by 38.102.83.114 port 45676 [preauth]
Jan 21 22:44:00 np0005591284.novalocal sshd-session[1743]: Connection reset by 38.102.83.114 port 45706 [preauth]
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 21 22:44:00 np0005591284.novalocal sshd-session[1752]: Unable to negotiate with 38.102.83.114 port 45712: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 21 22:44:00 np0005591284.novalocal sshd-session[1759]: Unable to negotiate with 38.102.83.114 port 45718: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: memstrack is not available
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 21 22:44:00 np0005591284.novalocal dracut[1270]: *** Including module: systemd ***
Jan 21 22:44:00 np0005591284.novalocal chronyd[784]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Jan 21 22:44:00 np0005591284.novalocal chronyd[784]: System clock TAI offset set to 37 seconds
Jan 21 22:44:01 np0005591284.novalocal dracut[1270]: *** Including module: fips ***
Jan 21 22:44:01 np0005591284.novalocal dracut[1270]: *** Including module: systemd-initrd ***
Jan 21 22:44:01 np0005591284.novalocal dracut[1270]: *** Including module: i18n ***
Jan 21 22:44:01 np0005591284.novalocal dracut[1270]: *** Including module: drm ***
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]: *** Including module: prefixdevname ***
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]: *** Including module: kernel-modules ***
Jan 21 22:44:02 np0005591284.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]: *** Including module: kernel-modules-extra ***
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]: *** Including module: qemu ***
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]: *** Including module: fstab-sys ***
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]: *** Including module: rootfs-block ***
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]: *** Including module: terminfo ***
Jan 21 22:44:02 np0005591284.novalocal dracut[1270]: *** Including module: udev-rules ***
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: IRQ 25 affinity is now unmanaged
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: IRQ 31 affinity is now unmanaged
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: IRQ 28 affinity is now unmanaged
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: IRQ 32 affinity is now unmanaged
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: IRQ 30 affinity is now unmanaged
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 21 22:44:03 np0005591284.novalocal irqbalance[786]: IRQ 29 affinity is now unmanaged
Jan 21 22:44:03 np0005591284.novalocal dracut[1270]: Skipping udev rule: 91-permissions.rules
Jan 21 22:44:03 np0005591284.novalocal dracut[1270]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 21 22:44:03 np0005591284.novalocal dracut[1270]: *** Including module: virtiofs ***
Jan 21 22:44:03 np0005591284.novalocal dracut[1270]: *** Including module: dracut-systemd ***
Jan 21 22:44:03 np0005591284.novalocal dracut[1270]: *** Including module: usrmount ***
Jan 21 22:44:03 np0005591284.novalocal dracut[1270]: *** Including module: base ***
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]: *** Including module: fs-lib ***
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]: *** Including module: kdumpbase ***
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:   microcode_ctl module: mangling fw_dir
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: configuration "intel" is ignored
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 21 22:44:04 np0005591284.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 21 22:44:05 np0005591284.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 21 22:44:05 np0005591284.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 21 22:44:05 np0005591284.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 21 22:44:05 np0005591284.novalocal dracut[1270]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 21 22:44:05 np0005591284.novalocal dracut[1270]: *** Including module: openssl ***
Jan 21 22:44:05 np0005591284.novalocal dracut[1270]: *** Including module: shutdown ***
Jan 21 22:44:05 np0005591284.novalocal dracut[1270]: *** Including module: squash ***
Jan 21 22:44:05 np0005591284.novalocal dracut[1270]: *** Including modules done ***
Jan 21 22:44:05 np0005591284.novalocal dracut[1270]: *** Installing kernel module dependencies ***
Jan 21 22:44:06 np0005591284.novalocal dracut[1270]: *** Installing kernel module dependencies done ***
Jan 21 22:44:06 np0005591284.novalocal dracut[1270]: *** Resolving executable dependencies ***
Jan 21 22:44:06 np0005591284.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 22:44:07 np0005591284.novalocal dracut[1270]: *** Resolving executable dependencies done ***
Jan 21 22:44:07 np0005591284.novalocal dracut[1270]: *** Generating early-microcode cpio image ***
Jan 21 22:44:07 np0005591284.novalocal dracut[1270]: *** Store current command line parameters ***
Jan 21 22:44:07 np0005591284.novalocal dracut[1270]: Stored kernel commandline:
Jan 21 22:44:07 np0005591284.novalocal dracut[1270]: No dracut internal kernel commandline stored in the initramfs
Jan 21 22:44:07 np0005591284.novalocal dracut[1270]: *** Install squash loader ***
Jan 21 22:44:08 np0005591284.novalocal dracut[1270]: *** Squashing the files inside the initramfs ***
Jan 21 22:44:09 np0005591284.novalocal dracut[1270]: *** Squashing the files inside the initramfs done ***
Jan 21 22:44:09 np0005591284.novalocal dracut[1270]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 21 22:44:09 np0005591284.novalocal dracut[1270]: *** Hardlinking files ***
Jan 21 22:44:09 np0005591284.novalocal dracut[1270]: Mode:           real
Jan 21 22:44:09 np0005591284.novalocal dracut[1270]: Files:          50
Jan 21 22:44:09 np0005591284.novalocal dracut[1270]: Linked:         0 files
Jan 21 22:44:09 np0005591284.novalocal dracut[1270]: Compared:       0 xattrs
Jan 21 22:44:09 np0005591284.novalocal dracut[1270]: Compared:       0 files
Jan 21 22:44:09 np0005591284.novalocal dracut[1270]: Saved:          0 B
Jan 21 22:44:09 np0005591284.novalocal dracut[1270]: Duration:       0.000961 seconds
Jan 21 22:44:09 np0005591284.novalocal dracut[1270]: *** Hardlinking files done ***
Jan 21 22:44:10 np0005591284.novalocal dracut[1270]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 21 22:44:10 np0005591284.novalocal kdumpctl[1012]: kdump: kexec: loaded kdump kernel
Jan 21 22:44:10 np0005591284.novalocal kdumpctl[1012]: kdump: Starting kdump: [OK]
Jan 21 22:44:10 np0005591284.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 21 22:44:10 np0005591284.novalocal systemd[1]: Startup finished in 1.676s (kernel) + 2.722s (initrd) + 19.023s (userspace) = 23.422s.
Jan 21 22:44:24 np0005591284.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 22:45:07 np0005591284.novalocal chronyd[784]: Selected source 54.39.23.64 (2.centos.pool.ntp.org)
Jan 21 22:45:23 np0005591284.novalocal sshd-session[4304]: Accepted publickey for zuul from 38.102.83.114 port 48106 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 21 22:45:23 np0005591284.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 21 22:45:23 np0005591284.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 21 22:45:23 np0005591284.novalocal systemd-logind[796]: New session 1 of user zuul.
Jan 21 22:45:23 np0005591284.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 21 22:45:23 np0005591284.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 21 22:45:23 np0005591284.novalocal systemd[4308]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Queued start job for default target Main User Target.
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Created slice User Application Slice.
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Reached target Paths.
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Reached target Timers.
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Starting D-Bus User Message Bus Socket...
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Starting Create User's Volatile Files and Directories...
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Listening on D-Bus User Message Bus Socket.
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Reached target Sockets.
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Finished Create User's Volatile Files and Directories.
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Reached target Basic System.
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Reached target Main User Target.
Jan 21 22:45:24 np0005591284.novalocal systemd[4308]: Startup finished in 132ms.
Jan 21 22:45:24 np0005591284.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 21 22:45:24 np0005591284.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 21 22:45:24 np0005591284.novalocal sshd-session[4304]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:45:24 np0005591284.novalocal python3[4390]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 22:45:34 np0005591284.novalocal python3[4418]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 22:45:42 np0005591284.novalocal python3[4476]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 22:45:43 np0005591284.novalocal python3[4516]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 21 22:45:45 np0005591284.novalocal python3[4542]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2e4yttZDYBqdG8LApHzgUrKnJJhokPjy46000EGKrecg+C8A4mLQflJ0D/xvugtt/H91C3VfRJbQOPQ7hZmStaqICNoXl/C8gc+eNroWZE+yY/wlWIxUH08XS6asYrTpDpg5UmpvUaYUK+3UMHnBY7Ito24+Jty+rd2YwCphABstuMfb1NJAx6Jml5CgCMob2n9WNcySPRTJ7JEA45egnysW3zGHGsS6qA8z8KP4tsp0oqBu1cfczB2RxnOXPhXZSJcS+3lww8bkb/wmQh1+Ho5qQEILiO5sxZGE4T9giN9XH2aveWWK0ttofy63F0tFxrl4uVBOtPYvY+GFt+GJuAwQK/wFmObp8yFqj8YU0HrxwXaVGLO6bfltMq8+k+/sDcwLSVGsCR6kw70L44MXX4znyZuRO7aEx+rAOMmL9ZfrVMgF7BEKlJG7ZldriZuFA1dpyF07UOpUN5wDaKC0EUC9s9ANBhs/JzmSBbA66LTl3G+2zXPfjQLBU99msPhs= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:45:45 np0005591284.novalocal python3[4566]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:46 np0005591284.novalocal python3[4665]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:45:46 np0005591284.novalocal python3[4736]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769035545.9276214-252-12656152559596/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=3005863ccc544e3a9d90dbd38e9aa500_id_rsa follow=False checksum=232cfc4771d49d01feffe7bca174ec959890bb55 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:47 np0005591284.novalocal python3[4859]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:45:47 np0005591284.novalocal python3[4930]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769035547.0490565-307-199277196259039/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=3005863ccc544e3a9d90dbd38e9aa500_id_rsa.pub follow=False checksum=0a660c0f8e508883780892e7228376ef7bc415eb backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:49 np0005591284.novalocal python3[4978]: ansible-ping Invoked with data=pong
Jan 21 22:45:50 np0005591284.novalocal python3[5002]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 22:45:52 np0005591284.novalocal python3[5060]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 21 22:45:53 np0005591284.novalocal python3[5092]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:54 np0005591284.novalocal python3[5116]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:54 np0005591284.novalocal python3[5140]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:54 np0005591284.novalocal python3[5164]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:55 np0005591284.novalocal python3[5188]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:55 np0005591284.novalocal python3[5212]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:56 np0005591284.novalocal sudo[5236]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esbzqnrhamywiyvfvtnccuvtckdzjuvq ; /usr/bin/python3'
Jan 21 22:45:56 np0005591284.novalocal sudo[5236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:45:56 np0005591284.novalocal python3[5238]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:56 np0005591284.novalocal sudo[5236]: pam_unix(sudo:session): session closed for user root
Jan 21 22:45:57 np0005591284.novalocal sudo[5314]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpmcrmqdxipnxuvzttcjnaetqqeewxrl ; /usr/bin/python3'
Jan 21 22:45:57 np0005591284.novalocal sudo[5314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:45:57 np0005591284.novalocal python3[5316]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:45:57 np0005591284.novalocal sudo[5314]: pam_unix(sudo:session): session closed for user root
Jan 21 22:45:57 np0005591284.novalocal sudo[5387]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqczlqsdyhzwingjumsksjzaabbkppmv ; /usr/bin/python3'
Jan 21 22:45:57 np0005591284.novalocal sudo[5387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:45:58 np0005591284.novalocal python3[5389]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769035557.0735276-32-87577945012757/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:58 np0005591284.novalocal sudo[5387]: pam_unix(sudo:session): session closed for user root
Jan 21 22:45:58 np0005591284.novalocal python3[5437]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:45:59 np0005591284.novalocal python3[5461]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:45:59 np0005591284.novalocal python3[5485]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:45:59 np0005591284.novalocal python3[5509]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:45:59 np0005591284.novalocal python3[5533]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:00 np0005591284.novalocal python3[5557]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:00 np0005591284.novalocal python3[5581]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:00 np0005591284.novalocal python3[5605]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:00 np0005591284.novalocal python3[5629]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:01 np0005591284.novalocal python3[5653]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:01 np0005591284.novalocal python3[5677]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:01 np0005591284.novalocal python3[5701]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:02 np0005591284.novalocal python3[5725]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:02 np0005591284.novalocal python3[5749]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:02 np0005591284.novalocal python3[5773]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:02 np0005591284.novalocal python3[5797]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:03 np0005591284.novalocal python3[5821]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:03 np0005591284.novalocal python3[5845]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:03 np0005591284.novalocal python3[5869]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:04 np0005591284.novalocal python3[5893]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:04 np0005591284.novalocal python3[5917]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:04 np0005591284.novalocal python3[5941]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:05 np0005591284.novalocal python3[5965]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:05 np0005591284.novalocal python3[5989]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:05 np0005591284.novalocal python3[6013]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:05 np0005591284.novalocal python3[6037]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:08 np0005591284.novalocal sudo[6061]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-celbhtmyscntichxmjnrkcbmmpbwkybg ; /usr/bin/python3'
Jan 21 22:46:08 np0005591284.novalocal sudo[6061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:08 np0005591284.novalocal python3[6063]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 21 22:46:08 np0005591284.novalocal systemd[1]: Starting Time & Date Service...
Jan 21 22:46:08 np0005591284.novalocal systemd[1]: Started Time & Date Service.
Jan 21 22:46:08 np0005591284.novalocal systemd-timedated[6065]: Changed time zone to 'UTC' (UTC).
Jan 21 22:46:08 np0005591284.novalocal sudo[6061]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:08 np0005591284.novalocal sudo[6092]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvjzesglrrtiltgeozpwzyehlznhjxsb ; /usr/bin/python3'
Jan 21 22:46:08 np0005591284.novalocal sudo[6092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:08 np0005591284.novalocal python3[6094]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:09 np0005591284.novalocal sudo[6092]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:09 np0005591284.novalocal python3[6170]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:46:09 np0005591284.novalocal python3[6241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769035569.2118134-252-215317857240533/source _original_basename=tmpungze7yw follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:10 np0005591284.novalocal python3[6341]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:46:10 np0005591284.novalocal python3[6412]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769035570.105307-302-187263050837564/source _original_basename=tmpvcqto7g5 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:11 np0005591284.novalocal sudo[6512]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igkucvlhlrkdwroqooggbhmlksjtcstk ; /usr/bin/python3'
Jan 21 22:46:11 np0005591284.novalocal sudo[6512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:11 np0005591284.novalocal python3[6514]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:46:11 np0005591284.novalocal sudo[6512]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:12 np0005591284.novalocal sudo[6585]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shkajawfaatxxabkilofivuydpluxgfi ; /usr/bin/python3'
Jan 21 22:46:12 np0005591284.novalocal sudo[6585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:12 np0005591284.novalocal python3[6587]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769035571.4462605-382-128860278202401/source _original_basename=tmpypus3c50 follow=False checksum=2e7e63ba56c9b487ea71081bee61c12a1e9cb2fe backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:12 np0005591284.novalocal sudo[6585]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:12 np0005591284.novalocal chronyd[784]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Jan 21 22:46:12 np0005591284.novalocal python3[6635]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:46:13 np0005591284.novalocal python3[6661]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:46:13 np0005591284.novalocal irqbalance[786]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 21 22:46:13 np0005591284.novalocal irqbalance[786]: IRQ 27 affinity is now unmanaged
Jan 21 22:46:13 np0005591284.novalocal sudo[6739]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dusghymikvdmmllfekbtxklyfrczkrzc ; /usr/bin/python3'
Jan 21 22:46:13 np0005591284.novalocal sudo[6739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:13 np0005591284.novalocal python3[6741]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:46:13 np0005591284.novalocal sudo[6739]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:13 np0005591284.novalocal sudo[6812]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njevbsuhoufurjqffcijrkkuozapskbn ; /usr/bin/python3'
Jan 21 22:46:13 np0005591284.novalocal sudo[6812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:13 np0005591284.novalocal python3[6814]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769035573.237137-452-213756175373111/source _original_basename=tmpvl7w4pw0 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:13 np0005591284.novalocal sudo[6812]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:14 np0005591284.novalocal sudo[6863]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzbyftrhqimujcejjllfvfpxdatjjnmh ; /usr/bin/python3'
Jan 21 22:46:14 np0005591284.novalocal sudo[6863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:14 np0005591284.novalocal python3[6865]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-2a6d-faa9-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:46:14 np0005591284.novalocal sudo[6863]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:15 np0005591284.novalocal python3[6893]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-2a6d-faa9-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 21 22:46:16 np0005591284.novalocal python3[6921]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:37 np0005591284.novalocal sudo[6945]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vadqkpmajyiugbgtbhywlvxcjheazwnm ; /usr/bin/python3'
Jan 21 22:46:37 np0005591284.novalocal sudo[6945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:38 np0005591284.novalocal python3[6947]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:38 np0005591284.novalocal sudo[6945]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:38 np0005591284.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 21 22:47:38 np0005591284.novalocal sshd-session[4317]: Received disconnect from 38.102.83.114 port 48106:11: disconnected by user
Jan 21 22:47:38 np0005591284.novalocal sshd-session[4317]: Disconnected from user zuul 38.102.83.114 port 48106
Jan 21 22:47:38 np0005591284.novalocal sshd-session[4304]: pam_unix(sshd:session): session closed for user zuul
Jan 21 22:47:38 np0005591284.novalocal systemd-logind[796]: Session 1 logged out. Waiting for processes to exit.
Jan 21 22:47:39 np0005591284.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 21 22:47:39 np0005591284.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 21 22:47:39 np0005591284.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 21 22:47:39 np0005591284.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 21 22:47:39 np0005591284.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 21 22:47:39 np0005591284.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 21 22:47:39 np0005591284.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 21 22:47:39 np0005591284.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 21 22:47:39 np0005591284.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 21 22:47:39 np0005591284.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 21 22:47:39 np0005591284.novalocal NetworkManager[855]: <info>  [1769035659.8960] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 22:47:39 np0005591284.novalocal systemd-udevd[6950]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 22:47:39 np0005591284.novalocal NetworkManager[855]: <info>  [1769035659.9143] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 22:47:39 np0005591284.novalocal NetworkManager[855]: <info>  [1769035659.9184] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 21 22:47:39 np0005591284.novalocal NetworkManager[855]: <info>  [1769035659.9189] device (eth1): carrier: link connected
Jan 21 22:47:39 np0005591284.novalocal NetworkManager[855]: <info>  [1769035659.9193] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 21 22:47:39 np0005591284.novalocal NetworkManager[855]: <info>  [1769035659.9206] policy: auto-activating connection 'Wired connection 1' (ef9564cf-3cba-317e-b605-2bb50bce1cb4)
Jan 21 22:47:39 np0005591284.novalocal NetworkManager[855]: <info>  [1769035659.9213] device (eth1): Activation: starting connection 'Wired connection 1' (ef9564cf-3cba-317e-b605-2bb50bce1cb4)
Jan 21 22:47:39 np0005591284.novalocal NetworkManager[855]: <info>  [1769035659.9214] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 22:47:39 np0005591284.novalocal NetworkManager[855]: <info>  [1769035659.9219] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 22:47:39 np0005591284.novalocal NetworkManager[855]: <info>  [1769035659.9225] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 22:47:39 np0005591284.novalocal NetworkManager[855]: <info>  [1769035659.9233] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:47:39 np0005591284.novalocal systemd[4308]: Starting Mark boot as successful...
Jan 21 22:47:39 np0005591284.novalocal systemd[4308]: Finished Mark boot as successful.
Jan 21 22:47:40 np0005591284.novalocal sshd-session[6955]: Accepted publickey for zuul from 38.102.83.114 port 57042 ssh2: RSA SHA256:LSN8GeK+nwfQgAdzsG9Fx0/CGGktcUeOM8rFlOBs7zo
Jan 21 22:47:40 np0005591284.novalocal systemd-logind[796]: New session 3 of user zuul.
Jan 21 22:47:40 np0005591284.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 21 22:47:40 np0005591284.novalocal sshd-session[6955]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:47:40 np0005591284.novalocal python3[6982]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-4168-cfbf-000000000189-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:47:47 np0005591284.novalocal sudo[7060]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npyegwswdvqcdwzohautsmmirnxlzkma ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 22:47:47 np0005591284.novalocal sudo[7060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:47:47 np0005591284.novalocal python3[7062]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:47:47 np0005591284.novalocal sudo[7060]: pam_unix(sudo:session): session closed for user root
Jan 21 22:47:48 np0005591284.novalocal sudo[7133]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esczbgplkrzlgxefauideuvozifneciz ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 22:47:48 np0005591284.novalocal sudo[7133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:47:48 np0005591284.novalocal python3[7135]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769035667.4345658-155-131293959244741/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=1c0321fb43ac1855fe247c64752513566434a374 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:47:48 np0005591284.novalocal sudo[7133]: pam_unix(sudo:session): session closed for user root
Jan 21 22:47:48 np0005591284.novalocal sudo[7183]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsptcidcwlavirpnlevenrkdtqcznhbm ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 22:47:48 np0005591284.novalocal sudo[7183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:47:48 np0005591284.novalocal python3[7185]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: Stopping Network Manager...
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[855]: <info>  [1769035668.7702] caught SIGTERM, shutting down normally.
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[855]: <info>  [1769035668.7721] dhcp4 (eth0): canceled DHCP transaction
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[855]: <info>  [1769035668.7722] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[855]: <info>  [1769035668.7722] dhcp4 (eth0): state changed no lease
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[855]: <info>  [1769035668.7725] manager: NetworkManager state is now CONNECTING
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[855]: <info>  [1769035668.7789] dhcp4 (eth1): canceled DHCP transaction
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[855]: <info>  [1769035668.7789] dhcp4 (eth1): state changed no lease
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[855]: <info>  [1769035668.7865] exiting (success)
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: Stopped Network Manager.
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: NetworkManager.service: Consumed 1.716s CPU time, 10.2M memory peak.
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: Starting Network Manager...
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.8663] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:eb0f01be-82e2-4e7f-8f82-f8e2d1cf7324)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.8665] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.8733] manager[0x56527569b000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: Starting Hostname Service...
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: Started Hostname Service.
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9522] hostname: hostname: using hostnamed
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9523] hostname: static hostname changed from (none) to "np0005591284.novalocal"
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9530] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9537] manager[0x56527569b000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9537] manager[0x56527569b000]: rfkill: WWAN hardware radio set enabled
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9574] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9575] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9575] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9576] manager: Networking is enabled by state file
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9579] settings: Loaded settings plugin: keyfile (internal)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9583] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9615] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9630] dhcp: init: Using DHCP client 'internal'
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9634] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9641] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9647] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9656] device (lo): Activation: starting connection 'lo' (4662a9d4-1184-4934-9979-d04ebf8a1fd8)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9665] device (eth0): carrier: link connected
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9670] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9676] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9676] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9684] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9694] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9703] device (eth1): carrier: link connected
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9708] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9715] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (ef9564cf-3cba-317e-b605-2bb50bce1cb4) (indicated)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9715] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9723] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9730] device (eth1): Activation: starting connection 'Wired connection 1' (ef9564cf-3cba-317e-b605-2bb50bce1cb4)
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: Started Network Manager.
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9740] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9746] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9749] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9752] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9756] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9760] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9763] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9766] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9773] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9781] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9785] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9795] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9799] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9817] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9825] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9832] device (lo): Activation: successful, device activated.
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9841] dhcp4 (eth0): state changed new lease, address=38.102.83.173
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9853] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9929] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9962] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9963] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9967] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9971] device (eth0): Activation: successful, device activated.
Jan 21 22:47:48 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035668.9976] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 22:47:49 np0005591284.novalocal sudo[7183]: pam_unix(sudo:session): session closed for user root
Jan 21 22:47:49 np0005591284.novalocal python3[7270]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-4168-cfbf-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:47:59 np0005591284.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 22:48:18 np0005591284.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.2941] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 22:48:34 np0005591284.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 22:48:34 np0005591284.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3292] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3296] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3303] device (eth1): Activation: successful, device activated.
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3309] manager: startup complete
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3310] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <warn>  [1769035714.3315] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3322] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 21 22:48:34 np0005591284.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3429] dhcp4 (eth1): canceled DHCP transaction
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3430] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3431] dhcp4 (eth1): state changed no lease
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3445] policy: auto-activating connection 'ci-private-network' (693d23c9-22df-5d5e-b59c-efc0730a6438)
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3449] device (eth1): Activation: starting connection 'ci-private-network' (693d23c9-22df-5d5e-b59c-efc0730a6438)
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3450] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3452] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3458] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3467] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3509] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3511] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 22:48:34 np0005591284.novalocal NetworkManager[7194]: <info>  [1769035714.3516] device (eth1): Activation: successful, device activated.
Jan 21 22:48:44 np0005591284.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 22:48:49 np0005591284.novalocal sshd-session[6958]: Received disconnect from 38.102.83.114 port 57042:11: disconnected by user
Jan 21 22:48:49 np0005591284.novalocal sshd-session[6958]: Disconnected from user zuul 38.102.83.114 port 57042
Jan 21 22:48:49 np0005591284.novalocal sshd-session[6955]: pam_unix(sshd:session): session closed for user zuul
Jan 21 22:48:49 np0005591284.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 21 22:48:49 np0005591284.novalocal systemd[1]: session-3.scope: Consumed 1.742s CPU time.
Jan 21 22:48:49 np0005591284.novalocal systemd-logind[796]: Session 3 logged out. Waiting for processes to exit.
Jan 21 22:48:49 np0005591284.novalocal systemd-logind[796]: Removed session 3.
Jan 21 22:49:14 np0005591284.novalocal sshd-session[7298]: Accepted publickey for zuul from 38.102.83.114 port 52226 ssh2: RSA SHA256:LSN8GeK+nwfQgAdzsG9Fx0/CGGktcUeOM8rFlOBs7zo
Jan 21 22:49:14 np0005591284.novalocal systemd-logind[796]: New session 4 of user zuul.
Jan 21 22:49:14 np0005591284.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 21 22:49:14 np0005591284.novalocal sshd-session[7298]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:49:15 np0005591284.novalocal sudo[7377]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iovagxvwrsdizafmgepptzbqtpzefncc ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 22:49:15 np0005591284.novalocal sudo[7377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:49:15 np0005591284.novalocal python3[7379]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:49:15 np0005591284.novalocal sudo[7377]: pam_unix(sudo:session): session closed for user root
Jan 21 22:49:15 np0005591284.novalocal sudo[7450]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmzavuzfgfrpzahotkktoquqiuytzjlj ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 22:49:15 np0005591284.novalocal sudo[7450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:49:15 np0005591284.novalocal python3[7452]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769035754.9246461-365-145543753091380/source _original_basename=tmp1zynq2c9 follow=False checksum=9be2ac127257c76b31f8acdef7104cc3c2481547 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:49:15 np0005591284.novalocal sudo[7450]: pam_unix(sudo:session): session closed for user root
Jan 21 22:49:18 np0005591284.novalocal sshd-session[7301]: Connection closed by 38.102.83.114 port 52226
Jan 21 22:49:18 np0005591284.novalocal sshd-session[7298]: pam_unix(sshd:session): session closed for user zuul
Jan 21 22:49:18 np0005591284.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 21 22:49:18 np0005591284.novalocal systemd-logind[796]: Session 4 logged out. Waiting for processes to exit.
Jan 21 22:49:18 np0005591284.novalocal systemd-logind[796]: Removed session 4.
Jan 21 22:50:43 np0005591284.novalocal systemd[4308]: Created slice User Background Tasks Slice.
Jan 21 22:50:43 np0005591284.novalocal systemd[4308]: Starting Cleanup of User's Temporary Files and Directories...
Jan 21 22:50:43 np0005591284.novalocal systemd[4308]: Finished Cleanup of User's Temporary Files and Directories.
Jan 21 22:50:53 np0005591284.novalocal sshd-session[7481]: Invalid user admin from 139.19.117.129 port 46404
Jan 21 22:50:53 np0005591284.novalocal sshd-session[7481]: userauth_pubkey: signature algorithm ssh-rsa not in PubkeyAcceptedAlgorithms [preauth]
Jan 21 22:51:03 np0005591284.novalocal sshd-session[7481]: Connection closed by invalid user admin 139.19.117.129 port 46404 [preauth]
Jan 21 22:53:29 np0005591284.novalocal sshd-session[7483]: Invalid user nginx from 106.63.7.208 port 42486
Jan 21 22:53:29 np0005591284.novalocal sshd-session[7483]: Received disconnect from 106.63.7.208 port 42486:11:  [preauth]
Jan 21 22:53:29 np0005591284.novalocal sshd-session[7483]: Disconnected from invalid user nginx 106.63.7.208 port 42486 [preauth]
Jan 21 22:54:44 np0005591284.novalocal sshd-session[7486]: Accepted publickey for zuul from 38.102.83.114 port 54684 ssh2: RSA SHA256:LSN8GeK+nwfQgAdzsG9Fx0/CGGktcUeOM8rFlOBs7zo
Jan 21 22:54:44 np0005591284.novalocal systemd-logind[796]: New session 5 of user zuul.
Jan 21 22:54:45 np0005591284.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 21 22:54:45 np0005591284.novalocal sshd-session[7486]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:54:45 np0005591284.novalocal sudo[7513]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpoxveebumrlxodyvcqjpgqouowzhbtl ; /usr/bin/python3'
Jan 21 22:54:45 np0005591284.novalocal sudo[7513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:45 np0005591284.novalocal python3[7515]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ec2-ffbe-51e3-fd82-000000000ca4-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:45 np0005591284.novalocal sudo[7513]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:45 np0005591284.novalocal sudo[7541]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nliswqsifcwuvtzfejxwmoaccbttehrz ; /usr/bin/python3'
Jan 21 22:54:45 np0005591284.novalocal sudo[7541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:45 np0005591284.novalocal python3[7543]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:45 np0005591284.novalocal sudo[7541]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:45 np0005591284.novalocal sudo[7568]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nywpyjgdbwhtvvnlbmitvgoxfblkygdu ; /usr/bin/python3'
Jan 21 22:54:45 np0005591284.novalocal sudo[7568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:45 np0005591284.novalocal python3[7570]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:45 np0005591284.novalocal sudo[7568]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:46 np0005591284.novalocal sudo[7594]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxaivcelgmkegnuuwhwcvoygydaotyxn ; /usr/bin/python3'
Jan 21 22:54:46 np0005591284.novalocal sudo[7594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:46 np0005591284.novalocal python3[7596]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:46 np0005591284.novalocal sudo[7594]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:46 np0005591284.novalocal sudo[7620]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nahzijxpxxquicqovbfrwzdxzcezbpzr ; /usr/bin/python3'
Jan 21 22:54:46 np0005591284.novalocal sudo[7620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:46 np0005591284.novalocal python3[7622]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:46 np0005591284.novalocal sudo[7620]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:46 np0005591284.novalocal sudo[7646]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjqhmptjpdysfvcnpctbqkeuzxlivsue ; /usr/bin/python3'
Jan 21 22:54:46 np0005591284.novalocal sudo[7646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:46 np0005591284.novalocal python3[7648]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:46 np0005591284.novalocal sudo[7646]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:47 np0005591284.novalocal sudo[7724]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkexpzegjumlginhfcyyotqmjxnqpnfy ; /usr/bin/python3'
Jan 21 22:54:47 np0005591284.novalocal sudo[7724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:47 np0005591284.novalocal python3[7726]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:54:47 np0005591284.novalocal sudo[7724]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:47 np0005591284.novalocal sudo[7797]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mivjjkwjotxjfdtduqroazgfwtstnwnk ; /usr/bin/python3'
Jan 21 22:54:47 np0005591284.novalocal sudo[7797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:47 np0005591284.novalocal python3[7799]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769036087.2224855-365-26935730158894/source _original_basename=tmpkpfijo42 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:47 np0005591284.novalocal sudo[7797]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:49 np0005591284.novalocal sudo[7847]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjskxgjjkusxmoxhixsntoubyduplpvo ; /usr/bin/python3'
Jan 21 22:54:49 np0005591284.novalocal sudo[7847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:49 np0005591284.novalocal python3[7849]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 22:54:49 np0005591284.novalocal systemd[1]: Reloading.
Jan 21 22:54:49 np0005591284.novalocal systemd-rc-local-generator[7867]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 22:54:49 np0005591284.novalocal sudo[7847]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:51 np0005591284.novalocal sudo[7902]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltfckhghyodwtdfnxrecaoktzduueqhd ; /usr/bin/python3'
Jan 21 22:54:51 np0005591284.novalocal sudo[7902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:51 np0005591284.novalocal python3[7904]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 21 22:54:51 np0005591284.novalocal sudo[7902]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:54 np0005591284.novalocal sudo[7928]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hisnzjirdxhapptlpkhxdlqytbfilnji ; /usr/bin/python3'
Jan 21 22:54:54 np0005591284.novalocal sudo[7928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:54 np0005591284.novalocal python3[7930]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:54 np0005591284.novalocal sudo[7928]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:55 np0005591284.novalocal sudo[7956]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqsnokbyeugqskpvczssbxdmeomfdkgg ; /usr/bin/python3'
Jan 21 22:54:55 np0005591284.novalocal sudo[7956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:55 np0005591284.novalocal python3[7958]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:55 np0005591284.novalocal sudo[7956]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:55 np0005591284.novalocal sudo[7984]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzudirtketwauawwpftjemkpotzrzrhc ; /usr/bin/python3'
Jan 21 22:54:55 np0005591284.novalocal sudo[7984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:55 np0005591284.novalocal python3[7986]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:55 np0005591284.novalocal sudo[7984]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:55 np0005591284.novalocal sudo[8012]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hamzkocejzovnxqrbhpunchlvdzdlsgk ; /usr/bin/python3'
Jan 21 22:54:55 np0005591284.novalocal sudo[8012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:55 np0005591284.novalocal python3[8014]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:55 np0005591284.novalocal sudo[8012]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:56 np0005591284.novalocal python3[8041]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-51e3-fd82-000000000cab-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:57 np0005591284.novalocal python3[8071]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 21 22:55:00 np0005591284.novalocal sshd-session[7489]: Connection closed by 38.102.83.114 port 54684
Jan 21 22:55:00 np0005591284.novalocal sshd-session[7486]: pam_unix(sshd:session): session closed for user zuul
Jan 21 22:55:00 np0005591284.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 21 22:55:00 np0005591284.novalocal systemd[1]: session-5.scope: Consumed 4.125s CPU time.
Jan 21 22:55:00 np0005591284.novalocal systemd-logind[796]: Session 5 logged out. Waiting for processes to exit.
Jan 21 22:55:00 np0005591284.novalocal systemd-logind[796]: Removed session 5.
Jan 21 22:55:02 np0005591284.novalocal sshd-session[8075]: Accepted publickey for zuul from 38.102.83.114 port 44882 ssh2: RSA SHA256:LSN8GeK+nwfQgAdzsG9Fx0/CGGktcUeOM8rFlOBs7zo
Jan 21 22:55:02 np0005591284.novalocal systemd-logind[796]: New session 6 of user zuul.
Jan 21 22:55:02 np0005591284.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 21 22:55:02 np0005591284.novalocal sshd-session[8075]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:55:02 np0005591284.novalocal sudo[8102]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuaznonfgcbgmctwincpqmvywdgdaued ; /usr/bin/python3'
Jan 21 22:55:02 np0005591284.novalocal sudo[8102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:55:02 np0005591284.novalocal python3[8104]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 21 22:55:09 np0005591284.novalocal setsebool[8146]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 21 22:55:09 np0005591284.novalocal setsebool[8146]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 21 22:55:21 np0005591284.novalocal kernel: SELinux:  Converting 385 SID table entries...
Jan 21 22:55:21 np0005591284.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 22:55:21 np0005591284.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 21 22:55:21 np0005591284.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 22:55:21 np0005591284.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 21 22:55:21 np0005591284.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 22:55:21 np0005591284.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 22:55:21 np0005591284.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 22:55:32 np0005591284.novalocal kernel: SELinux:  Converting 388 SID table entries...
Jan 21 22:55:32 np0005591284.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 22:55:32 np0005591284.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 21 22:55:32 np0005591284.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 22:55:32 np0005591284.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 21 22:55:32 np0005591284.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 22:55:32 np0005591284.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 22:55:32 np0005591284.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 22:55:50 np0005591284.novalocal dbus-broker-launch[770]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 21 22:55:50 np0005591284.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 22:55:50 np0005591284.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 21 22:55:50 np0005591284.novalocal systemd[1]: Reloading.
Jan 21 22:55:50 np0005591284.novalocal systemd-rc-local-generator[8919]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 22:55:51 np0005591284.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 22:55:52 np0005591284.novalocal sudo[8102]: pam_unix(sudo:session): session closed for user root
Jan 21 22:55:53 np0005591284.novalocal python3[10939]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ec2-ffbe-17c1-e6f1-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:55:54 np0005591284.novalocal kernel: evm: overlay not supported
Jan 21 22:55:54 np0005591284.novalocal systemd[4308]: Starting D-Bus User Message Bus...
Jan 21 22:55:54 np0005591284.novalocal dbus-broker-launch[11992]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 21 22:55:54 np0005591284.novalocal dbus-broker-launch[11992]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 21 22:55:54 np0005591284.novalocal systemd[4308]: Started D-Bus User Message Bus.
Jan 21 22:55:54 np0005591284.novalocal dbus-broker-lau[11992]: Ready
Jan 21 22:55:54 np0005591284.novalocal systemd[4308]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 21 22:55:54 np0005591284.novalocal systemd[4308]: Created slice Slice /user.
Jan 21 22:55:54 np0005591284.novalocal systemd[4308]: podman-11899.scope: unit configures an IP firewall, but not running as root.
Jan 21 22:55:54 np0005591284.novalocal systemd[4308]: (This warning is only shown for the first unit using IP firewalling.)
Jan 21 22:55:54 np0005591284.novalocal systemd[4308]: Started podman-11899.scope.
Jan 21 22:55:54 np0005591284.novalocal systemd[4308]: Started podman-pause-6071ce97.scope.
Jan 21 22:55:55 np0005591284.novalocal sudo[12990]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlpsodxfjkzobvlqofoxbywzbibabrsk ; /usr/bin/python3'
Jan 21 22:55:55 np0005591284.novalocal sudo[12990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:55:55 np0005591284.novalocal python3[13013]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.27:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.27:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:55:55 np0005591284.novalocal python3[13013]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 21 22:55:55 np0005591284.novalocal sudo[12990]: pam_unix(sudo:session): session closed for user root
Jan 21 22:55:56 np0005591284.novalocal sshd-session[8078]: Connection closed by 38.102.83.114 port 44882
Jan 21 22:55:56 np0005591284.novalocal sshd-session[8075]: pam_unix(sshd:session): session closed for user zuul
Jan 21 22:55:56 np0005591284.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Jan 21 22:55:56 np0005591284.novalocal systemd[1]: session-6.scope: Consumed 44.396s CPU time.
Jan 21 22:55:56 np0005591284.novalocal systemd-logind[796]: Session 6 logged out. Waiting for processes to exit.
Jan 21 22:55:56 np0005591284.novalocal systemd-logind[796]: Removed session 6.
Jan 21 22:56:15 np0005591284.novalocal sshd-session[21542]: Unable to negotiate with 38.102.83.151 port 43970: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 21 22:56:15 np0005591284.novalocal sshd-session[21547]: Connection closed by 38.102.83.151 port 43924 [preauth]
Jan 21 22:56:15 np0005591284.novalocal sshd-session[21551]: Connection closed by 38.102.83.151 port 43930 [preauth]
Jan 21 22:56:15 np0005591284.novalocal sshd-session[21545]: Unable to negotiate with 38.102.83.151 port 43944: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 21 22:56:15 np0005591284.novalocal sshd-session[21554]: Unable to negotiate with 38.102.83.151 port 43956: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 21 22:56:20 np0005591284.novalocal sshd-session[23598]: Accepted publickey for zuul from 38.102.83.114 port 43252 ssh2: RSA SHA256:LSN8GeK+nwfQgAdzsG9Fx0/CGGktcUeOM8rFlOBs7zo
Jan 21 22:56:20 np0005591284.novalocal systemd-logind[796]: New session 7 of user zuul.
Jan 21 22:56:20 np0005591284.novalocal systemd[1]: Started Session 7 of User zuul.
Jan 21 22:56:20 np0005591284.novalocal sshd-session[23598]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:56:20 np0005591284.novalocal python3[23709]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLb1E5HwmlMjFU9nv6wd+VHV9J1rtO+UWxPZpEjo1oVR+Rls9TFII1iFAeK4/68neaHhE2B9Qc0dAUKPbHC0hoM= zuul@np0005591282.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:56:20 np0005591284.novalocal sudo[23908]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icldojogddffpouarybzwhbzlkowhtmb ; /usr/bin/python3'
Jan 21 22:56:20 np0005591284.novalocal sudo[23908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:20 np0005591284.novalocal python3[23919]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLb1E5HwmlMjFU9nv6wd+VHV9J1rtO+UWxPZpEjo1oVR+Rls9TFII1iFAeK4/68neaHhE2B9Qc0dAUKPbHC0hoM= zuul@np0005591282.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:56:21 np0005591284.novalocal sudo[23908]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:21 np0005591284.novalocal sudo[24333]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttoodckoawfnwgcdxshxdabsqgsiotxj ; /usr/bin/python3'
Jan 21 22:56:21 np0005591284.novalocal sudo[24333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:21 np0005591284.novalocal python3[24344]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005591284.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 21 22:56:21 np0005591284.novalocal useradd[24430]: new group: name=cloud-admin, GID=1002
Jan 21 22:56:21 np0005591284.novalocal useradd[24430]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 21 22:56:22 np0005591284.novalocal sudo[24333]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:22 np0005591284.novalocal sudo[24647]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drxupsnloptexlqtogvsyjiokuouatyj ; /usr/bin/python3'
Jan 21 22:56:22 np0005591284.novalocal sudo[24647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:22 np0005591284.novalocal python3[24656]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLb1E5HwmlMjFU9nv6wd+VHV9J1rtO+UWxPZpEjo1oVR+Rls9TFII1iFAeK4/68neaHhE2B9Qc0dAUKPbHC0hoM= zuul@np0005591282.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:56:22 np0005591284.novalocal sudo[24647]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:22 np0005591284.novalocal sudo[24925]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msjkhdhthzenevmtgucfirbkutcfahat ; /usr/bin/python3'
Jan 21 22:56:22 np0005591284.novalocal sudo[24925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:23 np0005591284.novalocal python3[24927]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:56:23 np0005591284.novalocal sudo[24925]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:23 np0005591284.novalocal sudo[25202]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrlbigtthjculsqenuteofjwourlwzvx ; /usr/bin/python3'
Jan 21 22:56:23 np0005591284.novalocal sudo[25202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:23 np0005591284.novalocal python3[25211]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769036182.8288577-168-197529155084307/source _original_basename=tmpi0l0phda follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:56:23 np0005591284.novalocal sudo[25202]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:24 np0005591284.novalocal sudo[25572]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pajhjnietltvquoullfyngfwgznrewkp ; /usr/bin/python3'
Jan 21 22:56:24 np0005591284.novalocal sudo[25572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:24 np0005591284.novalocal python3[25580]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 21 22:56:24 np0005591284.novalocal systemd[1]: Starting Hostname Service...
Jan 21 22:56:24 np0005591284.novalocal systemd[1]: Started Hostname Service.
Jan 21 22:56:24 np0005591284.novalocal systemd-hostnamed[25680]: Changed pretty hostname to 'compute-1'
Jan 21 22:56:24 compute-1 systemd-hostnamed[25680]: Hostname set to <compute-1> (static)
Jan 21 22:56:24 compute-1 NetworkManager[7194]: <info>  [1769036184.6650] hostname: static hostname changed from "np0005591284.novalocal" to "compute-1"
Jan 21 22:56:24 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 22:56:24 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 22:56:24 compute-1 sudo[25572]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:26 compute-1 sshd-session[23653]: Connection closed by 38.102.83.114 port 43252
Jan 21 22:56:26 compute-1 sshd-session[23598]: pam_unix(sshd:session): session closed for user zuul
Jan 21 22:56:26 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Jan 21 22:56:26 compute-1 systemd[1]: session-7.scope: Consumed 2.362s CPU time.
Jan 21 22:56:26 compute-1 systemd-logind[796]: Session 7 logged out. Waiting for processes to exit.
Jan 21 22:56:26 compute-1 systemd-logind[796]: Removed session 7.
Jan 21 22:56:34 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 22:56:36 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 22:56:36 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 21 22:56:36 compute-1 systemd[1]: man-db-cache-update.service: Consumed 54.667s CPU time.
Jan 21 22:56:36 compute-1 systemd[1]: run-r47d848f9c1e240acb595bfc776f1eb26.service: Deactivated successfully.
Jan 21 22:56:54 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 22:59:33 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 21 22:59:33 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 21 22:59:33 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 21 22:59:33 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 21 23:00:23 compute-1 sshd-session[29937]: Accepted publickey for zuul from 38.102.83.151 port 50432 ssh2: RSA SHA256:LSN8GeK+nwfQgAdzsG9Fx0/CGGktcUeOM8rFlOBs7zo
Jan 21 23:00:23 compute-1 systemd-logind[796]: New session 8 of user zuul.
Jan 21 23:00:23 compute-1 systemd[1]: Started Session 8 of User zuul.
Jan 21 23:00:23 compute-1 sshd-session[29937]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:00:24 compute-1 python3[30013]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:00:25 compute-1 sudo[30127]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvyileaahbuzomrpuysdpliaojjwqiaa ; /usr/bin/python3'
Jan 21 23:00:25 compute-1 sudo[30127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:26 compute-1 python3[30129]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:26 compute-1 sudo[30127]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:26 compute-1 sudo[30200]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgyrdfarzpdyfdmhfwqabjgizbpbpavf ; /usr/bin/python3'
Jan 21 23:00:26 compute-1 sudo[30200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:26 compute-1 python3[30202]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.692469-34006-211568400265213/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:26 compute-1 sudo[30200]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:26 compute-1 sudo[30226]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpzywacyaqzakxmasxendxpdyopfzjfg ; /usr/bin/python3'
Jan 21 23:00:26 compute-1 sudo[30226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:26 compute-1 python3[30228]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:26 compute-1 sudo[30226]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:26 compute-1 sudo[30299]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vojdcxvwfsiikeubkjetaptvrnorbzko ; /usr/bin/python3'
Jan 21 23:00:26 compute-1 sudo[30299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:27 compute-1 python3[30301]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.692469-34006-211568400265213/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:27 compute-1 sudo[30299]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:27 compute-1 sudo[30325]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fndexbhqkadehwgrdhiekmsciemztmnr ; /usr/bin/python3'
Jan 21 23:00:27 compute-1 sudo[30325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:27 compute-1 python3[30327]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:27 compute-1 sudo[30325]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:27 compute-1 sudo[30398]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gudmpapkhltxtyzwhdipqnqogywzlbpz ; /usr/bin/python3'
Jan 21 23:00:27 compute-1 sudo[30398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:27 compute-1 python3[30400]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.692469-34006-211568400265213/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:27 compute-1 sudo[30398]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:27 compute-1 sudo[30424]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbslkayiobynsnxsraidsnhjwtfiwlvh ; /usr/bin/python3'
Jan 21 23:00:27 compute-1 sudo[30424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:28 compute-1 python3[30426]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:28 compute-1 sudo[30424]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:28 compute-1 sudo[30497]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dazkvpnspqaoamjcizkiluhnxjwdbuzb ; /usr/bin/python3'
Jan 21 23:00:28 compute-1 sudo[30497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:28 compute-1 python3[30499]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.692469-34006-211568400265213/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:28 compute-1 sudo[30497]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:28 compute-1 sudo[30524]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gllhlibcdtrntyeipmwnayyqzjiuouzw ; /usr/bin/python3'
Jan 21 23:00:28 compute-1 sudo[30524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:28 compute-1 python3[30526]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:28 compute-1 sudo[30524]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:28 compute-1 sudo[30597]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeeeqfmeasvteqkqmozrfpjaxhyuqjrf ; /usr/bin/python3'
Jan 21 23:00:28 compute-1 sudo[30597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:29 compute-1 python3[30599]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.692469-34006-211568400265213/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:29 compute-1 sudo[30597]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:29 compute-1 sudo[30623]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvinqmbgyubwhnrknzplsnynbcxclwrz ; /usr/bin/python3'
Jan 21 23:00:29 compute-1 sudo[30623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:29 compute-1 python3[30625]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:29 compute-1 sudo[30623]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:29 compute-1 sudo[30696]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isvbitxpydrbhpsngsstlfwdacbbfnbp ; /usr/bin/python3'
Jan 21 23:00:29 compute-1 sudo[30696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:29 compute-1 python3[30698]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.692469-34006-211568400265213/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:29 compute-1 sudo[30696]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:29 compute-1 sudo[30722]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuheukqyfjwvrgfoczvjqjleybjnvnwn ; /usr/bin/python3'
Jan 21 23:00:29 compute-1 sudo[30722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:29 compute-1 python3[30724]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:29 compute-1 sudo[30722]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:30 compute-1 sudo[30795]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luibmeuhhdfycybfxezxquuyhvvoqmnx ; /usr/bin/python3'
Jan 21 23:00:30 compute-1 sudo[30795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:30 compute-1 python3[30797]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.692469-34006-211568400265213/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:30 compute-1 sudo[30795]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:40 compute-1 python3[30845]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:01:01 compute-1 CROND[30848]: (root) CMD (run-parts /etc/cron.hourly)
Jan 21 23:01:01 compute-1 run-parts[30851]: (/etc/cron.hourly) starting 0anacron
Jan 21 23:01:01 compute-1 anacron[30859]: Anacron started on 2026-01-21
Jan 21 23:01:01 compute-1 anacron[30859]: Will run job `cron.daily' in 35 min.
Jan 21 23:01:01 compute-1 anacron[30859]: Will run job `cron.weekly' in 55 min.
Jan 21 23:01:01 compute-1 anacron[30859]: Will run job `cron.monthly' in 75 min.
Jan 21 23:01:01 compute-1 anacron[30859]: Jobs will be executed sequentially
Jan 21 23:01:01 compute-1 run-parts[30861]: (/etc/cron.hourly) finished 0anacron
Jan 21 23:01:01 compute-1 CROND[30847]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 21 23:05:40 compute-1 sshd-session[29940]: Received disconnect from 38.102.83.151 port 50432:11: disconnected by user
Jan 21 23:05:40 compute-1 sshd-session[29940]: Disconnected from user zuul 38.102.83.151 port 50432
Jan 21 23:05:40 compute-1 sshd-session[29937]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:05:40 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Jan 21 23:05:40 compute-1 systemd[1]: session-8.scope: Consumed 5.503s CPU time.
Jan 21 23:05:40 compute-1 systemd-logind[796]: Session 8 logged out. Waiting for processes to exit.
Jan 21 23:05:40 compute-1 systemd-logind[796]: Removed session 8.
Jan 21 23:06:46 compute-1 sshd-session[30864]: Invalid user ubuntu from 38.67.240.124 port 55797
Jan 21 23:06:46 compute-1 sshd-session[30864]: Received disconnect from 38.67.240.124 port 55797:11:  [preauth]
Jan 21 23:06:46 compute-1 sshd-session[30864]: Disconnected from invalid user ubuntu 38.67.240.124 port 55797 [preauth]
Jan 21 23:14:16 compute-1 sshd-session[30869]: Connection closed by 203.83.238.251 port 45952
Jan 21 23:15:34 compute-1 sshd-session[30871]: Accepted publickey for zuul from 192.168.122.30 port 57174 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:15:34 compute-1 systemd-logind[796]: New session 9 of user zuul.
Jan 21 23:15:34 compute-1 systemd[1]: Started Session 9 of User zuul.
Jan 21 23:15:34 compute-1 sshd-session[30871]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:15:35 compute-1 python3.9[31024]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:15:36 compute-1 sudo[31204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxtvuxuzawbssiyqjeftoxpkppfqfrie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037335.929079-57-209498983390231/AnsiballZ_command.py'
Jan 21 23:15:36 compute-1 sudo[31204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:15:36 compute-1 python3.9[31206]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:15:43 compute-1 sudo[31204]: pam_unix(sudo:session): session closed for user root
Jan 21 23:15:51 compute-1 sshd-session[30874]: Connection closed by 192.168.122.30 port 57174
Jan 21 23:15:51 compute-1 sshd-session[30871]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:15:51 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Jan 21 23:15:51 compute-1 systemd[1]: session-9.scope: Consumed 7.727s CPU time.
Jan 21 23:15:51 compute-1 systemd-logind[796]: Session 9 logged out. Waiting for processes to exit.
Jan 21 23:15:51 compute-1 systemd-logind[796]: Removed session 9.
Jan 21 23:16:07 compute-1 sshd-session[31264]: Accepted publickey for zuul from 192.168.122.30 port 59234 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:16:07 compute-1 systemd-logind[796]: New session 10 of user zuul.
Jan 21 23:16:07 compute-1 systemd[1]: Started Session 10 of User zuul.
Jan 21 23:16:07 compute-1 sshd-session[31264]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:16:08 compute-1 python3.9[31417]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 21 23:16:09 compute-1 python3.9[31591]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:16:10 compute-1 sudo[31741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjfpinfszstejwrxgycdpetyzxawezpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037369.696686-94-74116050719019/AnsiballZ_command.py'
Jan 21 23:16:10 compute-1 sudo[31741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:10 compute-1 python3.9[31743]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:16:10 compute-1 sudo[31741]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:11 compute-1 sudo[31894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efxzshyatpjfuscdykvgxhtsjunnjeza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037370.8869646-130-255819616764204/AnsiballZ_stat.py'
Jan 21 23:16:11 compute-1 sudo[31894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:11 compute-1 python3.9[31896]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:16:11 compute-1 sudo[31894]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:12 compute-1 sudo[32046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgqdzemxpkowqerrubvcebpshwnpwldg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037371.8222892-154-66541662027953/AnsiballZ_file.py'
Jan 21 23:16:12 compute-1 sudo[32046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:12 compute-1 python3.9[32048]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:16:12 compute-1 sudo[32046]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:13 compute-1 sudo[32198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iszhlqdgobwkcqjqsakyoafababvvhkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037372.8068635-178-190365839003649/AnsiballZ_stat.py'
Jan 21 23:16:13 compute-1 sudo[32198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:13 compute-1 python3.9[32200]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:16:13 compute-1 sudo[32198]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:13 compute-1 sudo[32321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjzxuhdbadhveojfcnrphysdwklioyed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037372.8068635-178-190365839003649/AnsiballZ_copy.py'
Jan 21 23:16:13 compute-1 sudo[32321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:14 compute-1 python3.9[32323]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037372.8068635-178-190365839003649/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:16:14 compute-1 sudo[32321]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:14 compute-1 sudo[32473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmeuzrogqydmjxpegocsysofryuspaca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037374.3078394-223-123745823405160/AnsiballZ_setup.py'
Jan 21 23:16:14 compute-1 sudo[32473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:15 compute-1 python3.9[32475]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:16:15 compute-1 sudo[32473]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:15 compute-1 sudo[32629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amllclflyxmcmmyuijjoxeexqbrnnfsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037375.4599001-247-143641935385134/AnsiballZ_file.py'
Jan 21 23:16:15 compute-1 sudo[32629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:16 compute-1 python3.9[32631]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:16:16 compute-1 sudo[32629]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:16 compute-1 sudo[32781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scbdgiocgujyptflndlbzjxwvxypokud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037376.2727582-274-261943271645573/AnsiballZ_file.py'
Jan 21 23:16:16 compute-1 sudo[32781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:16 compute-1 python3.9[32783]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:16:16 compute-1 sudo[32781]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:17 compute-1 python3.9[32933]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:16:22 compute-1 python3.9[33186]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:16:22 compute-1 python3.9[33336]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:16:24 compute-1 python3.9[33490]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:16:25 compute-1 sudo[33646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okbqwpisgteosrjveslihnvkhckgwysf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037385.1312509-418-240938910443821/AnsiballZ_setup.py'
Jan 21 23:16:25 compute-1 sudo[33646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:25 compute-1 python3.9[33648]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:16:25 compute-1 sudo[33646]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:26 compute-1 sudo[33730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyzhctqjhcjkdnsgvghpjxvodzwbgsod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037385.1312509-418-240938910443821/AnsiballZ_dnf.py'
Jan 21 23:16:26 compute-1 sudo[33730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:26 compute-1 python3.9[33732]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:17:10 compute-1 systemd[1]: Reloading.
Jan 21 23:17:10 compute-1 systemd-rc-local-generator[33920]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:17:10 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 21 23:17:10 compute-1 systemd[1]: Reloading.
Jan 21 23:17:10 compute-1 systemd-rc-local-generator[33972]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:17:10 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 21 23:17:10 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 21 23:17:11 compute-1 systemd[1]: Reloading.
Jan 21 23:17:11 compute-1 systemd-rc-local-generator[34013]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:17:11 compute-1 systemd[1]: Starting dnf makecache...
Jan 21 23:17:11 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 21 23:17:11 compute-1 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 21 23:17:11 compute-1 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 21 23:17:11 compute-1 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 21 23:17:11 compute-1 dnf[34021]: Failed determining last makecache time.
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-openstack-barbican-42b4c41831408a8e323 159 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 205 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-openstack-cinder-1c00d6490d88e436f26ef 209 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-python-stevedore-c4acc5639fd2329372142 173 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-python-cloudkitty-tests-tempest-2c80f8 194 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-os-refresh-config-9bfc52b5049be2d8de61 206 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 171 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-python-designate-tests-tempest-347fdbc 188 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-openstack-glance-1fd12c29b339f30fe823e 190 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 194 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-openstack-manila-3c01b7181572c95dac462 141 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-python-whitebox-neutron-tests-tempest- 189 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-openstack-octavia-ba397f07a7331190208c 178 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-openstack-watcher-c014f81a8647287f6dcc 173 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-ansible-config_template-5ccaa22121a7ff 188 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 178 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-openstack-swift-dc98a8463506ac520c469a 199 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-python-tempestconf-8515371b7cceebd4282 196 kB/s | 3.0 kB     00:00
Jan 21 23:17:11 compute-1 dnf[34021]: delorean-openstack-heat-ui-013accbfd179753bc3f0 190 kB/s | 3.0 kB     00:00
Jan 21 23:17:12 compute-1 dnf[34021]: CentOS Stream 9 - BaseOS                         19 kB/s | 6.7 kB     00:00
Jan 21 23:17:12 compute-1 dnf[34021]: CentOS Stream 9 - AppStream                      62 kB/s | 6.8 kB     00:00
Jan 21 23:17:12 compute-1 dnf[34021]: CentOS Stream 9 - CRB                            27 kB/s | 6.6 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: CentOS Stream 9 - Extras packages                19 kB/s | 7.3 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: dlrn-antelope-testing                           102 kB/s | 3.0 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: dlrn-antelope-build-deps                        133 kB/s | 3.0 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: centos9-rabbitmq                                111 kB/s | 3.0 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: centos9-storage                                 118 kB/s | 3.0 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: centos9-opstools                                 82 kB/s | 3.0 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: NFV SIG OpenvSwitch                              83 kB/s | 3.0 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: repo-setup-centos-appstream                     135 kB/s | 4.4 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: repo-setup-centos-baseos                        172 kB/s | 3.9 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: repo-setup-centos-highavailability              159 kB/s | 3.9 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: repo-setup-centos-powertools                    180 kB/s | 4.3 kB     00:00
Jan 21 23:17:13 compute-1 dnf[34021]: Extra Packages for Enterprise Linux 9 - x86_64  109 kB/s |  33 kB     00:00
Jan 21 23:17:14 compute-1 dnf[34021]: Metadata cache created.
Jan 21 23:17:14 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 21 23:17:14 compute-1 systemd[1]: Finished dnf makecache.
Jan 21 23:17:14 compute-1 systemd[1]: dnf-makecache.service: Consumed 1.897s CPU time.
Jan 21 23:18:15 compute-1 kernel: SELinux:  Converting 2724 SID table entries...
Jan 21 23:18:15 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:18:15 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:18:15 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:18:15 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:18:15 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:18:15 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:18:15 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:18:15 compute-1 dbus-broker-launch[770]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 21 23:18:16 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:18:16 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:18:16 compute-1 systemd[1]: Reloading.
Jan 21 23:18:16 compute-1 systemd-rc-local-generator[34381]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:18:16 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:18:16 compute-1 sudo[33730]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:17 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:18:17 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:18:17 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.352s CPU time.
Jan 21 23:18:17 compute-1 systemd[1]: run-r72a5f38aa571458c8e08802ff586ca29.service: Deactivated successfully.
Jan 21 23:18:45 compute-1 sudo[35291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcxzdndkwqeitvzgvsuwsriydvvuwydy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037525.0726728-455-158116643406081/AnsiballZ_command.py'
Jan 21 23:18:45 compute-1 sudo[35291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:45 compute-1 python3.9[35293]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:18:46 compute-1 sudo[35291]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:47 compute-1 sudo[35572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wauvnfgjkhsopekholrqkvcyodxvyfxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037526.7602332-478-180950409484647/AnsiballZ_selinux.py'
Jan 21 23:18:47 compute-1 sudo[35572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:47 compute-1 python3.9[35574]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 21 23:18:47 compute-1 sudo[35572]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:48 compute-1 sudo[35724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pquqxhufekmqnldpjefjehrqykgcqxkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037528.263971-511-150910773912529/AnsiballZ_command.py'
Jan 21 23:18:48 compute-1 sudo[35724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:48 compute-1 python3.9[35726]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 21 23:18:49 compute-1 sudo[35724]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:52 compute-1 sudo[35877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgzcxfapsvcxibirrqqbwdcqnguzcfdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037532.0178888-535-166766537569851/AnsiballZ_file.py'
Jan 21 23:18:52 compute-1 sudo[35877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:52 compute-1 python3.9[35879]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:18:52 compute-1 sudo[35877]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:53 compute-1 sudo[36029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvidmfqbxukpfkjbdyqzetnnofrbajyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037532.9768627-559-178844634053320/AnsiballZ_mount.py'
Jan 21 23:18:53 compute-1 sudo[36029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:54 compute-1 python3.9[36031]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 21 23:18:54 compute-1 sudo[36029]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:55 compute-1 sudo[36181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imhrpidmukkcjevbdandhodwytfzmaff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037535.5150285-643-145713286200448/AnsiballZ_file.py'
Jan 21 23:18:55 compute-1 sudo[36181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:58 compute-1 python3.9[36183]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:18:58 compute-1 sudo[36181]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:59 compute-1 sudo[36333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyarguizqrmsodtspstbnmwyjqgnagon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037538.7285345-667-1957169658881/AnsiballZ_stat.py'
Jan 21 23:18:59 compute-1 sudo[36333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:00 compute-1 python3.9[36335]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:19:00 compute-1 sudo[36333]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:01 compute-1 sudo[36456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-covgulsyvemjvkujgcumfwxtkvavbkfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037538.7285345-667-1957169658881/AnsiballZ_copy.py'
Jan 21 23:19:01 compute-1 sudo[36456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:01 compute-1 python3.9[36458]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037538.7285345-667-1957169658881/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:19:01 compute-1 sudo[36456]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:03 compute-1 irqbalance[786]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 21 23:19:03 compute-1 irqbalance[786]: IRQ 26 affinity is now unmanaged
Jan 21 23:19:04 compute-1 sudo[36608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipxcyuqrdyyfwbaodrrycwshdpcjjbfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037543.9968336-739-46799473520015/AnsiballZ_stat.py'
Jan 21 23:19:04 compute-1 sudo[36608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:04 compute-1 python3.9[36610]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:19:04 compute-1 sudo[36608]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:06 compute-1 sudo[36760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfdipiroiqskxmwpgfmclgdbaozbvpvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037545.71673-763-169171258467434/AnsiballZ_command.py'
Jan 21 23:19:06 compute-1 sudo[36760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:06 compute-1 python3.9[36762]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:06 compute-1 sudo[36760]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:06 compute-1 sudo[36913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtujrxwcrtpxtunbwjhvpowvqkbynlfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037546.569561-787-39358162898603/AnsiballZ_file.py'
Jan 21 23:19:06 compute-1 sudo[36913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:07 compute-1 python3.9[36915]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:19:07 compute-1 sudo[36913]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:08 compute-1 sudo[37065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehuzwioqlwqfurbowmwkgfwrthtigucv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037547.6505926-820-87444128770476/AnsiballZ_getent.py'
Jan 21 23:19:08 compute-1 sudo[37065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:08 compute-1 python3.9[37067]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 21 23:19:08 compute-1 sudo[37065]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:08 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:19:09 compute-1 sudo[37219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsesfnytwwgtoqtcezcoddfvpdvcnovc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037548.6785548-844-135127334633121/AnsiballZ_group.py'
Jan 21 23:19:09 compute-1 sudo[37219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:09 compute-1 python3.9[37221]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:19:09 compute-1 groupadd[37222]: group added to /etc/group: name=qemu, GID=107
Jan 21 23:19:09 compute-1 groupadd[37222]: group added to /etc/gshadow: name=qemu
Jan 21 23:19:09 compute-1 groupadd[37222]: new group: name=qemu, GID=107
Jan 21 23:19:09 compute-1 sudo[37219]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:10 compute-1 sudo[37377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ongrgejuubnjmqokmqqornrbekdvnzto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037550.1318872-868-243406518129375/AnsiballZ_user.py'
Jan 21 23:19:10 compute-1 sudo[37377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:11 compute-1 python3.9[37379]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 23:19:11 compute-1 useradd[37381]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 23:19:11 compute-1 sudo[37377]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:11 compute-1 sudo[37537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztqylhcrdiklxxigjuryzwmsrzfqtwtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037551.651777-892-226973796765330/AnsiballZ_getent.py'
Jan 21 23:19:11 compute-1 sudo[37537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:12 compute-1 python3.9[37539]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 21 23:19:12 compute-1 sudo[37537]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:12 compute-1 sudo[37690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rugwcslwpiuclupkvvjzbshjnogmbetb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037552.4546719-916-163871064377401/AnsiballZ_group.py'
Jan 21 23:19:12 compute-1 sudo[37690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:13 compute-1 python3.9[37692]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:19:13 compute-1 groupadd[37693]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 21 23:19:13 compute-1 groupadd[37693]: group added to /etc/gshadow: name=hugetlbfs
Jan 21 23:19:13 compute-1 groupadd[37693]: new group: name=hugetlbfs, GID=42477
Jan 21 23:19:13 compute-1 sudo[37690]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:13 compute-1 sudo[37848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxmweytyzlmkieehxwybpyfjviwspxom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037553.421971-943-165946058981450/AnsiballZ_file.py'
Jan 21 23:19:13 compute-1 sudo[37848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:14 compute-1 python3.9[37850]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 21 23:19:14 compute-1 sudo[37848]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:15 compute-1 sudo[38000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keexwpoqpbciatecshlafjesxsrodves ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037554.666839-976-172078049613803/AnsiballZ_dnf.py'
Jan 21 23:19:15 compute-1 sudo[38000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:15 compute-1 python3.9[38002]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:19:16 compute-1 sudo[38000]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:18 compute-1 sudo[38153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dysrojoxoetfcujbkphaiuhnbqpakhws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037557.663746-1000-80058475979645/AnsiballZ_file.py'
Jan 21 23:19:18 compute-1 sudo[38153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:18 compute-1 python3.9[38155]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:19:18 compute-1 sudo[38153]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:18 compute-1 sudo[38305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnwdrtctpwtxtdflymxcgcatliifhopz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037558.524539-1024-88941549425328/AnsiballZ_stat.py'
Jan 21 23:19:18 compute-1 sudo[38305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:19 compute-1 python3.9[38307]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:19:19 compute-1 sudo[38305]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:19 compute-1 sudo[38428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuqeeryljfgdjkmidjzivdlxjoesdwrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037558.524539-1024-88941549425328/AnsiballZ_copy.py'
Jan 21 23:19:19 compute-1 sudo[38428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:19 compute-1 python3.9[38430]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037558.524539-1024-88941549425328/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:19:19 compute-1 sudo[38428]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:20 compute-1 sudo[38580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epmfvsgywepmrzqwydielbhzwljngogj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037559.9771147-1069-145930319609615/AnsiballZ_systemd.py'
Jan 21 23:19:20 compute-1 sudo[38580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:20 compute-1 python3.9[38582]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:19:21 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 21 23:19:21 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 21 23:19:21 compute-1 kernel: Bridge firewalling registered
Jan 21 23:19:21 compute-1 systemd-modules-load[38586]: Inserted module 'br_netfilter'
Jan 21 23:19:21 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 21 23:19:21 compute-1 sudo[38580]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:22 compute-1 sudo[38739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwrzxzmgllokfgdxxtbbjvieifsuomgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037562.3495142-1093-98587456492391/AnsiballZ_stat.py'
Jan 21 23:19:22 compute-1 sudo[38739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:22 compute-1 python3.9[38741]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:19:22 compute-1 sudo[38739]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:23 compute-1 sudo[38862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zygpmkaceodopwzumgojfgughvfbfqat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037562.3495142-1093-98587456492391/AnsiballZ_copy.py'
Jan 21 23:19:23 compute-1 sudo[38862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:23 compute-1 python3.9[38864]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037562.3495142-1093-98587456492391/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:19:23 compute-1 sudo[38862]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:24 compute-1 sudo[39014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjyyqkpckkqgfzqdmhfketxwyjvcwuqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037564.073652-1147-56080161277452/AnsiballZ_dnf.py'
Jan 21 23:19:24 compute-1 sudo[39014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:24 compute-1 python3.9[39016]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:19:28 compute-1 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 21 23:19:28 compute-1 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 21 23:19:28 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:19:28 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:19:28 compute-1 systemd[1]: Reloading.
Jan 21 23:19:28 compute-1 systemd-rc-local-generator[39078]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:19:28 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:19:29 compute-1 sudo[39014]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:30 compute-1 python3.9[40719]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:19:31 compute-1 python3.9[41563]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 21 23:19:31 compute-1 python3.9[42335]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:19:32 compute-1 sudo[43186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eawwykkuissomilcgdaotsdeaejqsuuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037572.3317602-1264-281454347488299/AnsiballZ_command.py'
Jan 21 23:19:32 compute-1 sudo[43186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:32 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:19:32 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:19:32 compute-1 systemd[1]: man-db-cache-update.service: Consumed 5.250s CPU time.
Jan 21 23:19:32 compute-1 systemd[1]: run-r6a951e81ddd742d49e0b4932a0062e62.service: Deactivated successfully.
Jan 21 23:19:32 compute-1 python3.9[43188]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:32 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 21 23:19:33 compute-1 systemd[1]: Starting Authorization Manager...
Jan 21 23:19:33 compute-1 polkitd[43406]: Started polkitd version 0.117
Jan 21 23:19:33 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 21 23:19:33 compute-1 polkitd[43406]: Loading rules from directory /etc/polkit-1/rules.d
Jan 21 23:19:33 compute-1 polkitd[43406]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 21 23:19:33 compute-1 polkitd[43406]: Finished loading, compiling and executing 2 rules
Jan 21 23:19:33 compute-1 systemd[1]: Started Authorization Manager.
Jan 21 23:19:33 compute-1 polkitd[43406]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 21 23:19:33 compute-1 sudo[43186]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:34 compute-1 sudo[43574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtqamcebljvrwodvrseqgypdfkswuttp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037573.815597-1291-151451852471999/AnsiballZ_systemd.py'
Jan 21 23:19:34 compute-1 sudo[43574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:34 compute-1 python3.9[43576]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:19:34 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 21 23:19:34 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Jan 21 23:19:34 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 21 23:19:34 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 21 23:19:34 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 21 23:19:34 compute-1 sudo[43574]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:35 compute-1 python3.9[43738]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 21 23:19:39 compute-1 sudo[43888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkbegpwguknszuvbtohltwnivbhxrotc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037578.9521422-1462-229807953209015/AnsiballZ_systemd.py'
Jan 21 23:19:39 compute-1 sudo[43888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:39 compute-1 python3.9[43890]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:19:39 compute-1 systemd[1]: Reloading.
Jan 21 23:19:39 compute-1 systemd-rc-local-generator[43915]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:19:39 compute-1 sudo[43888]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:40 compute-1 sudo[44077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cejoknikryxkxpclmqlcftweqefehxsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037580.0870323-1462-188790645192105/AnsiballZ_systemd.py'
Jan 21 23:19:40 compute-1 sudo[44077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:40 compute-1 python3.9[44079]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:19:40 compute-1 systemd[1]: Reloading.
Jan 21 23:19:40 compute-1 systemd-rc-local-generator[44110]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:19:41 compute-1 sudo[44077]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:41 compute-1 sudo[44266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kivwukddjmdmhkrdqrnbkmrzwvpaztvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037581.4194274-1510-104598171819418/AnsiballZ_command.py'
Jan 21 23:19:41 compute-1 sudo[44266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:41 compute-1 python3.9[44268]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:41 compute-1 sudo[44266]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:42 compute-1 sudo[44419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elpamultclkrfskdwskdmqopqzlhaluh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037582.188565-1534-228356648103345/AnsiballZ_command.py'
Jan 21 23:19:42 compute-1 sudo[44419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:42 compute-1 python3.9[44421]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:42 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 21 23:19:42 compute-1 sudo[44419]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:43 compute-1 sudo[44572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tknhqslekwbmevoousqhcpwggbaeyzox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037582.9726765-1559-166357951610100/AnsiballZ_command.py'
Jan 21 23:19:43 compute-1 sudo[44572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:43 compute-1 python3.9[44574]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:44 compute-1 sudo[44572]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:45 compute-1 sudo[44734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chbrtjhskolddnutaspfossqvmpomwca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037585.1690361-1582-136134278124032/AnsiballZ_command.py'
Jan 21 23:19:45 compute-1 sudo[44734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:45 compute-1 python3.9[44736]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:45 compute-1 sudo[44734]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:46 compute-1 sudo[44887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idcogiazkbunllzugwlupyodgujolxxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037585.9165256-1606-154027637862017/AnsiballZ_systemd.py'
Jan 21 23:19:46 compute-1 sudo[44887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:46 compute-1 python3.9[44889]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:19:46 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 21 23:19:46 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Jan 21 23:19:46 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Jan 21 23:19:46 compute-1 systemd[1]: Starting Apply Kernel Variables...
Jan 21 23:19:46 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 21 23:19:46 compute-1 systemd[1]: Finished Apply Kernel Variables.
Jan 21 23:19:46 compute-1 sudo[44887]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:47 compute-1 sshd-session[31267]: Connection closed by 192.168.122.30 port 59234
Jan 21 23:19:47 compute-1 sshd-session[31264]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:19:47 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Jan 21 23:19:47 compute-1 systemd[1]: session-10.scope: Consumed 2min 17.859s CPU time.
Jan 21 23:19:47 compute-1 systemd-logind[796]: Session 10 logged out. Waiting for processes to exit.
Jan 21 23:19:47 compute-1 systemd-logind[796]: Removed session 10.
Jan 21 23:19:52 compute-1 sshd-session[44919]: Accepted publickey for zuul from 192.168.122.30 port 55060 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:19:52 compute-1 systemd-logind[796]: New session 11 of user zuul.
Jan 21 23:19:52 compute-1 systemd[1]: Started Session 11 of User zuul.
Jan 21 23:19:52 compute-1 sshd-session[44919]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:19:54 compute-1 python3.9[45072]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:19:55 compute-1 python3.9[45226]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:19:56 compute-1 sudo[45380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqdyxvvrrgvtwykyakmrblxlfxusldup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037596.1953433-111-139062163600323/AnsiballZ_command.py'
Jan 21 23:19:56 compute-1 sudo[45380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:56 compute-1 python3.9[45382]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:56 compute-1 sudo[45380]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:57 compute-1 python3.9[45533]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:19:58 compute-1 sudo[45687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kurydmaqykokxrdsjqxicbzkyxqtfnng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037598.4524734-171-59683334738396/AnsiballZ_setup.py'
Jan 21 23:19:58 compute-1 sudo[45687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:59 compute-1 python3.9[45689]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:19:59 compute-1 sudo[45687]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:59 compute-1 sudo[45771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzfxxeahbwxpffdaacpjdpsvsixalrxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037598.4524734-171-59683334738396/AnsiballZ_dnf.py'
Jan 21 23:19:59 compute-1 sudo[45771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:59 compute-1 python3.9[45773]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:20:01 compute-1 sudo[45771]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:02 compute-1 sudo[45924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eawrzcnmkytjyaztobusevxemhcigmfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037601.6690073-207-184772572352382/AnsiballZ_setup.py'
Jan 21 23:20:02 compute-1 sudo[45924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:02 compute-1 python3.9[45926]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:20:02 compute-1 sudo[45924]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:03 compute-1 sudo[46095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfysaothaniywozqbzbdxncynfzimgqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037602.8191617-240-66195721658206/AnsiballZ_file.py'
Jan 21 23:20:03 compute-1 sudo[46095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:03 compute-1 python3.9[46097]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:20:03 compute-1 sudo[46095]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:04 compute-1 sudo[46247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwgezyxpcktvxqthrhdqghlfdrkmmwcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037603.735739-264-254314963345648/AnsiballZ_command.py'
Jan 21 23:20:04 compute-1 sudo[46247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:04 compute-1 python3.9[46249]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:20:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat1789323252-merged.mount: Deactivated successfully.
Jan 21 23:20:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck2562635281-merged.mount: Deactivated successfully.
Jan 21 23:20:04 compute-1 podman[46250]: 2026-01-21 23:20:04.272374009 +0000 UTC m=+0.050448841 system refresh
Jan 21 23:20:04 compute-1 sudo[46247]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:04 compute-1 sudo[46411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icnxmebetecupbjjztpbijeglwzdzami ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037604.5325112-288-9954962571525/AnsiballZ_stat.py'
Jan 21 23:20:04 compute-1 sudo[46411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:05 compute-1 python3.9[46413]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:20:05 compute-1 sudo[46411]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:05 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:20:05 compute-1 sudo[46534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wraxjcelgwcppzxrombykwafmvktojjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037604.5325112-288-9954962571525/AnsiballZ_copy.py'
Jan 21 23:20:05 compute-1 sudo[46534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:05 compute-1 python3.9[46536]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037604.5325112-288-9954962571525/.source.json follow=False _original_basename=podman_network_config.j2 checksum=a438cf9900345e24fabefde6aef5e545d5e1b90e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:20:05 compute-1 sudo[46534]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:06 compute-1 sudo[46686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouoorhagfpppddugrlcvzxxtcubjpjta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037606.0564966-333-14694398745130/AnsiballZ_stat.py'
Jan 21 23:20:06 compute-1 sudo[46686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:06 compute-1 python3.9[46688]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:20:06 compute-1 sudo[46686]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:07 compute-1 sudo[46809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ingepttvoaennuweclhhezcorzxnryeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037606.0564966-333-14694398745130/AnsiballZ_copy.py'
Jan 21 23:20:07 compute-1 sudo[46809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:07 compute-1 python3.9[46811]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037606.0564966-333-14694398745130/.source.conf follow=False _original_basename=registries.conf.j2 checksum=51f7dfe021bf6a784cb4010cf142a3df219fb1a0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:20:07 compute-1 sudo[46809]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:08 compute-1 sudo[46961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chtooqmyziapsdkatlaaacmvbavbsfhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037607.5863705-381-11414011196131/AnsiballZ_ini_file.py'
Jan 21 23:20:08 compute-1 sudo[46961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:08 compute-1 python3.9[46963]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:20:08 compute-1 sudo[46961]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:08 compute-1 sudo[47113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxrolncymifrkxuotteindmzawageoap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037608.4798-381-170174552867676/AnsiballZ_ini_file.py'
Jan 21 23:20:08 compute-1 sudo[47113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:08 compute-1 python3.9[47115]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:20:09 compute-1 sudo[47113]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:09 compute-1 sudo[47265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agxbckjnojfvehrhgygkijmjzqxzhodq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037609.189266-381-210634226968362/AnsiballZ_ini_file.py'
Jan 21 23:20:09 compute-1 sudo[47265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:09 compute-1 python3.9[47267]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:20:09 compute-1 sudo[47265]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:10 compute-1 sudo[47417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plkawenqoazilmawfohzugbnrgvylsus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037609.8855953-381-137102931770942/AnsiballZ_ini_file.py'
Jan 21 23:20:10 compute-1 sudo[47417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:10 compute-1 python3.9[47419]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:20:10 compute-1 sudo[47417]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:11 compute-1 python3.9[47569]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:20:12 compute-1 sudo[47721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybrtozccwotknofxqkozlevhfzydafbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037611.7126946-501-44517487925508/AnsiballZ_dnf.py'
Jan 21 23:20:12 compute-1 sudo[47721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:12 compute-1 python3.9[47723]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:13 compute-1 sudo[47721]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:14 compute-1 sudo[47874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocafffnjztmpsfusjqlhmsrjkxvkkxqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037613.6812093-525-223287823711941/AnsiballZ_dnf.py'
Jan 21 23:20:14 compute-1 sudo[47874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:14 compute-1 python3.9[47876]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:16 compute-1 sudo[47874]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:17 compute-1 sudo[48034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpzkgyfyndhndrqissrzztyqjyzdrxsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037616.849958-555-240746316585013/AnsiballZ_dnf.py'
Jan 21 23:20:17 compute-1 sudo[48034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:17 compute-1 python3.9[48036]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:18 compute-1 sudo[48034]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:20 compute-1 sudo[48187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atueevtcgutdwzkxdgzndonpvzzaioca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037619.6988862-582-249883537447329/AnsiballZ_dnf.py'
Jan 21 23:20:20 compute-1 sudo[48187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:20 compute-1 python3.9[48189]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:21 compute-1 sudo[48187]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:22 compute-1 sudo[48340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysjxtjavkgdyddwskdbewksdvidqyayh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037622.0247765-615-105110922897554/AnsiballZ_dnf.py'
Jan 21 23:20:22 compute-1 sudo[48340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:22 compute-1 python3.9[48342]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:23 compute-1 sudo[48340]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:24 compute-1 sudo[48496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaojgaqjwlupvrckbyfrjvpgvobcsafr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037624.296101-639-7634178266413/AnsiballZ_dnf.py'
Jan 21 23:20:24 compute-1 sudo[48496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:24 compute-1 python3.9[48498]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:27 compute-1 sudo[48496]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:28 compute-1 sudo[48666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpuubocwidlvnfvghwxeloitsqqykvda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037628.2544913-666-261452970934377/AnsiballZ_dnf.py'
Jan 21 23:20:28 compute-1 sudo[48666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:28 compute-1 python3.9[48668]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:29 compute-1 sudo[48666]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:30 compute-1 sudo[48819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aibevgvklraoojmglyohgrtmdqappxql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037630.289597-693-30720270567987/AnsiballZ_dnf.py'
Jan 21 23:20:30 compute-1 sudo[48819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:30 compute-1 python3.9[48821]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:42 compute-1 sudo[48819]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:50 compute-1 sudo[49154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxhcvtrjsxiwhpandyrxxghuazrdlidt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037650.3935802-720-63735089193988/AnsiballZ_dnf.py'
Jan 21 23:20:50 compute-1 sudo[49154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:50 compute-1 python3.9[49156]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:52 compute-1 sudo[49154]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:53 compute-1 sudo[49310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oslyzbqascsxmtqpccacllgyldrqukjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037653.1303449-750-31897755256393/AnsiballZ_dnf.py'
Jan 21 23:20:53 compute-1 sudo[49310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:53 compute-1 python3.9[49312]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:55 compute-1 sudo[49310]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:56 compute-1 sudo[49467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmyfhjmbxttaplvisnhwrppghuxihtkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037656.1073287-783-248607595776481/AnsiballZ_file.py'
Jan 21 23:20:56 compute-1 sudo[49467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:56 compute-1 python3.9[49469]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:20:56 compute-1 sudo[49467]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:57 compute-1 sudo[49642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfwgexxjexqpnphrlcyzykrhekftowia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037656.8339286-807-110914080255176/AnsiballZ_stat.py'
Jan 21 23:20:57 compute-1 sudo[49642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:57 compute-1 python3.9[49644]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:20:57 compute-1 sudo[49642]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:57 compute-1 sudo[49765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbocxuuwzmmvbuswryreietjhnplreoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037656.8339286-807-110914080255176/AnsiballZ_copy.py'
Jan 21 23:20:57 compute-1 sudo[49765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:57 compute-1 python3.9[49767]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769037656.8339286-807-110914080255176/.source.json _original_basename=.quh34yh6 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:20:57 compute-1 sudo[49765]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:58 compute-1 sudo[49917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvtmbdocujrsrlhsgcoxbyedbzbnwcng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037658.4118001-861-116317745087747/AnsiballZ_podman_image.py'
Jan 21 23:20:58 compute-1 sudo[49917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:59 compute-1 python3.9[49919]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 23:20:59 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat1776876879-lower\x2dmapped.mount: Deactivated successfully.
Jan 21 23:21:04 compute-1 podman[49931]: 2026-01-21 23:21:04.45514721 +0000 UTC m=+5.304658149 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 23:21:04 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:04 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:04 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:04 compute-1 sudo[49917]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:09 compute-1 sudo[50227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmoguhkdzadmvufbywbgurqjaqdfxqwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037669.0377812-900-109787495303309/AnsiballZ_podman_image.py'
Jan 21 23:21:09 compute-1 sudo[50227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:09 compute-1 python3.9[50229]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 23:21:09 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:26 compute-1 podman[50241]: 2026-01-21 23:21:26.367061949 +0000 UTC m=+16.734284063 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 23:21:26 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:26 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:26 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:26 compute-1 sudo[50227]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:28 compute-1 sudo[50527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzkvqdkhtofqoullcjdnkmehezcmgdth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037688.2485495-933-257580107789456/AnsiballZ_podman_image.py'
Jan 21 23:21:28 compute-1 sudo[50527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:29 compute-1 python3.9[50529]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 23:21:29 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:32 compute-1 podman[50540]: 2026-01-21 23:21:32.540810384 +0000 UTC m=+2.964418949 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 21 23:21:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:32 compute-1 sudo[50527]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:33 compute-1 sudo[50792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laesahzwwyxwnopqxcfpypqsazytaoij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037693.0711854-933-177712028225043/AnsiballZ_podman_image.py'
Jan 21 23:21:33 compute-1 sudo[50792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:33 compute-1 python3.9[50794]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 23:21:33 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:35 compute-1 podman[50806]: 2026-01-21 23:21:35.493701594 +0000 UTC m=+1.808399263 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 21 23:21:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:35 compute-1 sudo[50792]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:36 compute-1 sshd-session[44922]: Connection closed by 192.168.122.30 port 55060
Jan 21 23:21:36 compute-1 sshd-session[44919]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:21:36 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Jan 21 23:21:36 compute-1 systemd[1]: session-11.scope: Consumed 1min 37.341s CPU time.
Jan 21 23:21:36 compute-1 systemd-logind[796]: Session 11 logged out. Waiting for processes to exit.
Jan 21 23:21:36 compute-1 systemd-logind[796]: Removed session 11.
Jan 21 23:21:46 compute-1 sshd-session[50954]: Accepted publickey for zuul from 192.168.122.30 port 56552 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:21:46 compute-1 systemd-logind[796]: New session 12 of user zuul.
Jan 21 23:21:46 compute-1 systemd[1]: Started Session 12 of User zuul.
Jan 21 23:21:46 compute-1 sshd-session[50954]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:21:48 compute-1 python3.9[51107]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:21:49 compute-1 sudo[51261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zisrbigpthqvlrcmbjtehfmjvbtdtbew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037708.8241441-69-130414778567768/AnsiballZ_getent.py'
Jan 21 23:21:49 compute-1 sudo[51261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:49 compute-1 python3.9[51263]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 21 23:21:49 compute-1 sudo[51261]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:50 compute-1 sudo[51414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhffhwstfraesijgwjmwfjvcwmzycdrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037709.8752732-93-164831721067683/AnsiballZ_group.py'
Jan 21 23:21:50 compute-1 sudo[51414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:50 compute-1 python3.9[51416]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:21:50 compute-1 groupadd[51417]: group added to /etc/group: name=openvswitch, GID=42476
Jan 21 23:21:50 compute-1 groupadd[51417]: group added to /etc/gshadow: name=openvswitch
Jan 21 23:21:50 compute-1 groupadd[51417]: new group: name=openvswitch, GID=42476
Jan 21 23:21:50 compute-1 sudo[51414]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:51 compute-1 sudo[51572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cojodxfylsjkapxecdqqenoneqzqqyra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037710.8305604-117-38408164298203/AnsiballZ_user.py'
Jan 21 23:21:51 compute-1 sudo[51572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:52 compute-1 python3.9[51574]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 23:21:52 compute-1 useradd[51576]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 23:21:52 compute-1 useradd[51576]: add 'openvswitch' to group 'hugetlbfs'
Jan 21 23:21:52 compute-1 useradd[51576]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 21 23:21:52 compute-1 sudo[51572]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:53 compute-1 sudo[51732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjccohktkvmbetnbzuxwgdjurvedkhkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037712.6675084-147-137292377672231/AnsiballZ_setup.py'
Jan 21 23:21:53 compute-1 sudo[51732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:53 compute-1 python3.9[51734]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:21:53 compute-1 sudo[51732]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:53 compute-1 sudo[51816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdjzylegbfbzlylfkfbhohcgmmkkdfry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037712.6675084-147-137292377672231/AnsiballZ_dnf.py'
Jan 21 23:21:54 compute-1 sudo[51816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:55 compute-1 python3.9[51818]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:21:57 compute-1 sudo[51816]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:57 compute-1 sudo[51978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuynqyyfjcnxbyhmvoxbcydyujjqleiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037717.4133258-189-208519501159707/AnsiballZ_dnf.py'
Jan 21 23:21:57 compute-1 sudo[51978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:58 compute-1 python3.9[51980]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:22:10 compute-1 kernel: SELinux:  Converting 2737 SID table entries...
Jan 21 23:22:10 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:22:10 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:22:10 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:22:10 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:22:10 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:22:10 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:22:10 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:22:10 compute-1 groupadd[52003]: group added to /etc/group: name=unbound, GID=994
Jan 21 23:22:10 compute-1 groupadd[52003]: group added to /etc/gshadow: name=unbound
Jan 21 23:22:10 compute-1 groupadd[52003]: new group: name=unbound, GID=994
Jan 21 23:22:10 compute-1 useradd[52010]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 21 23:22:10 compute-1 dbus-broker-launch[770]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 21 23:22:10 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 21 23:22:12 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:22:12 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:22:12 compute-1 systemd[1]: Reloading.
Jan 21 23:22:12 compute-1 systemd-rc-local-generator[52503]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:22:12 compute-1 systemd-sysv-generator[52508]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:22:12 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:22:13 compute-1 sudo[51978]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:13 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:22:13 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:22:13 compute-1 systemd[1]: run-rd79e849354444af298d637402238daac.service: Deactivated successfully.
Jan 21 23:22:15 compute-1 sudo[53077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaeabnluhqfpxwxqlqzbayctfzokeldw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037734.793881-213-258474154363090/AnsiballZ_systemd.py'
Jan 21 23:22:15 compute-1 sudo[53077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:15 compute-1 python3.9[53079]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:22:15 compute-1 systemd[1]: Reloading.
Jan 21 23:22:15 compute-1 systemd-rc-local-generator[53109]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:22:15 compute-1 systemd-sysv-generator[53113]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:22:16 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Jan 21 23:22:16 compute-1 chown[53121]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 21 23:22:16 compute-1 ovs-ctl[53126]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 21 23:22:16 compute-1 ovs-ctl[53126]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 21 23:22:16 compute-1 ovs-ctl[53126]: Starting ovsdb-server [  OK  ]
Jan 21 23:22:16 compute-1 ovs-vsctl[53175]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 21 23:22:16 compute-1 ovs-vsctl[53191]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"74526b6d-b1ca-423f-9094-b845f8b97526\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 21 23:22:16 compute-1 ovs-ctl[53126]: Configuring Open vSwitch system IDs [  OK  ]
Jan 21 23:22:16 compute-1 ovs-ctl[53126]: Enabling remote OVSDB managers [  OK  ]
Jan 21 23:22:16 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Jan 21 23:22:16 compute-1 ovs-vsctl[53201]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 21 23:22:16 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 21 23:22:16 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 21 23:22:16 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 21 23:22:16 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Jan 21 23:22:16 compute-1 ovs-ctl[53245]: Inserting openvswitch module [  OK  ]
Jan 21 23:22:16 compute-1 ovs-ctl[53214]: Starting ovs-vswitchd [  OK  ]
Jan 21 23:22:16 compute-1 ovs-vsctl[53262]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 21 23:22:16 compute-1 ovs-ctl[53214]: Enabling remote OVSDB managers [  OK  ]
Jan 21 23:22:16 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 21 23:22:16 compute-1 systemd[1]: Starting Open vSwitch...
Jan 21 23:22:16 compute-1 systemd[1]: Finished Open vSwitch.
Jan 21 23:22:16 compute-1 sudo[53077]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:17 compute-1 python3.9[53414]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:22:18 compute-1 sudo[53564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giyskobevmpuwblpqzhnqjaczkutpzbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037738.2262332-267-239691588658486/AnsiballZ_sefcontext.py'
Jan 21 23:22:18 compute-1 sudo[53564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:19 compute-1 python3.9[53566]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 21 23:22:20 compute-1 kernel: SELinux:  Converting 2751 SID table entries...
Jan 21 23:22:20 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:22:20 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:22:20 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:22:20 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:22:20 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:22:20 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:22:20 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:22:20 compute-1 sudo[53564]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:21 compute-1 python3.9[53721]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:22:22 compute-1 sudo[53877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlemrwpnghdcrltyejqqmbfbbtwulrtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037742.146633-321-70395205891708/AnsiballZ_dnf.py'
Jan 21 23:22:22 compute-1 dbus-broker-launch[770]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 21 23:22:22 compute-1 sudo[53877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:22 compute-1 python3.9[53879]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:22:23 compute-1 sudo[53877]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:24 compute-1 sudo[54030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgsokcdsipmdusrtbdtrrxindrcrptmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037744.466682-345-273273215154365/AnsiballZ_command.py'
Jan 21 23:22:24 compute-1 sudo[54030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:25 compute-1 python3.9[54032]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:22:25 compute-1 sudo[54030]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:26 compute-1 sudo[54317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knayurqexzlfbjebiqtjxcmiltvetyje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037746.2114942-369-70272278419922/AnsiballZ_file.py'
Jan 21 23:22:26 compute-1 sudo[54317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:26 compute-1 python3.9[54319]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 21 23:22:26 compute-1 sudo[54317]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:28 compute-1 python3.9[54469]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:22:28 compute-1 sudo[54621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uetidqfmzbccxtlzqxkvyzuwlktenrlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037748.377984-417-100332520610251/AnsiballZ_dnf.py'
Jan 21 23:22:28 compute-1 sudo[54621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:28 compute-1 python3.9[54623]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:22:30 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:22:30 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:22:30 compute-1 systemd[1]: Reloading.
Jan 21 23:22:31 compute-1 systemd-rc-local-generator[54661]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:22:31 compute-1 systemd-sysv-generator[54666]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:22:31 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:22:31 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:22:31 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:22:31 compute-1 systemd[1]: run-re6e97a7925f84d90b508749650324223.service: Deactivated successfully.
Jan 21 23:22:31 compute-1 sudo[54621]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:32 compute-1 sudo[54938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrzmjlttpaqyhrumookmsvnytrwxlpvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037751.7088468-441-37714074115413/AnsiballZ_systemd.py'
Jan 21 23:22:32 compute-1 sudo[54938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:32 compute-1 python3.9[54940]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:22:32 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 21 23:22:32 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Jan 21 23:22:32 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Jan 21 23:22:32 compute-1 systemd[1]: Stopping Network Manager...
Jan 21 23:22:32 compute-1 NetworkManager[7194]: <info>  [1769037752.4224] caught SIGTERM, shutting down normally.
Jan 21 23:22:32 compute-1 NetworkManager[7194]: <info>  [1769037752.4245] dhcp4 (eth0): canceled DHCP transaction
Jan 21 23:22:32 compute-1 NetworkManager[7194]: <info>  [1769037752.4245] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 23:22:32 compute-1 NetworkManager[7194]: <info>  [1769037752.4245] dhcp4 (eth0): state changed no lease
Jan 21 23:22:32 compute-1 NetworkManager[7194]: <info>  [1769037752.4249] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 23:22:32 compute-1 NetworkManager[7194]: <info>  [1769037752.4322] exiting (success)
Jan 21 23:22:32 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 23:22:32 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 23:22:32 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 21 23:22:32 compute-1 systemd[1]: Stopped Network Manager.
Jan 21 23:22:32 compute-1 systemd[1]: NetworkManager.service: Consumed 14.079s CPU time, 4.1M memory peak, read 0B from disk, written 17.0K to disk.
Jan 21 23:22:32 compute-1 systemd[1]: Starting Network Manager...
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.4873] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:eb0f01be-82e2-4e7f-8f82-f8e2d1cf7324)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.4877] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.4932] manager[0x55e70cc43000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 23:22:32 compute-1 systemd[1]: Starting Hostname Service...
Jan 21 23:22:32 compute-1 systemd[1]: Started Hostname Service.
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5732] hostname: hostname: using hostnamed
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5733] hostname: static hostname changed from (none) to "compute-1"
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5737] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5741] manager[0x55e70cc43000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5741] manager[0x55e70cc43000]: rfkill: WWAN hardware radio set enabled
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5760] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5767] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5767] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5768] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5769] manager: Networking is enabled by state file
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5771] settings: Loaded settings plugin: keyfile (internal)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5775] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5795] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5802] dhcp: init: Using DHCP client 'internal'
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5803] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5807] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5811] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5816] device (lo): Activation: starting connection 'lo' (4662a9d4-1184-4934-9979-d04ebf8a1fd8)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5822] device (eth0): carrier: link connected
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5825] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5828] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5829] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5833] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5837] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5844] device (eth1): carrier: link connected
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5847] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5851] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (693d23c9-22df-5d5e-b59c-efc0730a6438) (indicated)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5851] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5854] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5859] device (eth1): Activation: starting connection 'ci-private-network' (693d23c9-22df-5d5e-b59c-efc0730a6438)
Jan 21 23:22:32 compute-1 systemd[1]: Started Network Manager.
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5864] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5871] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5872] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5874] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5876] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5879] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5881] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5882] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5885] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5890] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5893] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5912] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5925] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5931] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5933] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5936] device (lo): Activation: successful, device activated.
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5941] dhcp4 (eth0): state changed new lease, address=38.102.83.173
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.5946] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.6001] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.6004] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.6008] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.6010] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.6012] device (eth1): Activation: successful, device activated.
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.6017] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.6018] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.6021] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.6023] device (eth0): Activation: successful, device activated.
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.6026] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 23:22:32 compute-1 NetworkManager[54952]: <info>  [1769037752.6028] manager: startup complete
Jan 21 23:22:32 compute-1 systemd[1]: Starting Network Manager Wait Online...
Jan 21 23:22:32 compute-1 sudo[54938]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:32 compute-1 systemd[1]: Finished Network Manager Wait Online.
Jan 21 23:22:33 compute-1 sudo[55164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bklkkyjamdyisolzpvjppvzltwdyyeex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037752.8536317-465-78550628518143/AnsiballZ_dnf.py'
Jan 21 23:22:33 compute-1 sudo[55164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:33 compute-1 python3.9[55166]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:22:37 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:22:37 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:22:37 compute-1 systemd[1]: Reloading.
Jan 21 23:22:37 compute-1 systemd-rc-local-generator[55222]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:22:37 compute-1 systemd-sysv-generator[55227]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:22:38 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:22:39 compute-1 sudo[55164]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:39 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:22:39 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:22:39 compute-1 systemd[1]: run-r2c4fe0f129294acea13e817bba6affd9.service: Deactivated successfully.
Jan 21 23:22:40 compute-1 sudo[55627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxkhhnlpulxbrzyqejoubiwmccartmho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037760.347369-501-206622135355708/AnsiballZ_stat.py'
Jan 21 23:22:40 compute-1 sudo[55627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:40 compute-1 python3.9[55629]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:22:40 compute-1 sudo[55627]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:41 compute-1 sudo[55779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktcbeqkejeralmmjaudmfenvwxzralwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037761.1853206-528-135302132734781/AnsiballZ_ini_file.py'
Jan 21 23:22:41 compute-1 sudo[55779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:42 compute-1 python3.9[55781]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:42 compute-1 sudo[55779]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:42 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 23:22:42 compute-1 sudo[55933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blztlnymgcrccmvtfqqfczwhdzqucrjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037762.622579-558-48633804637970/AnsiballZ_ini_file.py'
Jan 21 23:22:42 compute-1 sudo[55933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:43 compute-1 python3.9[55935]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:43 compute-1 sudo[55933]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:43 compute-1 sudo[56085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vulxpyflhjxuitwbrpmtpcitojseiqwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037763.3722181-558-194254432340856/AnsiballZ_ini_file.py'
Jan 21 23:22:43 compute-1 sudo[56085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:43 compute-1 python3.9[56087]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:43 compute-1 sudo[56085]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:44 compute-1 sudo[56237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pafyxtqdbtnoqggcultngwwqxrpxhheu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037764.0957918-603-94849726707626/AnsiballZ_ini_file.py'
Jan 21 23:22:44 compute-1 sudo[56237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:44 compute-1 python3.9[56239]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:44 compute-1 sudo[56237]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:45 compute-1 sudo[56389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skefkjaclnarjxdddfeivsuuwsjyxdzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037765.1967363-603-18825685291890/AnsiballZ_ini_file.py'
Jan 21 23:22:45 compute-1 sudo[56389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:45 compute-1 python3.9[56391]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:45 compute-1 sudo[56389]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:46 compute-1 sudo[56541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whcdblbxgmfxiglozpwyqiibyomucamk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037765.9332533-648-127274343733321/AnsiballZ_stat.py'
Jan 21 23:22:46 compute-1 sudo[56541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:46 compute-1 python3.9[56543]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:22:46 compute-1 sudo[56541]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:46 compute-1 sudo[56664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycnljbcpyspbazyjzwpubtqljzoskxps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037765.9332533-648-127274343733321/AnsiballZ_copy.py'
Jan 21 23:22:46 compute-1 sudo[56664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:47 compute-1 python3.9[56666]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037765.9332533-648-127274343733321/.source _original_basename=.55_3tfqy follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:47 compute-1 sudo[56664]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:47 compute-1 sudo[56816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqtplzamdyigirelpipapansslquguey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037767.4175088-693-58713099060035/AnsiballZ_file.py'
Jan 21 23:22:47 compute-1 sudo[56816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:47 compute-1 python3.9[56818]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:47 compute-1 sudo[56816]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:48 compute-1 sudo[56968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghtyqeemvqwrchlqrbwdcbbrytrswvgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037768.239836-717-32174187498124/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 21 23:22:48 compute-1 sudo[56968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:48 compute-1 python3.9[56970]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 21 23:22:49 compute-1 sudo[56968]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:49 compute-1 sudo[57120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyhcqhynbcfgqcsiudocnvlbkaiknlhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037769.2680387-744-154963461140680/AnsiballZ_file.py'
Jan 21 23:22:49 compute-1 sudo[57120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:49 compute-1 python3.9[57122]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:49 compute-1 sudo[57120]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:50 compute-1 sudo[57272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjlkyuizgacfeokqmqmruhownuceowzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037770.6221573-774-123457936415267/AnsiballZ_stat.py'
Jan 21 23:22:50 compute-1 sudo[57272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:51 compute-1 sudo[57272]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:51 compute-1 sudo[57395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uedrxhtksencukatnyixyjjrigeejpri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037770.6221573-774-123457936415267/AnsiballZ_copy.py'
Jan 21 23:22:51 compute-1 sudo[57395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:51 compute-1 sudo[57395]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:52 compute-1 sudo[57547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnxzyoiiwzbmhmihifwplrkpjkqaoier ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037772.0001535-819-149664477487966/AnsiballZ_slurp.py'
Jan 21 23:22:52 compute-1 sudo[57547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:52 compute-1 python3.9[57549]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 21 23:22:52 compute-1 sudo[57547]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:54 compute-1 sudo[57722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqzgkpfivalslyvbljaktkublducznhr ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037773.0047643-846-155433371007245/async_wrapper.py j94741458078 300 /home/zuul/.ansible/tmp/ansible-tmp-1769037773.0047643-846-155433371007245/AnsiballZ_edpm_os_net_config.py _'
Jan 21 23:22:54 compute-1 sudo[57722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:54 compute-1 ansible-async_wrapper.py[57724]: Invoked with j94741458078 300 /home/zuul/.ansible/tmp/ansible-tmp-1769037773.0047643-846-155433371007245/AnsiballZ_edpm_os_net_config.py _
Jan 21 23:22:54 compute-1 ansible-async_wrapper.py[57727]: Starting module and watcher
Jan 21 23:22:54 compute-1 ansible-async_wrapper.py[57727]: Start watching 57728 (300)
Jan 21 23:22:54 compute-1 ansible-async_wrapper.py[57728]: Start module (57728)
Jan 21 23:22:54 compute-1 ansible-async_wrapper.py[57724]: Return async_wrapper task started.
Jan 21 23:22:54 compute-1 sudo[57722]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:55 compute-1 python3.9[57729]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 21 23:22:55 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 21 23:22:55 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 21 23:22:55 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 21 23:22:55 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 21 23:22:55 compute-1 kernel: cfg80211: failed to load regulatory.db
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8327] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8350] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8845] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8850] audit: op="connection-add" uuid="ce1ae6dc-ed2a-4b84-a071-10a184875925" name="br-ex-br" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8861] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8862] audit: op="connection-add" uuid="a40196d7-63c4-4009-b989-f22827dc233c" name="br-ex-port" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8873] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8875] audit: op="connection-add" uuid="2821181c-de4c-4d3e-879a-179e352ddb44" name="eth1-port" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8886] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8888] audit: op="connection-add" uuid="a4e558c9-7a2c-4ef9-b50a-731838927474" name="vlan20-port" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8899] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8901] audit: op="connection-add" uuid="fb82b70d-dae8-400f-869b-9d89327d96ae" name="vlan21-port" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8913] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8914] audit: op="connection-add" uuid="6b6a3acf-ef69-4251-939d-5936ff90f2c2" name="vlan22-port" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8936] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8964] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.8966] audit: op="connection-add" uuid="f6d44f1e-78da-497c-8bca-0077766f057d" name="br-ex-if" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9413] audit: op="connection-update" uuid="693d23c9-22df-5d5e-b59c-efc0730a6438" name="ci-private-network" args="ipv4.routing-rules,ipv4.addresses,ipv4.dns,ipv4.never-default,ipv4.method,ipv4.routes,ovs-external-ids.data,connection.master,connection.port-type,connection.controller,connection.slave-type,connection.timestamp,ovs-interface.type,ipv6.addresses,ipv6.dns,ipv6.routing-rules,ipv6.addr-gen-mode,ipv6.method,ipv6.routes" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9439] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9442] audit: op="connection-add" uuid="01dab2bc-3025-40a0-abf1-86db0e59d3f1" name="vlan20-if" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9464] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9468] audit: op="connection-add" uuid="cebd4b3e-bd3e-4c5d-aa35-0657d8114f3d" name="vlan21-if" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9490] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9494] audit: op="connection-add" uuid="891008e7-5583-47ea-9574-715c01f9d000" name="vlan22-if" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9510] audit: op="connection-delete" uuid="ef9564cf-3cba-317e-b605-2bb50bce1cb4" name="Wired connection 1" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9526] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <warn>  [1769037776.9530] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9541] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9549] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (ce1ae6dc-ed2a-4b84-a071-10a184875925)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9550] audit: op="connection-activate" uuid="ce1ae6dc-ed2a-4b84-a071-10a184875925" name="br-ex-br" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9555] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <warn>  [1769037776.9557] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9566] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9573] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (a40196d7-63c4-4009-b989-f22827dc233c)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9577] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <warn>  [1769037776.9579] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9588] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9597] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (2821181c-de4c-4d3e-879a-179e352ddb44)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9601] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <warn>  [1769037776.9603] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9613] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9621] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (a4e558c9-7a2c-4ef9-b50a-731838927474)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9624] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <warn>  [1769037776.9627] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9636] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9645] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (fb82b70d-dae8-400f-869b-9d89327d96ae)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9648] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <warn>  [1769037776.9651] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9660] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9667] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (6b6a3acf-ef69-4251-939d-5936ff90f2c2)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9669] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9675] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9678] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9690] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <warn>  [1769037776.9692] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9699] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9709] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (f6d44f1e-78da-497c-8bca-0077766f057d)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9711] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9719] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9722] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9724] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9727] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9744] device (eth1): disconnecting for new activation request.
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9745] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9750] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9753] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9754] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9758] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <warn>  [1769037776.9759] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9763] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9769] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (01dab2bc-3025-40a0-abf1-86db0e59d3f1)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9770] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9774] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9776] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9777] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9781] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <warn>  [1769037776.9782] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9786] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9793] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (cebd4b3e-bd3e-4c5d-aa35-0657d8114f3d)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9793] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9797] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9800] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9802] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9806] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <warn>  [1769037776.9807] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9811] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9816] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (891008e7-5583-47ea-9574-715c01f9d000)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9817] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9822] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9824] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9826] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9828] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9844] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.method" pid=57730 uid=0 result="success"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9847] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9855] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9857] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9865] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9871] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9875] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9880] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9882] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 kernel: ovs-system: entered promiscuous mode
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9891] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9898] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 systemd-udevd[57735]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9910] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9912] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9917] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9921] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9924] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9926] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 kernel: Timeout policy base is empty
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9931] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:56 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9935] dhcp4 (eth0): canceled DHCP transaction
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9936] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9936] dhcp4 (eth0): state changed no lease
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9938] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9951] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9955] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57730 uid=0 result="fail" reason="Device is not activated"
Jan 21 23:22:56 compute-1 NetworkManager[54952]: <info>  [1769037776.9961] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.0063] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.0066] dhcp4 (eth0): state changed new lease, address=38.102.83.173
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.0077] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 21 23:22:57 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 23:22:57 compute-1 kernel: br-ex: entered promiscuous mode
Jan 21 23:22:57 compute-1 kernel: vlan22: entered promiscuous mode
Jan 21 23:22:57 compute-1 systemd-udevd[57736]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:22:57 compute-1 systemd-udevd[57734]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:22:57 compute-1 kernel: vlan20: entered promiscuous mode
Jan 21 23:22:57 compute-1 kernel: vlan21: entered promiscuous mode
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1007] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1021] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1033] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1050] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1066] device (eth1): disconnecting for new activation request.
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1068] audit: op="connection-activate" uuid="693d23c9-22df-5d5e-b59c-efc0730a6438" name="ci-private-network" pid=57730 uid=0 result="success"
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1069] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1206] device (eth1): Activation: starting connection 'ci-private-network' (693d23c9-22df-5d5e-b59c-efc0730a6438)
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1212] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1214] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1215] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1216] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1219] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1221] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1241] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1244] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1251] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1256] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1263] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1267] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1271] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1274] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1277] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1281] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1286] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1292] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1296] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1300] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1304] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57730 uid=0 result="success"
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1320] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1343] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 21 23:22:57 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1351] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1357] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1363] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1374] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1386] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1682] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1684] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1685] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1686] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1687] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1699] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1704] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1709] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1712] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1717] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1721] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1726] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1730] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1735] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-1 NetworkManager[54952]: <info>  [1769037777.1739] device (eth1): Activation: successful, device activated.
Jan 21 23:22:58 compute-1 NetworkManager[54952]: <info>  [1769037778.3110] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57730 uid=0 result="success"
Jan 21 23:22:58 compute-1 sudo[58060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koetlrkwfbxvxijbkbfkkiarmvfpshyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037777.9954982-846-91736941002616/AnsiballZ_async_status.py'
Jan 21 23:22:58 compute-1 sudo[58060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:58 compute-1 NetworkManager[54952]: <info>  [1769037778.4908] checkpoint[0x55e70cc18950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 21 23:22:58 compute-1 NetworkManager[54952]: <info>  [1769037778.4910] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57730 uid=0 result="success"
Jan 21 23:22:58 compute-1 python3.9[58062]: ansible-ansible.legacy.async_status Invoked with jid=j94741458078.57724 mode=status _async_dir=/root/.ansible_async
Jan 21 23:22:58 compute-1 sudo[58060]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:58 compute-1 NetworkManager[54952]: <info>  [1769037778.7601] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57730 uid=0 result="success"
Jan 21 23:22:58 compute-1 NetworkManager[54952]: <info>  [1769037778.7612] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57730 uid=0 result="success"
Jan 21 23:22:58 compute-1 NetworkManager[54952]: <info>  [1769037778.9457] audit: op="networking-control" arg="global-dns-configuration" pid=57730 uid=0 result="success"
Jan 21 23:22:58 compute-1 NetworkManager[54952]: <info>  [1769037778.9493] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 21 23:22:58 compute-1 NetworkManager[54952]: <info>  [1769037778.9560] audit: op="networking-control" arg="global-dns-configuration" pid=57730 uid=0 result="success"
Jan 21 23:22:58 compute-1 NetworkManager[54952]: <info>  [1769037778.9588] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57730 uid=0 result="success"
Jan 21 23:22:59 compute-1 NetworkManager[54952]: <info>  [1769037779.0780] checkpoint[0x55e70cc18a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 21 23:22:59 compute-1 NetworkManager[54952]: <info>  [1769037779.0783] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57730 uid=0 result="success"
Jan 21 23:22:59 compute-1 ansible-async_wrapper.py[57728]: Module complete (57728)
Jan 21 23:22:59 compute-1 ansible-async_wrapper.py[57727]: Done in kid B.
Jan 21 23:23:01 compute-1 sudo[58166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lajjstrtceyweufznxkmnyrlnjdsmlab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037777.9954982-846-91736941002616/AnsiballZ_async_status.py'
Jan 21 23:23:01 compute-1 sudo[58166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:02 compute-1 python3.9[58168]: ansible-ansible.legacy.async_status Invoked with jid=j94741458078.57724 mode=status _async_dir=/root/.ansible_async
Jan 21 23:23:02 compute-1 sudo[58166]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:02 compute-1 sudo[58266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrhuccwyfhkqveonqdelieejvqiozjcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037777.9954982-846-91736941002616/AnsiballZ_async_status.py'
Jan 21 23:23:02 compute-1 sudo[58266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:02 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 23:23:02 compute-1 python3.9[58268]: ansible-ansible.legacy.async_status Invoked with jid=j94741458078.57724 mode=cleanup _async_dir=/root/.ansible_async
Jan 21 23:23:02 compute-1 sudo[58266]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:03 compute-1 sudo[58420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejlgzrgppheirdskcwfsqgizemqbbunm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037783.0446417-927-64281832610437/AnsiballZ_stat.py'
Jan 21 23:23:03 compute-1 sudo[58420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:03 compute-1 python3.9[58422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:03 compute-1 sudo[58420]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:03 compute-1 sudo[58543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cozuqvdspuriptxwxknkcrkiaxpkjedj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037783.0446417-927-64281832610437/AnsiballZ_copy.py'
Jan 21 23:23:03 compute-1 sudo[58543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:04 compute-1 python3.9[58545]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037783.0446417-927-64281832610437/.source.returncode _original_basename=.d2mzw_vi follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:04 compute-1 sudo[58543]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:04 compute-1 sudo[58695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvzxtwvvncirtpenxkkenngtiworzzwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037784.420474-975-76753449980374/AnsiballZ_stat.py'
Jan 21 23:23:04 compute-1 sudo[58695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:04 compute-1 python3.9[58697]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:04 compute-1 sudo[58695]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:05 compute-1 sudo[58819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrrgkuqwyynvwzgtfsdolocghrrenoyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037784.420474-975-76753449980374/AnsiballZ_copy.py'
Jan 21 23:23:05 compute-1 sudo[58819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:05 compute-1 python3.9[58821]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037784.420474-975-76753449980374/.source.cfg _original_basename=.fhljr_xy follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:05 compute-1 sudo[58819]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:06 compute-1 sudo[58971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwucqyieaxfpkxkpxubgzxmwyuyrgalz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037785.7522104-1020-221668275318919/AnsiballZ_systemd.py'
Jan 21 23:23:06 compute-1 sudo[58971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:06 compute-1 python3.9[58973]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:23:06 compute-1 systemd[1]: Reloading Network Manager...
Jan 21 23:23:06 compute-1 NetworkManager[54952]: <info>  [1769037786.4172] audit: op="reload" arg="0" pid=58977 uid=0 result="success"
Jan 21 23:23:06 compute-1 NetworkManager[54952]: <info>  [1769037786.4184] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 21 23:23:06 compute-1 systemd[1]: Reloaded Network Manager.
Jan 21 23:23:06 compute-1 sudo[58971]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:06 compute-1 sshd-session[50957]: Connection closed by 192.168.122.30 port 56552
Jan 21 23:23:06 compute-1 sshd-session[50954]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:23:06 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Jan 21 23:23:06 compute-1 systemd[1]: session-12.scope: Consumed 52.207s CPU time.
Jan 21 23:23:06 compute-1 systemd-logind[796]: Session 12 logged out. Waiting for processes to exit.
Jan 21 23:23:06 compute-1 systemd-logind[796]: Removed session 12.
Jan 21 23:23:13 compute-1 sshd-session[59008]: Accepted publickey for zuul from 192.168.122.30 port 58720 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:23:13 compute-1 systemd-logind[796]: New session 13 of user zuul.
Jan 21 23:23:13 compute-1 systemd[1]: Started Session 13 of User zuul.
Jan 21 23:23:13 compute-1 sshd-session[59008]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:23:14 compute-1 python3.9[59161]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:23:15 compute-1 python3.9[59316]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:23:16 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 23:23:17 compute-1 python3.9[59507]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:23:17 compute-1 sshd-session[59011]: Connection closed by 192.168.122.30 port 58720
Jan 21 23:23:17 compute-1 sshd-session[59008]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:23:17 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Jan 21 23:23:17 compute-1 systemd[1]: session-13.scope: Consumed 2.503s CPU time.
Jan 21 23:23:17 compute-1 systemd-logind[796]: Session 13 logged out. Waiting for processes to exit.
Jan 21 23:23:17 compute-1 systemd-logind[796]: Removed session 13.
Jan 21 23:23:23 compute-1 sshd-session[59535]: Accepted publickey for zuul from 192.168.122.30 port 53608 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:23:23 compute-1 systemd-logind[796]: New session 14 of user zuul.
Jan 21 23:23:23 compute-1 systemd[1]: Started Session 14 of User zuul.
Jan 21 23:23:23 compute-1 sshd-session[59535]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:23:24 compute-1 python3.9[59688]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:23:25 compute-1 python3.9[59843]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:23:26 compute-1 sudo[59997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ralvkhcjfmnmxwlbcyqkiduccswbuvzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037806.2400582-81-262376158119475/AnsiballZ_setup.py'
Jan 21 23:23:26 compute-1 sudo[59997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:26 compute-1 python3.9[59999]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:23:27 compute-1 sudo[59997]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:27 compute-1 sudo[60081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqmpoeetszqlgfihfustijbzorhuyore ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037806.2400582-81-262376158119475/AnsiballZ_dnf.py'
Jan 21 23:23:27 compute-1 sudo[60081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:27 compute-1 python3.9[60083]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:23:29 compute-1 sudo[60081]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:29 compute-1 sudo[60235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tunzyydpguwrlrwwhfnarkxrxkqfnerh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037809.3591573-117-125299471166652/AnsiballZ_setup.py'
Jan 21 23:23:29 compute-1 sudo[60235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:30 compute-1 python3.9[60237]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:23:30 compute-1 sudo[60235]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:31 compute-1 sudo[60426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibcvmwotnjebuflpdslutmdqjcraaphh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037810.8551035-150-260858998420522/AnsiballZ_file.py'
Jan 21 23:23:31 compute-1 sudo[60426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:31 compute-1 python3.9[60428]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:31 compute-1 sudo[60426]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:32 compute-1 sudo[60578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rieerodsvaglbyzkixsczmsjgzjxielh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037811.753254-174-213931969892712/AnsiballZ_command.py'
Jan 21 23:23:32 compute-1 sudo[60578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:32 compute-1 python3.9[60580]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:23:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:23:32 compute-1 sudo[60578]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:33 compute-1 sudo[60740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtatyjotrtpghlcupwwwkfqnbfgchdya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037812.7361188-198-280539218402702/AnsiballZ_stat.py'
Jan 21 23:23:33 compute-1 sudo[60740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:33 compute-1 python3.9[60742]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:33 compute-1 sudo[60740]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:33 compute-1 sudo[60818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okoryesrmmthtorpgnplmvoccwbrxvzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037812.7361188-198-280539218402702/AnsiballZ_file.py'
Jan 21 23:23:33 compute-1 sudo[60818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:33 compute-1 python3.9[60820]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:33 compute-1 sudo[60818]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:34 compute-1 sudo[60970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntzujbmkgxeykbmbbqodgwjgtwudrgwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037814.1121032-234-125827490659209/AnsiballZ_stat.py'
Jan 21 23:23:34 compute-1 sudo[60970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:34 compute-1 python3.9[60972]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:34 compute-1 sudo[60970]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:34 compute-1 sudo[61048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvgknucfkvpqktflcctjatedhfsfpicx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037814.1121032-234-125827490659209/AnsiballZ_file.py'
Jan 21 23:23:34 compute-1 sudo[61048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:35 compute-1 python3.9[61050]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:23:35 compute-1 sudo[61048]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:36 compute-1 sudo[61200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xobikrgcuqmxtxscewqsteksoljapfdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037815.575738-273-133027276205379/AnsiballZ_ini_file.py'
Jan 21 23:23:36 compute-1 sudo[61200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:36 compute-1 python3.9[61202]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:23:36 compute-1 sudo[61200]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:36 compute-1 sudo[61352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hujyabwajlrlwrgmjcawbortwdtwgzsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037816.436074-273-26600693211163/AnsiballZ_ini_file.py'
Jan 21 23:23:36 compute-1 sudo[61352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:37 compute-1 python3.9[61354]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:23:37 compute-1 sudo[61352]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:37 compute-1 sudo[61504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptddjsgivuciwqaopsudvjbkzaxbjarz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037817.1462824-273-67887687565931/AnsiballZ_ini_file.py'
Jan 21 23:23:37 compute-1 sudo[61504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:37 compute-1 python3.9[61506]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:23:37 compute-1 sudo[61504]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:38 compute-1 sudo[61656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jizqkchvuibyfuonzzksitodjtkdudtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037817.8542805-273-107235271432182/AnsiballZ_ini_file.py'
Jan 21 23:23:38 compute-1 sudo[61656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:38 compute-1 python3.9[61658]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:23:38 compute-1 sudo[61656]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:39 compute-1 sudo[61808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilxghsdkoraweakexitzotpyrlxldwvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037818.8073976-366-138837260692872/AnsiballZ_dnf.py'
Jan 21 23:23:39 compute-1 sudo[61808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:39 compute-1 python3.9[61810]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:23:40 compute-1 sudo[61808]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:41 compute-1 sudo[61961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmwkmvauhaswknihlcmphinhxwoeoiyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037821.4834938-399-11135976216214/AnsiballZ_setup.py'
Jan 21 23:23:41 compute-1 sudo[61961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:42 compute-1 python3.9[61963]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:23:42 compute-1 sudo[61961]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:42 compute-1 sudo[62115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwnyfotywfnhphkxvzdyaopmtzxgadjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037822.411885-423-76732396251231/AnsiballZ_stat.py'
Jan 21 23:23:42 compute-1 sudo[62115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:42 compute-1 python3.9[62117]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:23:42 compute-1 sudo[62115]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:43 compute-1 sudo[62267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvzmxnpnlvthfbwmgrokmzhftkekipas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037823.1905258-450-159412074815514/AnsiballZ_stat.py'
Jan 21 23:23:43 compute-1 sudo[62267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:43 compute-1 python3.9[62269]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:23:43 compute-1 sudo[62267]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:44 compute-1 sudo[62419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfehfwzxsunzpdgezedgyvnlmbtqhktb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037824.0617962-480-76087957616983/AnsiballZ_command.py'
Jan 21 23:23:44 compute-1 sudo[62419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:44 compute-1 python3.9[62421]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:23:44 compute-1 sudo[62419]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:45 compute-1 sudo[62572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnvgahlfjqszxhopnjkdwtulfeonpmkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037824.965206-510-126980031873345/AnsiballZ_service_facts.py'
Jan 21 23:23:45 compute-1 sudo[62572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:45 compute-1 python3.9[62574]: ansible-service_facts Invoked
Jan 21 23:23:45 compute-1 network[62591]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:23:45 compute-1 network[62592]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:23:45 compute-1 network[62593]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:23:48 compute-1 sudo[62572]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:50 compute-1 sudo[62876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqdczmonaoauxadfmprqzotsbgzkwtkp ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769037830.5731206-556-25829504285834/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769037830.5731206-556-25829504285834/args'
Jan 21 23:23:50 compute-1 sudo[62876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:50 compute-1 sudo[62876]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:51 compute-1 sudo[63043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqcwegpztujvoxiwuffrlqjsbqjpwrlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037831.4157991-588-220134415046533/AnsiballZ_dnf.py'
Jan 21 23:23:51 compute-1 sudo[63043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:52 compute-1 python3.9[63045]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:23:53 compute-1 sudo[63043]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:55 compute-1 sudo[63196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcxfoqrtkvsygziegjovffshfttwipew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037834.2370796-627-13449358607838/AnsiballZ_package_facts.py'
Jan 21 23:23:55 compute-1 sudo[63196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:55 compute-1 python3.9[63198]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 21 23:23:55 compute-1 sudo[63196]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:57 compute-1 sudo[63348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgpeuubnupugfxowhtshbbgnuqtglxeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037836.644578-658-93685946290497/AnsiballZ_stat.py'
Jan 21 23:23:57 compute-1 sudo[63348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:57 compute-1 python3.9[63350]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:57 compute-1 sudo[63348]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:58 compute-1 sudo[63473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odbsiwzzhhvcjvbsvfhilxhbpzskulhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037836.644578-658-93685946290497/AnsiballZ_copy.py'
Jan 21 23:23:58 compute-1 sudo[63473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:58 compute-1 python3.9[63475]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037836.644578-658-93685946290497/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:58 compute-1 sudo[63473]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:58 compute-1 sudo[63627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvdrzxewjbhezhseyvsmbdjwboxogdcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037838.5848262-703-51947746765436/AnsiballZ_stat.py'
Jan 21 23:23:58 compute-1 sudo[63627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:59 compute-1 python3.9[63629]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:59 compute-1 sudo[63627]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:59 compute-1 sudo[63752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uemtrvdjewgetwmwyvgrcrtsodhvtlum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037838.5848262-703-51947746765436/AnsiballZ_copy.py'
Jan 21 23:23:59 compute-1 sudo[63752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:59 compute-1 python3.9[63754]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037838.5848262-703-51947746765436/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:59 compute-1 sudo[63752]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:01 compute-1 sudo[63906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwxhlelsokalxexslcqvzudpoplvlkcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037841.0977862-766-72685100401675/AnsiballZ_lineinfile.py'
Jan 21 23:24:01 compute-1 sudo[63906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:01 compute-1 python3.9[63908]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:01 compute-1 sudo[63906]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:03 compute-1 sudo[64060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdkhsjtxjxtgmpfavfcxpjgpfimwlbjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037842.9380674-810-43036795138218/AnsiballZ_setup.py'
Jan 21 23:24:03 compute-1 sudo[64060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:03 compute-1 python3.9[64062]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:24:03 compute-1 sudo[64060]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:05 compute-1 sudo[64144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpppolthxpfjzcilazhpidodybmqfszl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037842.9380674-810-43036795138218/AnsiballZ_systemd.py'
Jan 21 23:24:05 compute-1 sudo[64144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:05 compute-1 python3.9[64146]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:06 compute-1 sudo[64144]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:08 compute-1 sudo[64298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fspygugyilkkocoqkdylfjrvcuaplloi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037847.7561865-859-59148434928119/AnsiballZ_setup.py'
Jan 21 23:24:08 compute-1 sudo[64298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:08 compute-1 python3.9[64300]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:24:08 compute-1 sudo[64298]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:09 compute-1 sudo[64382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alyatumeepmiokgcjbaqkmkoubbqfuze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037847.7561865-859-59148434928119/AnsiballZ_systemd.py'
Jan 21 23:24:09 compute-1 sudo[64382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:09 compute-1 python3.9[64384]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:24:09 compute-1 chronyd[784]: chronyd exiting
Jan 21 23:24:09 compute-1 systemd[1]: Stopping NTP client/server...
Jan 21 23:24:09 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Jan 21 23:24:09 compute-1 systemd[1]: Stopped NTP client/server.
Jan 21 23:24:09 compute-1 systemd[1]: Starting NTP client/server...
Jan 21 23:24:09 compute-1 chronyd[64393]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 21 23:24:09 compute-1 chronyd[64393]: Frequency -31.398 +/- 0.350 ppm read from /var/lib/chrony/drift
Jan 21 23:24:09 compute-1 chronyd[64393]: Loaded seccomp filter (level 2)
Jan 21 23:24:09 compute-1 systemd[1]: Started NTP client/server.
Jan 21 23:24:09 compute-1 sudo[64382]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:10 compute-1 sshd-session[59538]: Connection closed by 192.168.122.30 port 53608
Jan 21 23:24:10 compute-1 sshd-session[59535]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:24:10 compute-1 systemd-logind[796]: Session 14 logged out. Waiting for processes to exit.
Jan 21 23:24:10 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Jan 21 23:24:10 compute-1 systemd[1]: session-14.scope: Consumed 27.294s CPU time.
Jan 21 23:24:10 compute-1 systemd-logind[796]: Removed session 14.
Jan 21 23:24:15 compute-1 sshd-session[64419]: Accepted publickey for zuul from 192.168.122.30 port 54136 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:24:15 compute-1 systemd-logind[796]: New session 15 of user zuul.
Jan 21 23:24:15 compute-1 systemd[1]: Started Session 15 of User zuul.
Jan 21 23:24:15 compute-1 sshd-session[64419]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:24:16 compute-1 python3.9[64572]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:24:17 compute-1 sudo[64726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwyforabsyzwifinykelwlzphntanlyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037857.031418-60-33953961328325/AnsiballZ_file.py'
Jan 21 23:24:17 compute-1 sudo[64726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:17 compute-1 python3.9[64728]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:18 compute-1 sudo[64726]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:18 compute-1 sudo[64901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoachgtbeixsijshumdmnefrqevmkavc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037858.199592-84-275428539474234/AnsiballZ_stat.py'
Jan 21 23:24:18 compute-1 sudo[64901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:18 compute-1 python3.9[64903]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:19 compute-1 sudo[64901]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:19 compute-1 sudo[64979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgjyovhqwgkzrsipxvdzsdtnwaxujpbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037858.199592-84-275428539474234/AnsiballZ_file.py'
Jan 21 23:24:19 compute-1 sudo[64979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:19 compute-1 python3.9[64981]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.9xbysfat recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:19 compute-1 sudo[64979]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:20 compute-1 sudo[65131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqfewhfwusxmyxnezsmpemvexqggxgcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037860.041056-144-227496537690719/AnsiballZ_stat.py'
Jan 21 23:24:20 compute-1 sudo[65131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:20 compute-1 python3.9[65133]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:20 compute-1 sudo[65131]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:21 compute-1 sudo[65254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-takvzppqxaaknkksxlzkybmanvvanxoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037860.041056-144-227496537690719/AnsiballZ_copy.py'
Jan 21 23:24:21 compute-1 sudo[65254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:21 compute-1 python3.9[65256]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037860.041056-144-227496537690719/.source _original_basename=.tprrdhnk follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:21 compute-1 sudo[65254]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:21 compute-1 sudo[65406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdajcqkipvildfnliyudbbdtbilhctks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037861.5114794-192-210896973780/AnsiballZ_file.py'
Jan 21 23:24:21 compute-1 sudo[65406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:22 compute-1 python3.9[65408]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:24:22 compute-1 sudo[65406]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:22 compute-1 sudo[65558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzzinwndxindpjsihfcajxtgujlrfova ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037862.2977586-216-80606374718282/AnsiballZ_stat.py'
Jan 21 23:24:22 compute-1 sudo[65558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:22 compute-1 python3.9[65560]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:22 compute-1 sudo[65558]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:23 compute-1 sudo[65681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgryhqoikpbqgivdwgsqxfhyszqeiovv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037862.2977586-216-80606374718282/AnsiballZ_copy.py'
Jan 21 23:24:23 compute-1 sudo[65681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:23 compute-1 python3.9[65683]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037862.2977586-216-80606374718282/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:24:23 compute-1 sudo[65681]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:24 compute-1 sudo[65833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weypxunyitwddwkcajkpzaodqvzlxzly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037863.6832862-216-261084576060483/AnsiballZ_stat.py'
Jan 21 23:24:24 compute-1 sudo[65833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:24 compute-1 python3.9[65835]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:24 compute-1 sudo[65833]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:24 compute-1 sudo[65956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpgxszauciwxtcetqjadhjrrpvziwgbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037863.6832862-216-261084576060483/AnsiballZ_copy.py'
Jan 21 23:24:24 compute-1 sudo[65956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:24 compute-1 python3.9[65958]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037863.6832862-216-261084576060483/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:24:24 compute-1 sudo[65956]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:25 compute-1 sudo[66108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftrwaizqafpjjpupiztyrjiksuxbhfww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037865.058474-303-121402136663700/AnsiballZ_file.py'
Jan 21 23:24:25 compute-1 sudo[66108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:25 compute-1 python3.9[66110]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:25 compute-1 sudo[66108]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:26 compute-1 sudo[66260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyonbqbobbvnmfdmdhdlcctyfdehcpvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037865.8821447-327-173659729579588/AnsiballZ_stat.py'
Jan 21 23:24:26 compute-1 sudo[66260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:26 compute-1 python3.9[66262]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:26 compute-1 sudo[66260]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:26 compute-1 sudo[66383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nacmjnrgznrmdeebeszfkqhwbtouxegs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037865.8821447-327-173659729579588/AnsiballZ_copy.py'
Jan 21 23:24:26 compute-1 sudo[66383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:27 compute-1 python3.9[66385]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037865.8821447-327-173659729579588/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:27 compute-1 sudo[66383]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:27 compute-1 sudo[66535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlyxfiepptrdesamfvshsrtwmnamupnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037867.3030257-372-151335983141348/AnsiballZ_stat.py'
Jan 21 23:24:27 compute-1 sudo[66535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:27 compute-1 python3.9[66537]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:27 compute-1 sudo[66535]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:28 compute-1 sudo[66658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfsplwcqrhcytaantrgslqxwhttrrlms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037867.3030257-372-151335983141348/AnsiballZ_copy.py'
Jan 21 23:24:28 compute-1 sudo[66658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:28 compute-1 python3.9[66660]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037867.3030257-372-151335983141348/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:28 compute-1 sudo[66658]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:29 compute-1 sudo[66810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vigbnlmyrdwaanfreufmgtetqwluwswy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037868.6555834-417-190855714582013/AnsiballZ_systemd.py'
Jan 21 23:24:29 compute-1 sudo[66810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:29 compute-1 python3.9[66812]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:29 compute-1 systemd[1]: Reloading.
Jan 21 23:24:29 compute-1 systemd-rc-local-generator[66834]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:29 compute-1 systemd-sysv-generator[66838]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:30 compute-1 systemd[1]: Reloading.
Jan 21 23:24:30 compute-1 systemd-sysv-generator[66883]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:30 compute-1 systemd-rc-local-generator[66879]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:30 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Jan 21 23:24:30 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Jan 21 23:24:30 compute-1 sudo[66810]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:31 compute-1 sudo[67039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqoomhipsjdbyqsutzlzmbyfnwuframk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037870.7237024-441-13440544602212/AnsiballZ_stat.py'
Jan 21 23:24:31 compute-1 sudo[67039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:31 compute-1 python3.9[67041]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:31 compute-1 sudo[67039]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:31 compute-1 sudo[67162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aazjpkhzwoaowdzelmrxhlkupftohmfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037870.7237024-441-13440544602212/AnsiballZ_copy.py'
Jan 21 23:24:31 compute-1 sudo[67162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:31 compute-1 python3.9[67164]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037870.7237024-441-13440544602212/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:31 compute-1 sudo[67162]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:32 compute-1 sudo[67314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbcwxzljirckjcsyrrlffnautytczoye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037872.1851377-486-111129331028302/AnsiballZ_stat.py'
Jan 21 23:24:32 compute-1 sudo[67314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:32 compute-1 python3.9[67316]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:32 compute-1 sudo[67314]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:33 compute-1 sudo[67437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjbxpyxirrqzdhqjklwvesnpoaidmnfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037872.1851377-486-111129331028302/AnsiballZ_copy.py'
Jan 21 23:24:33 compute-1 sudo[67437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:33 compute-1 python3.9[67439]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037872.1851377-486-111129331028302/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:33 compute-1 sudo[67437]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:33 compute-1 sudo[67589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svositioxmwqhkkvdhtsliowyyuakawj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037873.5530756-531-272337147859465/AnsiballZ_systemd.py'
Jan 21 23:24:33 compute-1 sudo[67589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:34 compute-1 python3.9[67591]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:34 compute-1 systemd[1]: Reloading.
Jan 21 23:24:34 compute-1 systemd-rc-local-generator[67621]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:34 compute-1 systemd-sysv-generator[67624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:34 compute-1 systemd[1]: Reloading.
Jan 21 23:24:34 compute-1 systemd-rc-local-generator[67651]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:34 compute-1 systemd-sysv-generator[67659]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:34 compute-1 systemd[1]: Starting Create netns directory...
Jan 21 23:24:34 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 23:24:34 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 23:24:34 compute-1 systemd[1]: Finished Create netns directory.
Jan 21 23:24:34 compute-1 sudo[67589]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:35 compute-1 python3.9[67817]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:24:35 compute-1 network[67834]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:24:35 compute-1 network[67835]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:24:35 compute-1 network[67836]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:24:41 compute-1 sudo[68096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvlwbarnvgkpthyyxbcrxzgpxttewiht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037880.8704116-579-127837224490764/AnsiballZ_systemd.py'
Jan 21 23:24:41 compute-1 sudo[68096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:41 compute-1 python3.9[68098]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:41 compute-1 systemd[1]: Reloading.
Jan 21 23:24:41 compute-1 systemd-sysv-generator[68130]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:41 compute-1 systemd-rc-local-generator[68124]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:41 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 21 23:24:41 compute-1 iptables.init[68138]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 21 23:24:42 compute-1 iptables.init[68138]: iptables: Flushing firewall rules: [  OK  ]
Jan 21 23:24:42 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Jan 21 23:24:42 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 21 23:24:42 compute-1 sudo[68096]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:42 compute-1 sudo[68332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nutekrueqpemjyhadesgrtxrcshidzys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037882.2310922-579-15081739739465/AnsiballZ_systemd.py'
Jan 21 23:24:42 compute-1 sudo[68332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:42 compute-1 python3.9[68334]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:42 compute-1 sudo[68332]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:43 compute-1 sudo[68486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukynoqsbpnwaxxpafpnmbblyhimhlujb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037883.3993483-627-166423397174751/AnsiballZ_systemd.py'
Jan 21 23:24:43 compute-1 sudo[68486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:44 compute-1 python3.9[68488]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:44 compute-1 systemd[1]: Reloading.
Jan 21 23:24:44 compute-1 systemd-rc-local-generator[68518]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:44 compute-1 systemd-sysv-generator[68521]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:44 compute-1 systemd[1]: Starting Netfilter Tables...
Jan 21 23:24:44 compute-1 systemd[1]: Finished Netfilter Tables.
Jan 21 23:24:44 compute-1 sudo[68486]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:45 compute-1 sudo[68678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwqqpffdyobkbglgcydbgcizhswpynwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037884.9416468-651-9121863005537/AnsiballZ_command.py'
Jan 21 23:24:45 compute-1 sudo[68678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:45 compute-1 python3.9[68680]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:24:45 compute-1 sudo[68678]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:46 compute-1 sudo[68831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzrtwsulkiurwurqbfjflliqjkineocn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037886.323187-693-278129559451644/AnsiballZ_stat.py'
Jan 21 23:24:46 compute-1 sudo[68831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:46 compute-1 python3.9[68833]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:46 compute-1 sudo[68831]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:47 compute-1 sudo[68956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkfinzyhwwcrzvnbmydicpkazutkooag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037886.323187-693-278129559451644/AnsiballZ_copy.py'
Jan 21 23:24:47 compute-1 sudo[68956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:47 compute-1 python3.9[68958]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037886.323187-693-278129559451644/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:47 compute-1 sudo[68956]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:48 compute-1 sudo[69109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gskasvmuwmqciuiowkgpmtrnfuqvkqnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037887.8609157-738-117377840208075/AnsiballZ_systemd.py'
Jan 21 23:24:48 compute-1 sudo[69109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:48 compute-1 python3.9[69111]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:24:48 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Jan 21 23:24:48 compute-1 sshd[1004]: Received SIGHUP; restarting.
Jan 21 23:24:48 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Jan 21 23:24:48 compute-1 sshd[1004]: Server listening on 0.0.0.0 port 22.
Jan 21 23:24:48 compute-1 sshd[1004]: Server listening on :: port 22.
Jan 21 23:24:48 compute-1 sudo[69109]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:49 compute-1 sudo[69265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdduyztsbrzbeguubefebgbtxggyqdmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037888.8744519-762-169690320774713/AnsiballZ_file.py'
Jan 21 23:24:49 compute-1 sudo[69265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:49 compute-1 python3.9[69267]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:49 compute-1 sudo[69265]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:49 compute-1 sudo[69417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tahmtmzpuwjxeszangodpvpdiqbgmrfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037889.6471877-786-48250785319436/AnsiballZ_stat.py'
Jan 21 23:24:49 compute-1 sudo[69417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:50 compute-1 python3.9[69419]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:50 compute-1 sudo[69417]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:50 compute-1 sudo[69540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsworksmwhyzwwfkjkcfqpwumqipixys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037889.6471877-786-48250785319436/AnsiballZ_copy.py'
Jan 21 23:24:50 compute-1 sudo[69540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:50 compute-1 python3.9[69542]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037889.6471877-786-48250785319436/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:50 compute-1 sudo[69540]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:52 compute-1 sudo[69692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdnjuqlodkexvadsfenhkezpauaxvwzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037891.678183-840-254613368782058/AnsiballZ_timezone.py'
Jan 21 23:24:52 compute-1 sudo[69692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:52 compute-1 python3.9[69694]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 21 23:24:52 compute-1 systemd[1]: Starting Time & Date Service...
Jan 21 23:24:52 compute-1 systemd[1]: Started Time & Date Service.
Jan 21 23:24:52 compute-1 sudo[69692]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:54 compute-1 sudo[69848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pniccvkamfjqhgwxpbynhladhpqgxgey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037893.8351824-867-98195868839146/AnsiballZ_file.py'
Jan 21 23:24:54 compute-1 sudo[69848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:54 compute-1 python3.9[69850]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:54 compute-1 sudo[69848]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:54 compute-1 sudo[70000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppzqxnbmsjqasantjamysctsrcruatem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037894.6455643-891-3419749348010/AnsiballZ_stat.py'
Jan 21 23:24:54 compute-1 sudo[70000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:55 compute-1 python3.9[70002]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:55 compute-1 sudo[70000]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:55 compute-1 sudo[70123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-botmjhrmblozlcfajzdyncszmgflsvyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037894.6455643-891-3419749348010/AnsiballZ_copy.py'
Jan 21 23:24:55 compute-1 sudo[70123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:55 compute-1 python3.9[70125]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037894.6455643-891-3419749348010/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:55 compute-1 sudo[70123]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:56 compute-1 sudo[70275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyybrkevxtimyvroatclzcjcioxlvsgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037895.951742-936-35857627953610/AnsiballZ_stat.py'
Jan 21 23:24:56 compute-1 sudo[70275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:56 compute-1 python3.9[70277]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:56 compute-1 sudo[70275]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:56 compute-1 sudo[70398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpbgdjbcztbxhtnnvlhfwvkyzfiklcbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037895.951742-936-35857627953610/AnsiballZ_copy.py'
Jan 21 23:24:56 compute-1 sudo[70398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:57 compute-1 python3.9[70400]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037895.951742-936-35857627953610/.source.yaml _original_basename=.f_wnloi9 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:57 compute-1 sudo[70398]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:57 compute-1 sudo[70550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjsimjvlaaxnjpfwevhwxogtlcdoggcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037897.2460496-981-4953679701422/AnsiballZ_stat.py'
Jan 21 23:24:57 compute-1 sudo[70550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:57 compute-1 python3.9[70552]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:57 compute-1 sudo[70550]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:58 compute-1 sudo[70673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umtgeulhwrwmklxavwhqevusvspfdqbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037897.2460496-981-4953679701422/AnsiballZ_copy.py'
Jan 21 23:24:58 compute-1 sudo[70673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:58 compute-1 python3.9[70675]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037897.2460496-981-4953679701422/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:58 compute-1 sudo[70673]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:58 compute-1 sudo[70825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txptmosqrohrllommqmkswabghimuubj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037898.686977-1026-65064406348581/AnsiballZ_command.py'
Jan 21 23:24:58 compute-1 sudo[70825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:59 compute-1 python3.9[70827]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:24:59 compute-1 sudo[70825]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:59 compute-1 sudo[70978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljajhygptsgbpxtcyrubfkkygdwfrzem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037899.414455-1050-144170354544072/AnsiballZ_command.py'
Jan 21 23:24:59 compute-1 sudo[70978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:59 compute-1 python3.9[70980]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:24:59 compute-1 sudo[70978]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:00 compute-1 sudo[71131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzajomxpiumqoaubzncsziwldtugnogb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769037900.220571-1074-112160794520151/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 23:25:00 compute-1 sudo[71131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:00 compute-1 python3[71133]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 23:25:00 compute-1 sudo[71131]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:01 compute-1 sudo[71283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leijdjjkxwoaqeieuwzhamaaomxtorlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037901.115977-1098-213431825952325/AnsiballZ_stat.py'
Jan 21 23:25:01 compute-1 sudo[71283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:01 compute-1 python3.9[71285]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:25:01 compute-1 sudo[71283]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:02 compute-1 sudo[71406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycwukeialcznzdqkqjwwruqtorzwybtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037901.115977-1098-213431825952325/AnsiballZ_copy.py'
Jan 21 23:25:02 compute-1 sudo[71406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:02 compute-1 python3.9[71408]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037901.115977-1098-213431825952325/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:02 compute-1 sudo[71406]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:02 compute-1 sudo[71558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvuxjcthcnzzamqewztcfidktfbielqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037902.5561733-1143-108891632845746/AnsiballZ_stat.py'
Jan 21 23:25:02 compute-1 sudo[71558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:03 compute-1 python3.9[71560]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:25:03 compute-1 sudo[71558]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:03 compute-1 sudo[71681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kehfocxjubbjzgvdkdtynrjftpavkqoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037902.5561733-1143-108891632845746/AnsiballZ_copy.py'
Jan 21 23:25:03 compute-1 sudo[71681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:03 compute-1 python3.9[71683]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037902.5561733-1143-108891632845746/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:03 compute-1 sudo[71681]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:04 compute-1 sudo[71833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzeshmugtdibafmawxqrwadhofexojjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037904.026784-1188-263557659664915/AnsiballZ_stat.py'
Jan 21 23:25:04 compute-1 sudo[71833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:04 compute-1 python3.9[71835]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:25:04 compute-1 sudo[71833]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:04 compute-1 sudo[71956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgoryzepioauqyvabxelxeeyfvcohnyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037904.026784-1188-263557659664915/AnsiballZ_copy.py'
Jan 21 23:25:04 compute-1 sudo[71956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:05 compute-1 python3.9[71958]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037904.026784-1188-263557659664915/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:05 compute-1 sudo[71956]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:05 compute-1 sudo[72108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjmcbjuwrrqkfwyzzhfqlijtaycqdfke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037905.4275582-1233-64702998054465/AnsiballZ_stat.py'
Jan 21 23:25:05 compute-1 sudo[72108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:05 compute-1 python3.9[72110]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:25:05 compute-1 sudo[72108]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:06 compute-1 sudo[72231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxydlhaerotixssimnbqdhjafmrdputx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037905.4275582-1233-64702998054465/AnsiballZ_copy.py'
Jan 21 23:25:06 compute-1 sudo[72231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:06 compute-1 python3.9[72233]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037905.4275582-1233-64702998054465/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:06 compute-1 sudo[72231]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:07 compute-1 sudo[72383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udmnbhklzbtzomzjloizabowcwmsasna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037906.852205-1278-184595532862393/AnsiballZ_stat.py'
Jan 21 23:25:07 compute-1 sudo[72383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:07 compute-1 python3.9[72385]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:25:07 compute-1 sudo[72383]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:07 compute-1 sudo[72506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmwjzflukwftqsjjujnywcnbbnmjwrut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037906.852205-1278-184595532862393/AnsiballZ_copy.py'
Jan 21 23:25:07 compute-1 sudo[72506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:08 compute-1 python3.9[72508]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037906.852205-1278-184595532862393/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:08 compute-1 sudo[72506]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:08 compute-1 sudo[72658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esxouuhdtxusoczxverecwolauqddmsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037908.6836448-1323-67606550804305/AnsiballZ_file.py'
Jan 21 23:25:08 compute-1 sudo[72658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:09 compute-1 python3.9[72660]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:09 compute-1 sudo[72658]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:09 compute-1 sudo[72810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iycxyzmntfwaglyrcvkvjyagrutxeken ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037909.3679657-1347-108860367034623/AnsiballZ_command.py'
Jan 21 23:25:09 compute-1 sudo[72810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:09 compute-1 python3.9[72812]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:25:09 compute-1 sudo[72810]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:10 compute-1 sudo[72969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vylgmnirllmuezqtpvkhlvnyhhaorzxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037910.2105021-1371-29362700422276/AnsiballZ_blockinfile.py'
Jan 21 23:25:10 compute-1 sudo[72969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:10 compute-1 python3.9[72971]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:11 compute-1 sudo[72969]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:11 compute-1 sudo[73122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajtswxvenyokmdeitzbkovcztmzrpatg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037911.279178-1398-147640092921242/AnsiballZ_file.py'
Jan 21 23:25:11 compute-1 sudo[73122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:11 compute-1 python3.9[73124]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:11 compute-1 sudo[73122]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:12 compute-1 sudo[73274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfqsfhghrrachghykhexuccamcjfkebz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037911.923963-1398-76699489210946/AnsiballZ_file.py'
Jan 21 23:25:12 compute-1 sudo[73274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:12 compute-1 python3.9[73276]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:12 compute-1 sudo[73274]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:13 compute-1 sudo[73426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdnwqlxqoyshvqjfqpvgemblkpytqbci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037912.665309-1443-138256495280443/AnsiballZ_mount.py'
Jan 21 23:25:13 compute-1 sudo[73426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:13 compute-1 python3.9[73428]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 21 23:25:13 compute-1 sudo[73426]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:13 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:25:13 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:25:13 compute-1 sudo[73580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bijaaotkkavsqhlgnfcmusoogvxzgfta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037913.528304-1443-61133118454884/AnsiballZ_mount.py'
Jan 21 23:25:13 compute-1 sudo[73580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:14 compute-1 python3.9[73582]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 21 23:25:14 compute-1 sudo[73580]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:14 compute-1 sshd-session[64422]: Connection closed by 192.168.122.30 port 54136
Jan 21 23:25:14 compute-1 sshd-session[64419]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:25:14 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Jan 21 23:25:14 compute-1 systemd[1]: session-15.scope: Consumed 40.386s CPU time.
Jan 21 23:25:14 compute-1 systemd-logind[796]: Session 15 logged out. Waiting for processes to exit.
Jan 21 23:25:14 compute-1 systemd-logind[796]: Removed session 15.
Jan 21 23:25:19 compute-1 sshd-session[73608]: Accepted publickey for zuul from 192.168.122.30 port 38588 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:25:19 compute-1 systemd-logind[796]: New session 16 of user zuul.
Jan 21 23:25:19 compute-1 systemd[1]: Started Session 16 of User zuul.
Jan 21 23:25:19 compute-1 sshd-session[73608]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:25:20 compute-1 sudo[73761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mekkcqexmpwrkeeiioxvtmgzrnmiqcvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037919.854436-24-54796924828688/AnsiballZ_tempfile.py'
Jan 21 23:25:20 compute-1 sudo[73761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:20 compute-1 python3.9[73763]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 21 23:25:20 compute-1 sudo[73761]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:21 compute-1 sudo[73913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irdwtpvufmmyudlrofxojbpjzfacqfmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037920.8013487-60-175330659703162/AnsiballZ_stat.py'
Jan 21 23:25:21 compute-1 sudo[73913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:21 compute-1 python3.9[73915]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:25:21 compute-1 sudo[73913]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:22 compute-1 sudo[74065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsxyndwyjhjdzuttcmrudqksehtxcphc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037921.7510543-90-260012339210182/AnsiballZ_setup.py'
Jan 21 23:25:22 compute-1 sudo[74065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:22 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 21 23:25:22 compute-1 python3.9[74067]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:25:22 compute-1 sudo[74065]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:23 compute-1 sudo[74219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfccggqnzwufeiplnekddchtptjmuhcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037922.9371407-115-109440967329422/AnsiballZ_blockinfile.py'
Jan 21 23:25:23 compute-1 sudo[74219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:23 compute-1 python3.9[74221]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC26D51NdJjdilPO47VkyAGWZEKpDvfQ2t45jAnFi+yGdqGpJZeqIqXy1qJWgR+nOjHPpu4xyjUsXsUdkcmQySQ9nELhPXxBtFGM3LlXjhhk0Yibj4G2gfuMuG/m8d0BtpBY66pWUvd424nrAKh1ObdZgR5iHS4dtFVcrUPD7nmkE3YxEDETOTc5d/Tcal9MQArb/rQQAs2Z7N4Lgv1bSzhuu70Ij9qUff8SJhc5ZBQkAGKfNPP8XajfuTOvnEOo9uZQjTKcFZnsiSBUnxId028vihtYF6+NFOByOltsmJc7OIafk5r6JZzbps6FcCaOaT2TRLuLemBS+qfS4N0tWS1iJ00Jo7h7y+UdgDBFB3/zCHD5KiHOYCHbXdqtz8HUwsz65bdDEsKyJdh6qyFv5DN7sbB1UK6Yr/urKbVGR2mYP7sNEIAcSC9HZ2vehi9Hm/TSD7IfvR2i96ckZOsnHD3QeMUyJXjqk3PG7rlUM7NxZYyHaTuZzYrR5DvOsUjS1s=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICZF6/j6naCAJ9xH6aYQVqdvwoz3vezm/JU2Pso9ogKK
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLqlgZ52debu0OKcJwhzrcTUf3XONAZS4TIW+jISXbbaqXAGs35QUNRljBr9O34MR2l+Jib4kJghkCYEmTbTxNo=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCmNgflEDQr9DxhZFToMSHP67cO7SUQpgVB7thv3JwDIojWojCRgQSVty7S1IJD5allDPdSEn7he/4X0ePPAI6phFNIWx+fwLuXpedyRVclMG0GASpOZ1kxLiQoMh+DOdnJArZ4llA4Lxdm7MyKCzA+Tna+2Z0+XrBTZjxzM4NbwGmUrESDcTXXu7f/vCq0QTRmjHLTbEvqFbJJzIetehEBC/yJb+35myPPBJ5IU8op6ixtbvwk2pzrRYr/NOUsf/ODWITXAvMjl6U1iE2Np2giBVqfz3zKkoH7gkMRHUmwxetTejWa1kIIZRiJUsQRetDm7v+bkaHGpwokxAC3n7pMwzdSO59inU/Lpr63ruukI64YeLK7FQiJ9557a+lcykXz0xgDF2aNHS9jyyhLQ0EHQGUgQToa462bLJwlCLFkxHpamrQKS67M+71TcFb//zx7kRmUT7NSxGHNAe155RcO0L1mkDHk1r2xOfk2iLNJtgYdCcVlwSRp4beSywJSECk=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILEP7bJKdLXxjWmdj4eC7ngVkPSbC0h6tc+Oej4hLtk7
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO/PF1lIcuvdp/VOQkUSqyeGOw1ILI4bhZtJ8xgcsTd+//1XE1ll313MwTKeS1n9loXGAVB4+f9lF2fbY4gEkQI=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrBpqo7KcpSwQCe8bhjM7Y7rs9JlI0f5WrmGDjYEfFo3lyWMOb2fxXrIDWqMZa5HGb83LwgwgDL1MhTWmi47d7h8Wcxzg4uoflPcGqILiXv9Z+T68l/C6NT2ur0r4Njrz27cayzBtDPz1wKz1bf72s+Jm7Ukl84pubtCYfPhpZ6HBojmNiq+gesC60N0wbEbIHDEgd+jVptW/UdWmhzO7xEBn3qbNPk6UpnYJSU+Z2wGx6hHckTSl5Wy/7RQ2HXE990+4qkeVl88lR/LqsGthwUQ8tlp8F33yw3IS9D0uurGkuqY4GyRjexrol0VPx9VlrPU0y4K+1pP59O4qo9+z/eylWJViS4R223v0JF2RIrH6aQvHTtV1un22qYnTCTCQrZ6KAKQipc0pawnz7DdXE3D2gwcQkZZmcYm9JboWqFn5/80rsuHUZmMBOHy5owN7IjIly0yAPxjAIZy5dMr1MkQP9o/FSnvyQzt11XeO/49/DI3FH0TkomkN21/QhSYE=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKG99Xw/DkEh2LuhUTQH1tq7VFfroV01ukYKDqY+UjHx
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBvYfIE3Tv6SOsn96jsNhozh4WS77CDAl4JYSfjVLVK/RVCTMxlZOAnhAHwDUgcw2k0t2eycyJ2wTJO6OCAqGM4=
                                             create=True mode=0644 path=/tmp/ansible.rp6gi0ot state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:23 compute-1 sudo[74219]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:24 compute-1 sudo[74371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkfljofufwzoogexgyvmejmuomrdwcvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037923.8214116-139-116567440012685/AnsiballZ_command.py'
Jan 21 23:25:24 compute-1 sudo[74371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:24 compute-1 python3.9[74373]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.rp6gi0ot' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:25:24 compute-1 sudo[74371]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:25 compute-1 sudo[74525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isqkewvjmdjgzccwdvqwbpdhjassvcwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037924.688101-163-194495669788141/AnsiballZ_file.py'
Jan 21 23:25:25 compute-1 sudo[74525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:25 compute-1 python3.9[74527]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.rp6gi0ot state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:25 compute-1 sudo[74525]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:26 compute-1 sshd-session[73611]: Connection closed by 192.168.122.30 port 38588
Jan 21 23:25:26 compute-1 sshd-session[73608]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:25:26 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Jan 21 23:25:26 compute-1 systemd[1]: session-16.scope: Consumed 3.441s CPU time.
Jan 21 23:25:26 compute-1 systemd-logind[796]: Session 16 logged out. Waiting for processes to exit.
Jan 21 23:25:26 compute-1 systemd-logind[796]: Removed session 16.
Jan 21 23:25:30 compute-1 sshd-session[74553]: Accepted publickey for zuul from 192.168.122.30 port 57784 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:25:30 compute-1 systemd-logind[796]: New session 17 of user zuul.
Jan 21 23:25:30 compute-1 systemd[1]: Started Session 17 of User zuul.
Jan 21 23:25:30 compute-1 sshd-session[74553]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:25:31 compute-1 python3.9[74706]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:25:32 compute-1 sudo[74860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjotjlklwvmrxyqvinqzidtbxzezwvwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037932.324519-57-30088634328726/AnsiballZ_systemd.py'
Jan 21 23:25:32 compute-1 sudo[74860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:33 compute-1 python3.9[74862]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 21 23:25:33 compute-1 sudo[74860]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:33 compute-1 sudo[75014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrhkdfdxmfczwekbxthosgxzzzxdujxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037933.566396-81-70611736971241/AnsiballZ_systemd.py'
Jan 21 23:25:33 compute-1 sudo[75014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:34 compute-1 python3.9[75016]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:25:34 compute-1 sudo[75014]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:34 compute-1 sudo[75167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuiwrduqbbjbsxfdxvuivecscwwjruff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037934.5162807-108-108209142640772/AnsiballZ_command.py'
Jan 21 23:25:34 compute-1 sudo[75167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:35 compute-1 python3.9[75169]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:25:35 compute-1 sudo[75167]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:35 compute-1 sudo[75320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htvdjzhsfaywjoflmecrvuntkbqtlmnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037935.4491913-132-125797870130748/AnsiballZ_stat.py'
Jan 21 23:25:35 compute-1 sudo[75320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:36 compute-1 python3.9[75322]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:25:36 compute-1 sudo[75320]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:36 compute-1 sudo[75474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnbabdzgodrwqvrwntmmvuzaxqdxdolj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037936.3170848-156-139690391304246/AnsiballZ_command.py'
Jan 21 23:25:36 compute-1 sudo[75474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:36 compute-1 python3.9[75476]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:25:36 compute-1 sudo[75474]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:37 compute-1 sudo[75629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yluyvgjxgwnmrcxwnsstbpdicjjajkqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037937.0947208-180-99823768337982/AnsiballZ_file.py'
Jan 21 23:25:37 compute-1 sudo[75629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:37 compute-1 python3.9[75631]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:37 compute-1 sudo[75629]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:38 compute-1 sshd-session[74556]: Connection closed by 192.168.122.30 port 57784
Jan 21 23:25:38 compute-1 sshd-session[74553]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:25:38 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Jan 21 23:25:38 compute-1 systemd[1]: session-17.scope: Consumed 4.585s CPU time.
Jan 21 23:25:38 compute-1 systemd-logind[796]: Session 17 logged out. Waiting for processes to exit.
Jan 21 23:25:38 compute-1 systemd-logind[796]: Removed session 17.
Jan 21 23:25:44 compute-1 sshd-session[75656]: Accepted publickey for zuul from 192.168.122.30 port 47630 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:25:44 compute-1 systemd-logind[796]: New session 18 of user zuul.
Jan 21 23:25:44 compute-1 systemd[1]: Started Session 18 of User zuul.
Jan 21 23:25:44 compute-1 sshd-session[75656]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:25:45 compute-1 python3.9[75809]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:25:45 compute-1 sudo[75963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eopgbtjnuwpezsibykhpzbbaqbyokgfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037945.687062-63-261293178456331/AnsiballZ_setup.py'
Jan 21 23:25:45 compute-1 sudo[75963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:46 compute-1 python3.9[75965]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:25:46 compute-1 sudo[75963]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:46 compute-1 sudo[76047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcqbtzywfqocxpodvqrneqysxigrzcet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037945.687062-63-261293178456331/AnsiballZ_dnf.py'
Jan 21 23:25:46 compute-1 sudo[76047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:47 compute-1 python3.9[76049]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:25:48 compute-1 sudo[76047]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:49 compute-1 python3.9[76200]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:25:50 compute-1 python3.9[76351]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:25:51 compute-1 python3.9[76501]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:25:52 compute-1 python3.9[76651]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:25:52 compute-1 sshd-session[75659]: Connection closed by 192.168.122.30 port 47630
Jan 21 23:25:52 compute-1 sshd-session[75656]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:25:52 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Jan 21 23:25:52 compute-1 systemd[1]: session-18.scope: Consumed 5.920s CPU time.
Jan 21 23:25:52 compute-1 systemd-logind[796]: Session 18 logged out. Waiting for processes to exit.
Jan 21 23:25:52 compute-1 systemd-logind[796]: Removed session 18.
Jan 21 23:25:58 compute-1 sshd-session[76676]: Accepted publickey for zuul from 192.168.122.30 port 37482 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:25:58 compute-1 systemd-logind[796]: New session 19 of user zuul.
Jan 21 23:25:58 compute-1 systemd[1]: Started Session 19 of User zuul.
Jan 21 23:25:58 compute-1 sshd-session[76676]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:25:59 compute-1 python3.9[76829]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:26:00 compute-1 sudo[76983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlndrhmjqamunxxvevdicfzkiymhicbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037960.3956182-110-96522005941256/AnsiballZ_file.py'
Jan 21 23:26:00 compute-1 sudo[76983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:01 compute-1 python3.9[76985]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:01 compute-1 sudo[76983]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:01 compute-1 sudo[77135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzxsqhllmknyjllokjfmkowqnxvzzqlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037961.2184083-110-194142180485254/AnsiballZ_file.py'
Jan 21 23:26:01 compute-1 sudo[77135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:01 compute-1 python3.9[77137]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:01 compute-1 sudo[77135]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:02 compute-1 sudo[77287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixhogqxsrnbbslaisrktcxsyhitpfjve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037961.8941634-153-266909261763104/AnsiballZ_stat.py'
Jan 21 23:26:02 compute-1 sudo[77287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:02 compute-1 python3.9[77289]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:02 compute-1 sudo[77287]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:03 compute-1 sudo[77410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgrookxczpgdxjdcxztunldpivbkmlkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037961.8941634-153-266909261763104/AnsiballZ_copy.py'
Jan 21 23:26:03 compute-1 sudo[77410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:03 compute-1 python3.9[77412]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037961.8941634-153-266909261763104/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=f3fbbcc5a7dd80076e21c54bc309f44ee6a201c6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:03 compute-1 sudo[77410]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:04 compute-1 sudo[77562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdyfxtcwoanzpagdvhoqvmnstutipbam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037963.9413474-153-229815418920887/AnsiballZ_stat.py'
Jan 21 23:26:04 compute-1 sudo[77562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:04 compute-1 python3.9[77564]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:04 compute-1 sudo[77562]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:04 compute-1 sudo[77685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esgdofpzewhyacgmycetyilysabjkbaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037963.9413474-153-229815418920887/AnsiballZ_copy.py'
Jan 21 23:26:04 compute-1 sudo[77685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:05 compute-1 python3.9[77687]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037963.9413474-153-229815418920887/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=9c03bcfa62361e5ef322801c360476a6187916b6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:05 compute-1 sudo[77685]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:05 compute-1 sudo[77837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsdqmbpglzympzeipghjhtrijuzhyzrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037965.235966-153-59664439903338/AnsiballZ_stat.py'
Jan 21 23:26:05 compute-1 sudo[77837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:05 compute-1 python3.9[77839]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:05 compute-1 sudo[77837]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:06 compute-1 sudo[77960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swghqidtzrulthzxpzkhhbyacamfqzix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037965.235966-153-59664439903338/AnsiballZ_copy.py'
Jan 21 23:26:06 compute-1 sudo[77960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:06 compute-1 python3.9[77962]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037965.235966-153-59664439903338/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=fbd0c343f551f0c63a20c30c8d6b8957393c1ded backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:06 compute-1 sudo[77960]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:06 compute-1 sudo[78112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eplowrbnnlxkslarcmkokcnnoaqsistm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037966.58792-301-60766527439115/AnsiballZ_file.py'
Jan 21 23:26:06 compute-1 sudo[78112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:07 compute-1 python3.9[78114]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:07 compute-1 sudo[78112]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:07 compute-1 sudo[78264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqvchyttsccdvlbsqmvwidtcykcliddi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037967.261508-301-50677112577599/AnsiballZ_file.py'
Jan 21 23:26:07 compute-1 sudo[78264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:07 compute-1 python3.9[78266]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:07 compute-1 sudo[78264]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:08 compute-1 sudo[78416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvbxgqnvfrwvotehebgjhxhmxapvvlsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037967.9621987-347-179036238731579/AnsiballZ_stat.py'
Jan 21 23:26:08 compute-1 sudo[78416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:08 compute-1 python3.9[78418]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:08 compute-1 sudo[78416]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:08 compute-1 sudo[78539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkzhderzwrrzjuwuyxorbkzffbyuvetm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037967.9621987-347-179036238731579/AnsiballZ_copy.py'
Jan 21 23:26:08 compute-1 sudo[78539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:08 compute-1 python3.9[78541]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037967.9621987-347-179036238731579/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=76840100a9092c0f88b87e2907ea3d2721a32fc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:09 compute-1 sudo[78539]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:09 compute-1 sudo[78691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oidjtbjbppabessjxbigyvnsekqjspup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037969.138897-347-207240818969297/AnsiballZ_stat.py'
Jan 21 23:26:09 compute-1 sudo[78691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:09 compute-1 python3.9[78693]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:09 compute-1 sudo[78691]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:10 compute-1 sudo[78814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kauuaubozupfpzlfcdizjemvuzyolghr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037969.138897-347-207240818969297/AnsiballZ_copy.py'
Jan 21 23:26:10 compute-1 sudo[78814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:10 compute-1 python3.9[78816]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037969.138897-347-207240818969297/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=d8fd7bb3e34b5ea059d1c8aca5209211b8d4078a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:10 compute-1 sudo[78814]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:10 compute-1 sudo[78966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txuntbqfdmsdzcqjxidhvyemmoapqinh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037970.341825-347-189884692419669/AnsiballZ_stat.py'
Jan 21 23:26:10 compute-1 sudo[78966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:10 compute-1 python3.9[78968]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:10 compute-1 sudo[78966]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:11 compute-1 sudo[79089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyjjkdlrkodeadpsolnratydzjlasdui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037970.341825-347-189884692419669/AnsiballZ_copy.py'
Jan 21 23:26:11 compute-1 sudo[79089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:11 compute-1 python3.9[79091]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037970.341825-347-189884692419669/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=9e5b439343944469cd1f447adfa97a7d89083026 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:11 compute-1 sudo[79089]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:11 compute-1 sudo[79241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvjsmlarrhywpgbixahxniyxoxxcutdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037971.713197-479-201618090342927/AnsiballZ_file.py'
Jan 21 23:26:11 compute-1 sudo[79241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:12 compute-1 python3.9[79243]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:12 compute-1 sudo[79241]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:12 compute-1 sudo[79393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weezmizewwptbozarkdusfhbcupkruaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037972.3717961-479-16866681028574/AnsiballZ_file.py'
Jan 21 23:26:12 compute-1 sudo[79393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:12 compute-1 python3.9[79395]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:12 compute-1 sudo[79393]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:13 compute-1 sudo[79545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpxtcwsywsvortgyppkwuvijdxhsbnme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037973.077518-525-216494745861472/AnsiballZ_stat.py'
Jan 21 23:26:13 compute-1 sudo[79545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:13 compute-1 python3.9[79547]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:13 compute-1 sudo[79545]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:13 compute-1 sudo[79668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbeaudhesurpaxsgxlarfombscssykjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037973.077518-525-216494745861472/AnsiballZ_copy.py'
Jan 21 23:26:13 compute-1 sudo[79668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:14 compute-1 python3.9[79670]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037973.077518-525-216494745861472/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=09d290f9a78e9d86131ce4724e6f3d260dbf3aff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:14 compute-1 sudo[79668]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:14 compute-1 sudo[79820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbnieheywrczrrpscxatfchfzkpyobzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037974.3083427-525-131106586520075/AnsiballZ_stat.py'
Jan 21 23:26:14 compute-1 sudo[79820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:14 compute-1 python3.9[79822]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:14 compute-1 sudo[79820]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:15 compute-1 sudo[79943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxfetpmtptlwafxafzkidszzgywmybnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037974.3083427-525-131106586520075/AnsiballZ_copy.py'
Jan 21 23:26:15 compute-1 sudo[79943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:15 compute-1 python3.9[79945]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037974.3083427-525-131106586520075/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=5ac53e5233bb5dc2a1a1ee89225b6d9cf54a324a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:15 compute-1 sudo[79943]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:15 compute-1 sudo[80095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eihefspqtbaatsamcixrrfcnbksxayes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037975.5964544-525-255694251873462/AnsiballZ_stat.py'
Jan 21 23:26:15 compute-1 sudo[80095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:16 compute-1 python3.9[80097]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:16 compute-1 sudo[80095]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:16 compute-1 sudo[80218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-belckzmhstxmsyrgjifzjhxhparihcnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037975.5964544-525-255694251873462/AnsiballZ_copy.py'
Jan 21 23:26:16 compute-1 sudo[80218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:16 compute-1 python3.9[80220]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037975.5964544-525-255694251873462/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=75f625a152379ab88def47f4b2c708385a145c84 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:16 compute-1 sudo[80218]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:17 compute-1 sudo[80370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prgqjnhmvccxdxjkgknbfivehvehdsuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037976.931571-660-165277884474697/AnsiballZ_file.py'
Jan 21 23:26:17 compute-1 sudo[80370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:17 compute-1 python3.9[80372]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:17 compute-1 sudo[80370]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:18 compute-1 sudo[80522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfgvjhypwbicwqdhetqpxytliszphkwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037977.6875184-660-189876258005986/AnsiballZ_file.py'
Jan 21 23:26:18 compute-1 sudo[80522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:18 compute-1 python3.9[80524]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:18 compute-1 sudo[80522]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:18 compute-1 chronyd[64393]: Selected source 23.159.16.194 (pool.ntp.org)
Jan 21 23:26:18 compute-1 sudo[80674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzcylwobtujjwgzsrplfcftxzibglcie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037978.418433-708-142034188954530/AnsiballZ_stat.py'
Jan 21 23:26:18 compute-1 sudo[80674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:18 compute-1 python3.9[80676]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:18 compute-1 sudo[80674]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:19 compute-1 sudo[80797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gultncppmuqctlgyhvnosogftkbyunbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037978.418433-708-142034188954530/AnsiballZ_copy.py'
Jan 21 23:26:19 compute-1 sudo[80797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:19 compute-1 python3.9[80799]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037978.418433-708-142034188954530/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=00a967c87c7f8cf1ed765f58d6e1fbd1175c9798 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:19 compute-1 sudo[80797]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:20 compute-1 sudo[80949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucxgqmurraxdmiqumflzfitjomztvsvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037979.894779-708-222695077765511/AnsiballZ_stat.py'
Jan 21 23:26:20 compute-1 sudo[80949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:20 compute-1 python3.9[80951]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:20 compute-1 sudo[80949]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:20 compute-1 sudo[81072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csizbxfnbmyfujohoxieintulktyabzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037979.894779-708-222695077765511/AnsiballZ_copy.py'
Jan 21 23:26:20 compute-1 sudo[81072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:20 compute-1 python3.9[81074]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037979.894779-708-222695077765511/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=5ac53e5233bb5dc2a1a1ee89225b6d9cf54a324a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:20 compute-1 sudo[81072]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:21 compute-1 sudo[81224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeeueokhfkrvacliqnyckvxazdnajwxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037981.0754058-708-199169216866052/AnsiballZ_stat.py'
Jan 21 23:26:21 compute-1 sudo[81224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:21 compute-1 python3.9[81226]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:21 compute-1 sudo[81224]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:21 compute-1 sudo[81347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpoprazthrnwjysswlvctbtpbzpqpgkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037981.0754058-708-199169216866052/AnsiballZ_copy.py'
Jan 21 23:26:21 compute-1 sudo[81347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:22 compute-1 python3.9[81349]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037981.0754058-708-199169216866052/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=c2001a194bbb3e874315c78df60577329f7d7773 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:22 compute-1 sudo[81347]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:23 compute-1 sudo[81499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbreabljiznevelxnxtljofyftiqtuqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037982.8234777-885-195699298531182/AnsiballZ_file.py'
Jan 21 23:26:23 compute-1 sudo[81499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:23 compute-1 python3.9[81501]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:23 compute-1 sudo[81499]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:23 compute-1 sudo[81651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utwvtuxzohsbqkcpwdtvyucryxymcgrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037983.503662-913-79649047290928/AnsiballZ_stat.py'
Jan 21 23:26:23 compute-1 sudo[81651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:23 compute-1 python3.9[81653]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:23 compute-1 sudo[81651]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:24 compute-1 sudo[81774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llrjrepysxjfeuhpiqjocfgkndsrlrgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037983.503662-913-79649047290928/AnsiballZ_copy.py'
Jan 21 23:26:24 compute-1 sudo[81774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:24 compute-1 python3.9[81776]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037983.503662-913-79649047290928/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:24 compute-1 sudo[81774]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:25 compute-1 sudo[81926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysxwgszjbtlqphpgxasiwlejagrjcnyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037984.7929397-959-194442370064920/AnsiballZ_file.py'
Jan 21 23:26:25 compute-1 sudo[81926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:25 compute-1 python3.9[81928]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:25 compute-1 sudo[81926]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:25 compute-1 sudo[82078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbgmvpyhbgvnwjmnmlrbidphrqnzsntk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037985.4468899-984-262489654271046/AnsiballZ_stat.py'
Jan 21 23:26:25 compute-1 sudo[82078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:25 compute-1 python3.9[82080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:25 compute-1 sudo[82078]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:26 compute-1 sudo[82201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvsgnqzbbgwxorsmhlyjaximleeweisl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037985.4468899-984-262489654271046/AnsiballZ_copy.py'
Jan 21 23:26:26 compute-1 sudo[82201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:26 compute-1 python3.9[82203]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037985.4468899-984-262489654271046/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:26 compute-1 sudo[82201]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:27 compute-1 sudo[82353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqvraqfasamlxpbgsitnnrxpfbxvmbqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037986.7809553-1030-142418078728670/AnsiballZ_file.py'
Jan 21 23:26:27 compute-1 sudo[82353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:27 compute-1 python3.9[82355]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:27 compute-1 sudo[82353]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:27 compute-1 sudo[82505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxngdxjsrpwdxoypgiwywsdvkozaykco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037987.4145458-1054-261443874181997/AnsiballZ_stat.py'
Jan 21 23:26:27 compute-1 sudo[82505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:27 compute-1 python3.9[82507]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:27 compute-1 sudo[82505]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:28 compute-1 sudo[82628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tisvvmqnqfivmgypackbnvvwvovjegrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037987.4145458-1054-261443874181997/AnsiballZ_copy.py'
Jan 21 23:26:28 compute-1 sudo[82628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:28 compute-1 python3.9[82630]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037987.4145458-1054-261443874181997/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:28 compute-1 sudo[82628]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:29 compute-1 sudo[82780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkxhbszfgtwhgbaypsaefptvuqxppjhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037988.826054-1100-202677307987421/AnsiballZ_file.py'
Jan 21 23:26:29 compute-1 sudo[82780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:29 compute-1 python3.9[82782]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:29 compute-1 sudo[82780]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:30 compute-1 sudo[82932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avoxgeyqdkhdrcyryhnhngquiatddxbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037989.7473311-1132-139741587784382/AnsiballZ_stat.py'
Jan 21 23:26:30 compute-1 sudo[82932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:30 compute-1 python3.9[82934]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:30 compute-1 sudo[82932]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:30 compute-1 sudo[83055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eofohuvtajnuerfrwuhcxuudhcctcday ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037989.7473311-1132-139741587784382/AnsiballZ_copy.py'
Jan 21 23:26:30 compute-1 sudo[83055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:30 compute-1 python3.9[83057]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037989.7473311-1132-139741587784382/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:30 compute-1 sudo[83055]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:31 compute-1 sudo[83207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnecgkapunlfpsfawcuxolyhalusvqsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037991.0511794-1181-124583542096375/AnsiballZ_file.py'
Jan 21 23:26:31 compute-1 sudo[83207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:31 compute-1 python3.9[83209]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:31 compute-1 sudo[83207]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:32 compute-1 sudo[83359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbmqlctiavoqdumgzuqrooqcfmzpotko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037991.8245976-1207-58933623897029/AnsiballZ_stat.py'
Jan 21 23:26:32 compute-1 sudo[83359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:32 compute-1 python3.9[83361]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:32 compute-1 sudo[83359]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:32 compute-1 sudo[83482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftkkmrdcwjhuzxetjfshkufltozfjaqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037991.8245976-1207-58933623897029/AnsiballZ_copy.py'
Jan 21 23:26:32 compute-1 sudo[83482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:32 compute-1 python3.9[83484]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037991.8245976-1207-58933623897029/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:32 compute-1 sudo[83482]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:33 compute-1 sudo[83634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixmpacfokljfwhbdcilopetlmiwiauge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037993.1478252-1255-257968849573262/AnsiballZ_file.py'
Jan 21 23:26:33 compute-1 sudo[83634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:33 compute-1 python3.9[83636]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:33 compute-1 sudo[83634]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:34 compute-1 sudo[83786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pujrdjzzyuavdcfnwxkgfavulyotvjyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037993.8531606-1280-49770902720199/AnsiballZ_stat.py'
Jan 21 23:26:34 compute-1 sudo[83786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:34 compute-1 python3.9[83788]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:34 compute-1 sudo[83786]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:34 compute-1 sudo[83909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjtojcvjtcpgdrzgclarrroabzarfgjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037993.8531606-1280-49770902720199/AnsiballZ_copy.py'
Jan 21 23:26:34 compute-1 sudo[83909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:35 compute-1 python3.9[83911]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037993.8531606-1280-49770902720199/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:35 compute-1 sudo[83909]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:35 compute-1 sudo[84061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjxioskxbibxhngjjdbriterjumcbrlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037995.2675598-1319-83693943197605/AnsiballZ_file.py'
Jan 21 23:26:35 compute-1 sudo[84061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:35 compute-1 python3.9[84063]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:35 compute-1 sudo[84061]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:36 compute-1 sudo[84213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqdnglvyqdwniuqzazftcoaxfyddrfph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037995.9852386-1336-122567269405560/AnsiballZ_stat.py'
Jan 21 23:26:36 compute-1 sudo[84213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:36 compute-1 python3.9[84215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:36 compute-1 sudo[84213]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:36 compute-1 sudo[84336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yotzjkuckuizksiwtmvbdspmjekinlvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037995.9852386-1336-122567269405560/AnsiballZ_copy.py'
Jan 21 23:26:36 compute-1 sudo[84336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:37 compute-1 python3.9[84338]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037995.9852386-1336-122567269405560/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:37 compute-1 sudo[84336]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:37 compute-1 sshd-session[76679]: Connection closed by 192.168.122.30 port 37482
Jan 21 23:26:37 compute-1 sshd-session[76676]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:26:37 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Jan 21 23:26:37 compute-1 systemd[1]: session-19.scope: Consumed 30.760s CPU time.
Jan 21 23:26:37 compute-1 systemd-logind[796]: Session 19 logged out. Waiting for processes to exit.
Jan 21 23:26:37 compute-1 systemd-logind[796]: Removed session 19.
Jan 21 23:26:43 compute-1 sshd-session[84363]: Accepted publickey for zuul from 192.168.122.30 port 54386 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:26:43 compute-1 systemd-logind[796]: New session 20 of user zuul.
Jan 21 23:26:43 compute-1 systemd[1]: Started Session 20 of User zuul.
Jan 21 23:26:43 compute-1 sshd-session[84363]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:26:44 compute-1 python3.9[84516]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:26:45 compute-1 sudo[84670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsbgbxqzqyewzeuyscfmxtweirdlkqmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038004.8235795-63-203326438125469/AnsiballZ_file.py'
Jan 21 23:26:45 compute-1 sudo[84670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:45 compute-1 python3.9[84672]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:45 compute-1 sudo[84670]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:45 compute-1 sudo[84822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aacyfelvglqqreerryhqotxlziqwygaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038005.6398187-63-135279008586324/AnsiballZ_file.py'
Jan 21 23:26:45 compute-1 sudo[84822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:46 compute-1 python3.9[84824]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:46 compute-1 sudo[84822]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:47 compute-1 python3.9[84974]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:26:47 compute-1 sudo[85124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkppyzcgorrtuiiumtlwztgmbgeztqdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038007.231714-132-85257635593104/AnsiballZ_seboolean.py'
Jan 21 23:26:47 compute-1 sudo[85124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:47 compute-1 python3.9[85126]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 21 23:26:49 compute-1 sudo[85124]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:50 compute-1 sudo[85280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtafodecstniazyspaejohrojjobewlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038009.7401135-162-152283495180203/AnsiballZ_setup.py'
Jan 21 23:26:50 compute-1 dbus-broker-launch[770]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 21 23:26:50 compute-1 sudo[85280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:50 compute-1 python3.9[85282]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:26:50 compute-1 sudo[85280]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:51 compute-1 sudo[85364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfqpcaruejewjpbwroobkoevbweqkqit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038009.7401135-162-152283495180203/AnsiballZ_dnf.py'
Jan 21 23:26:51 compute-1 sudo[85364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:51 compute-1 python3.9[85366]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:26:52 compute-1 sudo[85364]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:53 compute-1 sudo[85517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iewmganmwnabbdeikrqewjrqzixxdvdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038012.8467135-198-185404315637479/AnsiballZ_systemd.py'
Jan 21 23:26:53 compute-1 sudo[85517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:53 compute-1 python3.9[85519]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:26:53 compute-1 sudo[85517]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:54 compute-1 sudo[85672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndufkvkjobtnaowhmlrglazttioxlfns ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038014.0813735-222-203312212421647/AnsiballZ_edpm_nftables_snippet.py'
Jan 21 23:26:54 compute-1 sudo[85672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:54 compute-1 python3[85674]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 21 23:26:54 compute-1 sudo[85672]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:55 compute-1 sudo[85824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhngvqvdyodarbvhsirlnrphnouguvwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038015.1354344-249-216734280582450/AnsiballZ_file.py'
Jan 21 23:26:55 compute-1 sudo[85824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:55 compute-1 python3.9[85826]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:55 compute-1 sudo[85824]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:56 compute-1 sudo[85976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irhaukqkijahavzqrafdtlwitzjvense ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038015.8548117-273-198455973799088/AnsiballZ_stat.py'
Jan 21 23:26:56 compute-1 sudo[85976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:56 compute-1 python3.9[85978]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:56 compute-1 sudo[85976]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:56 compute-1 sudo[86054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywasmfpzmobexkwqqqbsdkngxijsprug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038015.8548117-273-198455973799088/AnsiballZ_file.py'
Jan 21 23:26:56 compute-1 sudo[86054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:56 compute-1 python3.9[86056]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:56 compute-1 sudo[86054]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:57 compute-1 sudo[86207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vanzosnflhxvexeoopbpwuaoyrvotylp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038017.2865634-309-95763456233459/AnsiballZ_stat.py'
Jan 21 23:26:57 compute-1 sudo[86207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:57 compute-1 python3.9[86209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:57 compute-1 sudo[86207]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:58 compute-1 sudo[86285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuqsbvwzsopvcyeeqgcmqilfcdhhszcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038017.2865634-309-95763456233459/AnsiballZ_file.py'
Jan 21 23:26:58 compute-1 sudo[86285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:58 compute-1 python3.9[86287]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=._bcunj4e recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:58 compute-1 sudo[86285]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:58 compute-1 sudo[86437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upnswlcltdreqdnlnqafscmzvxriwqzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038018.575051-345-139465126768298/AnsiballZ_stat.py'
Jan 21 23:26:58 compute-1 sudo[86437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:59 compute-1 python3.9[86439]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:59 compute-1 sudo[86437]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:59 compute-1 sudo[86515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sinxqjnduqemhdlducpuyfntevltnyvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038018.575051-345-139465126768298/AnsiballZ_file.py'
Jan 21 23:26:59 compute-1 sudo[86515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:59 compute-1 python3.9[86517]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:59 compute-1 sudo[86515]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:00 compute-1 sudo[86667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwqphkqrmytjxqmzosdlvqmjykdhvfmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038019.8955982-384-15905348724742/AnsiballZ_command.py'
Jan 21 23:27:00 compute-1 sudo[86667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:00 compute-1 python3.9[86669]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:00 compute-1 sudo[86667]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:01 compute-1 sudo[86820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urxigykhtahwvghosczrowpmnavopgmk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038020.8055801-408-280039806912308/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 23:27:01 compute-1 sudo[86820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:01 compute-1 python3[86822]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 23:27:01 compute-1 sudo[86820]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:02 compute-1 sudo[86972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meckpbnohlypivpqbtwuuxszcjxtmohx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038021.7288306-432-224923675699701/AnsiballZ_stat.py'
Jan 21 23:27:02 compute-1 sudo[86972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:02 compute-1 python3.9[86974]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:02 compute-1 sudo[86972]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:02 compute-1 sudo[87097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxcgtirolwzpfbtzawltanrwyvgxmmvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038021.7288306-432-224923675699701/AnsiballZ_copy.py'
Jan 21 23:27:02 compute-1 sudo[87097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:02 compute-1 python3.9[87099]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038021.7288306-432-224923675699701/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:02 compute-1 sudo[87097]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:03 compute-1 sudo[87249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihkhurmwejsjijreeogocjxxhsomlqsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038023.2418027-477-230989947793875/AnsiballZ_stat.py'
Jan 21 23:27:03 compute-1 sudo[87249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:03 compute-1 python3.9[87251]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:03 compute-1 sudo[87249]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:04 compute-1 sudo[87374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgjfvhcywkvmuvvjkdygeqwlxqwkwxsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038023.2418027-477-230989947793875/AnsiballZ_copy.py'
Jan 21 23:27:04 compute-1 sudo[87374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:04 compute-1 python3.9[87376]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038023.2418027-477-230989947793875/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:04 compute-1 sudo[87374]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:04 compute-1 sudo[87526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjweplexnndkybdzzgxuorwgnewqahjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038024.6565707-523-55692925731985/AnsiballZ_stat.py'
Jan 21 23:27:04 compute-1 sudo[87526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:05 compute-1 python3.9[87528]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:05 compute-1 sudo[87526]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:05 compute-1 sudo[87651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmkohqgvqdvfckzwuwruzvpiimrmesnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038024.6565707-523-55692925731985/AnsiballZ_copy.py'
Jan 21 23:27:05 compute-1 sudo[87651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:05 compute-1 python3.9[87653]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038024.6565707-523-55692925731985/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:05 compute-1 sudo[87651]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:06 compute-1 sudo[87803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovcjctbrlyshcmtvxohonybakconqnjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038026.0750012-567-33909078544853/AnsiballZ_stat.py'
Jan 21 23:27:06 compute-1 sudo[87803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:06 compute-1 python3.9[87805]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:06 compute-1 sudo[87803]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:07 compute-1 sudo[87928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymubivtxqcxbdipywwnwlsxewlrplfqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038026.0750012-567-33909078544853/AnsiballZ_copy.py'
Jan 21 23:27:07 compute-1 sudo[87928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:07 compute-1 python3.9[87930]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038026.0750012-567-33909078544853/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:07 compute-1 sudo[87928]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:07 compute-1 sudo[88080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heewccoxwwtafpdsnymsryuselrypxva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038027.5393472-612-88811137999894/AnsiballZ_stat.py'
Jan 21 23:27:07 compute-1 sudo[88080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:08 compute-1 python3.9[88082]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:08 compute-1 sudo[88080]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:08 compute-1 sudo[88205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzthyslovgnsnhkpzphxyyzwpfpfcfjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038027.5393472-612-88811137999894/AnsiballZ_copy.py'
Jan 21 23:27:08 compute-1 sudo[88205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:08 compute-1 python3.9[88207]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038027.5393472-612-88811137999894/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:08 compute-1 sudo[88205]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:09 compute-1 sudo[88357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njsanbgndaklnmwtxssmrhupscbisuol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038029.3169096-657-163747236454111/AnsiballZ_file.py'
Jan 21 23:27:09 compute-1 sudo[88357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:09 compute-1 python3.9[88359]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:09 compute-1 sudo[88357]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:10 compute-1 sudo[88509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsrjrnyfeylgkwvhspmzahqxneoehhmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038030.145191-681-160725905242197/AnsiballZ_command.py'
Jan 21 23:27:10 compute-1 sudo[88509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:10 compute-1 python3.9[88511]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:10 compute-1 sudo[88509]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:11 compute-1 sudo[88664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esrvsobsjizfdoscjqrqzadstchhtrxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038030.9186654-705-196468565786/AnsiballZ_blockinfile.py'
Jan 21 23:27:11 compute-1 sudo[88664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:11 compute-1 python3.9[88666]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:11 compute-1 sudo[88664]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:12 compute-1 sudo[88816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gubumtwunraoxbzphlnjhobfzzduvvtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038031.938839-732-104640895102336/AnsiballZ_command.py'
Jan 21 23:27:12 compute-1 sudo[88816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:12 compute-1 python3.9[88818]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:12 compute-1 sudo[88816]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:13 compute-1 sudo[88969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzpyevdzbyegemvcuzltpqcdjriewlte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038033.0258324-756-239496097277988/AnsiballZ_stat.py'
Jan 21 23:27:13 compute-1 sudo[88969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:13 compute-1 python3.9[88971]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:27:13 compute-1 sudo[88969]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:14 compute-1 sudo[89123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxilscnerrbsbbtxtkcazabozfsnyonv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038033.7742333-780-148936576321444/AnsiballZ_command.py'
Jan 21 23:27:14 compute-1 sudo[89123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:14 compute-1 python3.9[89125]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:14 compute-1 sudo[89123]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:14 compute-1 sudo[89278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifzgwqtozkumrslxwcppnntwhyehqdrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038034.5512702-804-161465900570099/AnsiballZ_file.py'
Jan 21 23:27:14 compute-1 sudo[89278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:15 compute-1 python3.9[89280]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:15 compute-1 sudo[89278]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:16 compute-1 python3.9[89430]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:27:17 compute-1 sudo[89581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdkjeosgrogizzfuoihmnmbsnlgokbwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038037.3132372-924-142902874457000/AnsiballZ_command.py'
Jan 21 23:27:17 compute-1 sudo[89581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:17 compute-1 python3.9[89583]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:17 compute-1 ovs-vsctl[89584]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 21 23:27:17 compute-1 sudo[89581]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:18 compute-1 sudo[89734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tivkkafujernrlvjaaayeitzeymxbknj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038038.1409366-951-103428105140206/AnsiballZ_command.py'
Jan 21 23:27:18 compute-1 sudo[89734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:18 compute-1 python3.9[89736]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:18 compute-1 sudo[89734]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:19 compute-1 sudo[89889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kirkjwywblodjqkmptgtzbobookrhwlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038038.9213774-975-46827375847113/AnsiballZ_command.py'
Jan 21 23:27:19 compute-1 sudo[89889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:19 compute-1 python3.9[89891]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:19 compute-1 ovs-vsctl[89892]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 21 23:27:19 compute-1 sudo[89889]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:20 compute-1 python3.9[90042]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:27:20 compute-1 sudo[90194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpqvaxieedahqjbuyrqudzthfkjplofo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038040.5467796-1026-278554898636846/AnsiballZ_file.py'
Jan 21 23:27:20 compute-1 sudo[90194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:21 compute-1 python3.9[90196]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:21 compute-1 sudo[90194]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:21 compute-1 sudo[90346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phrgzyhuvcggqgtuxvtlcpfylgunyelf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038041.3036823-1050-10720780762904/AnsiballZ_stat.py'
Jan 21 23:27:21 compute-1 sudo[90346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:21 compute-1 python3.9[90348]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:21 compute-1 sudo[90346]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:22 compute-1 sudo[90424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcbbhocfbsbquaxzpfiuhkrlhczwocje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038041.3036823-1050-10720780762904/AnsiballZ_file.py'
Jan 21 23:27:22 compute-1 sudo[90424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:22 compute-1 python3.9[90426]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:22 compute-1 sudo[90424]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:22 compute-1 sudo[90576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eesrovcivyrrsfhkazirecebjqgexcap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038042.4939795-1050-42686043344378/AnsiballZ_stat.py'
Jan 21 23:27:22 compute-1 sudo[90576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:22 compute-1 python3.9[90578]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:22 compute-1 sudo[90576]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:23 compute-1 sudo[90654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgqhmdkeikorsteznedocqkswwvveywc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038042.4939795-1050-42686043344378/AnsiballZ_file.py'
Jan 21 23:27:23 compute-1 sudo[90654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:23 compute-1 python3.9[90656]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:23 compute-1 sudo[90654]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:24 compute-1 sudo[90806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kycyrfucvbvvsojpnthbmffdcbtsdzad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038043.6689513-1119-156993447090499/AnsiballZ_file.py'
Jan 21 23:27:24 compute-1 sudo[90806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:24 compute-1 python3.9[90808]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:24 compute-1 sudo[90806]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:24 compute-1 sudo[90958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiiaazcbijgblcuwrpmdctbigmsbxycm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038044.440747-1143-81350448685996/AnsiballZ_stat.py'
Jan 21 23:27:24 compute-1 sudo[90958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:25 compute-1 python3.9[90960]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:25 compute-1 sudo[90958]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:25 compute-1 sudo[91036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nabghhmrlkexorytiskjyygwkcpkakcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038044.440747-1143-81350448685996/AnsiballZ_file.py'
Jan 21 23:27:25 compute-1 sudo[91036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:25 compute-1 python3.9[91038]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:25 compute-1 sudo[91036]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:25 compute-1 sudo[91188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nomvkqazquvtivtwsumqaxkgqjrmkzex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038045.6904573-1179-246489399453635/AnsiballZ_stat.py'
Jan 21 23:27:25 compute-1 sudo[91188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:26 compute-1 python3.9[91190]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:26 compute-1 sudo[91188]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:26 compute-1 sudo[91266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwopcmdkfzfdriacgiomreeajpixntlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038045.6904573-1179-246489399453635/AnsiballZ_file.py'
Jan 21 23:27:26 compute-1 sudo[91266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:26 compute-1 python3.9[91268]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:26 compute-1 sudo[91266]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:27 compute-1 sudo[91418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpbmcqzrvcelcffovqkpdzlupeyjedfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038046.9115326-1215-231455486281019/AnsiballZ_systemd.py'
Jan 21 23:27:27 compute-1 sudo[91418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:27 compute-1 python3.9[91420]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:27:27 compute-1 systemd[1]: Reloading.
Jan 21 23:27:27 compute-1 systemd-rc-local-generator[91447]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:27:27 compute-1 systemd-sysv-generator[91451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:27:27 compute-1 sudo[91418]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:28 compute-1 sudo[91607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iynjlhcykugjctfgxklvdcrszjablgur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038048.099229-1239-240734750173640/AnsiballZ_stat.py'
Jan 21 23:27:28 compute-1 sudo[91607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:28 compute-1 python3.9[91609]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:28 compute-1 sudo[91607]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:28 compute-1 sudo[91685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcidesdnvenaybngtodtyqdqlqmrpzgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038048.099229-1239-240734750173640/AnsiballZ_file.py'
Jan 21 23:27:28 compute-1 sudo[91685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:29 compute-1 python3.9[91687]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:29 compute-1 sudo[91685]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:29 compute-1 sudo[91837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdfyjqhgrnykfffyjbfsamccrfjqyodj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038049.4041605-1275-152849562508490/AnsiballZ_stat.py'
Jan 21 23:27:29 compute-1 sudo[91837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:29 compute-1 python3.9[91839]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:29 compute-1 sudo[91837]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:30 compute-1 sudo[91915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuhwsrnpqonftwfxezkzyalddtgimzrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038049.4041605-1275-152849562508490/AnsiballZ_file.py'
Jan 21 23:27:30 compute-1 sudo[91915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:30 compute-1 python3.9[91917]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:30 compute-1 sudo[91915]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:31 compute-1 sudo[92067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plasuvxuchrbppnasghpzzjfmwxczfnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038050.691665-1311-84101747526313/AnsiballZ_systemd.py'
Jan 21 23:27:31 compute-1 sudo[92067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:31 compute-1 python3.9[92069]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:27:31 compute-1 systemd[1]: Reloading.
Jan 21 23:27:31 compute-1 systemd-sysv-generator[92099]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:27:31 compute-1 systemd-rc-local-generator[92093]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:27:31 compute-1 systemd[1]: Starting Create netns directory...
Jan 21 23:27:31 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 23:27:31 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 23:27:31 compute-1 systemd[1]: Finished Create netns directory.
Jan 21 23:27:31 compute-1 sudo[92067]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:32 compute-1 sudo[92259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qakttnzhgkfehwzcymosxtirmvnqdezu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038052.0575423-1341-119930080417196/AnsiballZ_file.py'
Jan 21 23:27:32 compute-1 sudo[92259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:32 compute-1 python3.9[92261]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:32 compute-1 sudo[92259]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:33 compute-1 sudo[92411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsblwjzmminjdfbqhclzfffuqbqlensf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038052.7577422-1365-26065510168550/AnsiballZ_stat.py'
Jan 21 23:27:33 compute-1 sudo[92411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:33 compute-1 python3.9[92413]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:33 compute-1 sudo[92411]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:33 compute-1 sudo[92534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iorzchojqmtkntkerpxfeshthxzdixlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038052.7577422-1365-26065510168550/AnsiballZ_copy.py'
Jan 21 23:27:33 compute-1 sudo[92534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:33 compute-1 python3.9[92536]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038052.7577422-1365-26065510168550/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:33 compute-1 sudo[92534]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:34 compute-1 sudo[92686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egwehlpmjghitakkfphnygzcdcoucbws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038054.2385182-1416-179871605597241/AnsiballZ_file.py'
Jan 21 23:27:34 compute-1 sudo[92686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:34 compute-1 python3.9[92688]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:34 compute-1 sudo[92686]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:35 compute-1 sudo[92838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppvsuskedtqrllrwrfggemviajpmivei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038054.9859047-1440-136973021671183/AnsiballZ_file.py'
Jan 21 23:27:35 compute-1 sudo[92838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:35 compute-1 python3.9[92840]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:35 compute-1 sudo[92838]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:36 compute-1 sudo[92990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiydihqmljwrcswpcvyjzcbjdisxbvjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038055.7750793-1464-72291148075766/AnsiballZ_stat.py'
Jan 21 23:27:36 compute-1 sudo[92990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:36 compute-1 python3.9[92992]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:36 compute-1 sudo[92990]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:36 compute-1 sudo[93113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzlwrxcxphzdeuqklzcfvplswcpphqvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038055.7750793-1464-72291148075766/AnsiballZ_copy.py'
Jan 21 23:27:36 compute-1 sudo[93113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:37 compute-1 python3.9[93115]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038055.7750793-1464-72291148075766/.source.json _original_basename=.e0336vhb follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:37 compute-1 sudo[93113]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:37 compute-1 python3.9[93265]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:39 compute-1 sudo[93686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxxkyiicrdyixhoobakqcwwkfiutbwgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038059.480423-1584-238921413139989/AnsiballZ_container_config_data.py'
Jan 21 23:27:39 compute-1 sudo[93686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:40 compute-1 python3.9[93688]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 21 23:27:40 compute-1 sudo[93686]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:41 compute-1 sudo[93838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlcfrhmltvfdpsrugkjaijripirvycxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038060.5803301-1617-247772240133391/AnsiballZ_container_config_hash.py'
Jan 21 23:27:41 compute-1 sudo[93838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:41 compute-1 python3.9[93840]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:27:41 compute-1 sudo[93838]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:42 compute-1 sudo[93990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gazhdhhlqjuytyyhccbpeoienowidazl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038061.6804986-1647-92390243784985/AnsiballZ_edpm_container_manage.py'
Jan 21 23:27:42 compute-1 sudo[93990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:42 compute-1 python3[93992]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:27:42 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:27:42 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:27:42 compute-1 podman[94027]: 2026-01-21 23:27:42.738237208 +0000 UTC m=+0.050860932 container create 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 21 23:27:42 compute-1 podman[94027]: 2026-01-21 23:27:42.709599971 +0000 UTC m=+0.022223725 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 23:27:42 compute-1 python3[93992]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 23:27:42 compute-1 sudo[93990]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:43 compute-1 sudo[94215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oirunbritolwupfzvizhtduzunlvjktr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038063.1108222-1671-236824402750871/AnsiballZ_stat.py'
Jan 21 23:27:43 compute-1 sudo[94215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:43 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:27:43 compute-1 python3.9[94217]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:27:43 compute-1 sudo[94215]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:44 compute-1 sudo[94369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfohhghrafzlszfcmeykghauupvlcwjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038063.9340594-1698-2565258562360/AnsiballZ_file.py'
Jan 21 23:27:44 compute-1 sudo[94369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:44 compute-1 python3.9[94371]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:44 compute-1 sudo[94369]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:44 compute-1 sudo[94445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgqhbowjvtqzblnqhiarfiviasqjsmyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038063.9340594-1698-2565258562360/AnsiballZ_stat.py'
Jan 21 23:27:44 compute-1 sudo[94445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:44 compute-1 python3.9[94447]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:27:44 compute-1 sudo[94445]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:45 compute-1 sudo[94596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agzcmeswlpqochxadrmcldyifpgfpdnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038065.0372927-1698-18766839308540/AnsiballZ_copy.py'
Jan 21 23:27:45 compute-1 sudo[94596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:45 compute-1 python3.9[94598]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038065.0372927-1698-18766839308540/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:45 compute-1 sudo[94596]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:46 compute-1 sudo[94672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzrmgjimjndukhgjqeinejojjrngrtlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038065.0372927-1698-18766839308540/AnsiballZ_systemd.py'
Jan 21 23:27:46 compute-1 sudo[94672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:46 compute-1 python3.9[94674]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:27:46 compute-1 systemd[1]: Reloading.
Jan 21 23:27:46 compute-1 systemd-rc-local-generator[94702]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:27:46 compute-1 systemd-sysv-generator[94705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:27:46 compute-1 sudo[94672]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:46 compute-1 sudo[94783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paazffytooxhgywzsijmdffqiccbgupz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038065.0372927-1698-18766839308540/AnsiballZ_systemd.py'
Jan 21 23:27:46 compute-1 sudo[94783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:47 compute-1 python3.9[94785]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:27:47 compute-1 systemd[1]: Reloading.
Jan 21 23:27:47 compute-1 systemd-rc-local-generator[94816]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:27:47 compute-1 systemd-sysv-generator[94820]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:27:47 compute-1 systemd[1]: Starting ovn_controller container...
Jan 21 23:27:47 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 21 23:27:47 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:27:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8033d52be4ebbeb250cfc6ecd644ea0488127b52da2b1ba79dc5348ff7dc37c8/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 21 23:27:47 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2.
Jan 21 23:27:47 compute-1 podman[94826]: 2026-01-21 23:27:47.717599289 +0000 UTC m=+0.147144261 container init 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 21 23:27:47 compute-1 ovn_controller[94841]: + sudo -E kolla_set_configs
Jan 21 23:27:47 compute-1 podman[94826]: 2026-01-21 23:27:47.744111298 +0000 UTC m=+0.173656240 container start 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 23:27:47 compute-1 edpm-start-podman-container[94826]: ovn_controller
Jan 21 23:27:47 compute-1 systemd[1]: Created slice User Slice of UID 0.
Jan 21 23:27:47 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 21 23:27:47 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 21 23:27:47 compute-1 systemd[1]: Starting User Manager for UID 0...
Jan 21 23:27:47 compute-1 edpm-start-podman-container[94825]: Creating additional drop-in dependency for "ovn_controller" (1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2)
Jan 21 23:27:47 compute-1 systemd[94875]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 21 23:27:47 compute-1 systemd[1]: Reloading.
Jan 21 23:27:47 compute-1 podman[94847]: 2026-01-21 23:27:47.869388496 +0000 UTC m=+0.108767765 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:27:47 compute-1 systemd-rc-local-generator[94926]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:27:47 compute-1 systemd-sysv-generator[94929]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:27:47 compute-1 systemd[94875]: Queued start job for default target Main User Target.
Jan 21 23:27:47 compute-1 systemd[94875]: Created slice User Application Slice.
Jan 21 23:27:47 compute-1 systemd[94875]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 21 23:27:47 compute-1 systemd[94875]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:27:47 compute-1 systemd[94875]: Reached target Paths.
Jan 21 23:27:47 compute-1 systemd[94875]: Reached target Timers.
Jan 21 23:27:47 compute-1 systemd[94875]: Starting D-Bus User Message Bus Socket...
Jan 21 23:27:48 compute-1 systemd[94875]: Starting Create User's Volatile Files and Directories...
Jan 21 23:27:48 compute-1 systemd[94875]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:27:48 compute-1 systemd[94875]: Reached target Sockets.
Jan 21 23:27:48 compute-1 systemd[94875]: Finished Create User's Volatile Files and Directories.
Jan 21 23:27:48 compute-1 systemd[94875]: Reached target Basic System.
Jan 21 23:27:48 compute-1 systemd[94875]: Reached target Main User Target.
Jan 21 23:27:48 compute-1 systemd[94875]: Startup finished in 160ms.
Jan 21 23:27:48 compute-1 systemd[1]: Started User Manager for UID 0.
Jan 21 23:27:48 compute-1 systemd[1]: Started ovn_controller container.
Jan 21 23:27:48 compute-1 systemd[1]: 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2-4f470c94ea8abce9.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 23:27:48 compute-1 systemd[1]: 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2-4f470c94ea8abce9.service: Failed with result 'exit-code'.
Jan 21 23:27:48 compute-1 systemd[1]: Started Session c1 of User root.
Jan 21 23:27:48 compute-1 sudo[94783]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:48 compute-1 ovn_controller[94841]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 23:27:48 compute-1 ovn_controller[94841]: INFO:__main__:Validating config file
Jan 21 23:27:48 compute-1 ovn_controller[94841]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 23:27:48 compute-1 ovn_controller[94841]: INFO:__main__:Writing out command to execute
Jan 21 23:27:48 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 21 23:27:48 compute-1 ovn_controller[94841]: ++ cat /run_command
Jan 21 23:27:48 compute-1 ovn_controller[94841]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 21 23:27:48 compute-1 ovn_controller[94841]: + ARGS=
Jan 21 23:27:48 compute-1 ovn_controller[94841]: + sudo kolla_copy_cacerts
Jan 21 23:27:48 compute-1 systemd[1]: Started Session c2 of User root.
Jan 21 23:27:48 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 21 23:27:48 compute-1 ovn_controller[94841]: + [[ ! -n '' ]]
Jan 21 23:27:48 compute-1 ovn_controller[94841]: + . kolla_extend_start
Jan 21 23:27:48 compute-1 ovn_controller[94841]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 21 23:27:48 compute-1 ovn_controller[94841]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 21 23:27:48 compute-1 ovn_controller[94841]: + umask 0022
Jan 21 23:27:48 compute-1 ovn_controller[94841]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 21 23:27:48 compute-1 NetworkManager[54952]: <info>  [1769038068.3111] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 21 23:27:48 compute-1 NetworkManager[54952]: <info>  [1769038068.3121] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:27:48 compute-1 NetworkManager[54952]: <warn>  [1769038068.3126] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:27:48 compute-1 NetworkManager[54952]: <info>  [1769038068.3140] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 21 23:27:48 compute-1 NetworkManager[54952]: <info>  [1769038068.3151] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 21 23:27:48 compute-1 NetworkManager[54952]: <info>  [1769038068.3157] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 21 23:27:48 compute-1 kernel: br-int: entered promiscuous mode
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00019|main|INFO|OVS feature set changed, force recompute.
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 23:27:48 compute-1 ovn_controller[94841]: 2026-01-21T23:27:48Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 23:27:48 compute-1 NetworkManager[54952]: <info>  [1769038068.3356] manager: (ovn-ce4b29-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 21 23:27:48 compute-1 NetworkManager[54952]: <info>  [1769038068.3369] manager: (ovn-7f404a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Jan 21 23:27:48 compute-1 NetworkManager[54952]: <info>  [1769038068.3385] manager: (ovn-f0bd48-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 21 23:27:48 compute-1 systemd-udevd[94974]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:27:48 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Jan 21 23:27:48 compute-1 systemd-udevd[94976]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:27:48 compute-1 NetworkManager[54952]: <info>  [1769038068.3530] device (genev_sys_6081): carrier: link connected
Jan 21 23:27:48 compute-1 NetworkManager[54952]: <info>  [1769038068.3536] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Jan 21 23:27:49 compute-1 python3.9[95104]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:27:50 compute-1 sudo[95254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsijyilmveotkppbzpmvyadbmabuluom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038070.0499973-1833-16048719101304/AnsiballZ_stat.py'
Jan 21 23:27:50 compute-1 sudo[95254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:50 compute-1 python3.9[95256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:50 compute-1 sudo[95254]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:51 compute-1 sudo[95377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnfigllvqsaxmqfpfjfeojtxuglguioh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038070.0499973-1833-16048719101304/AnsiballZ_copy.py'
Jan 21 23:27:51 compute-1 sudo[95377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:51 compute-1 python3.9[95379]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038070.0499973-1833-16048719101304/.source.yaml _original_basename=.5z6omll7 follow=False checksum=1fa0f89c2313d90a3d28193c1cbb0dd87b38dad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:51 compute-1 sudo[95377]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:51 compute-1 sudo[95529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njyjyxbakpmupiyougkotuwwwweqqxkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038071.4700131-1878-107182533883811/AnsiballZ_command.py'
Jan 21 23:27:51 compute-1 sudo[95529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:52 compute-1 python3.9[95531]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:52 compute-1 ovs-vsctl[95532]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 21 23:27:52 compute-1 sudo[95529]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:52 compute-1 sudo[95682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcgfwzxabfwuleiwmnwjizrbsofynxvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038072.2476642-1902-164067240967284/AnsiballZ_command.py'
Jan 21 23:27:52 compute-1 sudo[95682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:52 compute-1 python3.9[95684]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:52 compute-1 ovs-vsctl[95686]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 21 23:27:52 compute-1 sudo[95682]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:53 compute-1 sudo[95838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fleuslxlfmazetleyscgtaqkmaeflrcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038073.4495523-1944-119008626278150/AnsiballZ_command.py'
Jan 21 23:27:53 compute-1 sudo[95838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:53 compute-1 python3.9[95840]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:53 compute-1 ovs-vsctl[95841]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 21 23:27:53 compute-1 sudo[95838]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:54 compute-1 sshd-session[84366]: Connection closed by 192.168.122.30 port 54386
Jan 21 23:27:54 compute-1 sshd-session[84363]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:27:54 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Jan 21 23:27:54 compute-1 systemd[1]: session-20.scope: Consumed 51.395s CPU time.
Jan 21 23:27:54 compute-1 systemd-logind[796]: Session 20 logged out. Waiting for processes to exit.
Jan 21 23:27:54 compute-1 systemd-logind[796]: Removed session 20.
Jan 21 23:27:58 compute-1 systemd[1]: Stopping User Manager for UID 0...
Jan 21 23:27:58 compute-1 systemd[94875]: Activating special unit Exit the Session...
Jan 21 23:27:58 compute-1 systemd[94875]: Stopped target Main User Target.
Jan 21 23:27:58 compute-1 systemd[94875]: Stopped target Basic System.
Jan 21 23:27:58 compute-1 systemd[94875]: Stopped target Paths.
Jan 21 23:27:58 compute-1 systemd[94875]: Stopped target Sockets.
Jan 21 23:27:58 compute-1 systemd[94875]: Stopped target Timers.
Jan 21 23:27:58 compute-1 systemd[94875]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:27:58 compute-1 systemd[94875]: Closed D-Bus User Message Bus Socket.
Jan 21 23:27:58 compute-1 systemd[94875]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:27:58 compute-1 systemd[94875]: Removed slice User Application Slice.
Jan 21 23:27:58 compute-1 systemd[94875]: Reached target Shutdown.
Jan 21 23:27:58 compute-1 systemd[94875]: Finished Exit the Session.
Jan 21 23:27:58 compute-1 systemd[94875]: Reached target Exit the Session.
Jan 21 23:27:58 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Jan 21 23:27:58 compute-1 systemd[1]: Stopped User Manager for UID 0.
Jan 21 23:27:58 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 21 23:27:58 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 21 23:27:58 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 21 23:27:58 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 21 23:27:58 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Jan 21 23:27:59 compute-1 sshd-session[95867]: Accepted publickey for zuul from 192.168.122.30 port 42850 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:27:59 compute-1 systemd-logind[796]: New session 22 of user zuul.
Jan 21 23:27:59 compute-1 systemd[1]: Started Session 22 of User zuul.
Jan 21 23:27:59 compute-1 sshd-session[95867]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:28:01 compute-1 python3.9[96020]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:28:02 compute-1 sudo[96174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkmyqyeupdluwihmypovchektyhpzgjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038081.9625459-63-65207150317665/AnsiballZ_file.py'
Jan 21 23:28:02 compute-1 sudo[96174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:02 compute-1 python3.9[96176]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:02 compute-1 sudo[96174]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:03 compute-1 sudo[96326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtffwqysxppgiixzhjczcnbpvbaasmjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038082.8375812-63-16348482488592/AnsiballZ_file.py'
Jan 21 23:28:03 compute-1 sudo[96326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:03 compute-1 python3.9[96328]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:03 compute-1 sudo[96326]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:03 compute-1 sudo[96478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owbrtikugpanfcskkfmapboxfxhsympn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038083.571667-63-72355642297998/AnsiballZ_file.py'
Jan 21 23:28:03 compute-1 sudo[96478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:04 compute-1 python3.9[96480]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:04 compute-1 sudo[96478]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:04 compute-1 sudo[96630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghusyzlnlrerkhiwhwfijhcwpmuhbfxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038084.1960566-63-156886626042188/AnsiballZ_file.py'
Jan 21 23:28:04 compute-1 sudo[96630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:04 compute-1 python3.9[96632]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:04 compute-1 sudo[96630]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:05 compute-1 sudo[96782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjsdtnfqsgqedydbqcovjtdxbzdgkoyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038084.8629777-63-44606174714760/AnsiballZ_file.py'
Jan 21 23:28:05 compute-1 sudo[96782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:05 compute-1 python3.9[96784]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:05 compute-1 sudo[96782]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:06 compute-1 python3.9[96934]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:28:06 compute-1 sudo[97085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jegnqidvjukwhnpwmwvpzocqxnbzseeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038086.4644613-195-189750681832020/AnsiballZ_seboolean.py'
Jan 21 23:28:06 compute-1 sudo[97085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:07 compute-1 python3.9[97087]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 21 23:28:07 compute-1 sudo[97085]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:08 compute-1 python3.9[97237]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:09 compute-1 python3.9[97358]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038087.9856756-219-20709578748971/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:10 compute-1 python3.9[97508]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:10 compute-1 python3.9[97629]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038089.6145062-264-161936846684472/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:11 compute-1 sudo[97779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnzxoqyrcgmcmdwrqcckilybcifccuor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038091.1249115-315-48980134449605/AnsiballZ_setup.py'
Jan 21 23:28:11 compute-1 sudo[97779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:11 compute-1 python3.9[97781]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:28:12 compute-1 sudo[97779]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:12 compute-1 sudo[97863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhjanpxxkvrglmqgvdisdmmmzmyughcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038091.1249115-315-48980134449605/AnsiballZ_dnf.py'
Jan 21 23:28:12 compute-1 sudo[97863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:12 compute-1 python3.9[97865]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:28:14 compute-1 sudo[97863]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:15 compute-1 sudo[98016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vorbzrxbeiyflrogfuueccuxrfxgxnpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038094.638872-351-56719368701364/AnsiballZ_systemd.py'
Jan 21 23:28:15 compute-1 sudo[98016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:15 compute-1 python3.9[98018]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:28:15 compute-1 sudo[98016]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:16 compute-1 python3.9[98171]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:17 compute-1 python3.9[98292]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038095.9797432-375-159732886251742/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:17 compute-1 python3.9[98442]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:18 compute-1 python3.9[98563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038097.2045765-375-276695010009265/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:18 compute-1 ovn_controller[94841]: 2026-01-21T23:28:18Z|00025|memory|INFO|16128 kB peak resident set size after 30.1 seconds
Jan 21 23:28:18 compute-1 ovn_controller[94841]: 2026-01-21T23:28:18Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 21 23:28:18 compute-1 podman[98564]: 2026-01-21 23:28:18.464966473 +0000 UTC m=+0.153324696 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 23:28:19 compute-1 python3.9[98738]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:20 compute-1 python3.9[98859]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038099.044791-507-180708748705532/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:20 compute-1 python3.9[99009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:21 compute-1 python3.9[99130]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038100.3322353-507-135011828749114/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:22 compute-1 python3.9[99280]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:28:22 compute-1 sudo[99432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyvmtejpfmwgkvdpsolndbljhpzdfdns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038102.6135025-621-21889483067576/AnsiballZ_file.py'
Jan 21 23:28:22 compute-1 sudo[99432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:23 compute-1 python3.9[99434]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:23 compute-1 sudo[99432]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:23 compute-1 sudo[99584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuhrxidhvmsvqodlqzjwzqhjdgqkzjac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038103.3758311-645-281185712316387/AnsiballZ_stat.py'
Jan 21 23:28:23 compute-1 sudo[99584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:23 compute-1 python3.9[99586]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:23 compute-1 sudo[99584]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:24 compute-1 sudo[99662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brapcoplvfmvoyjblsmozpqftulvnigw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038103.3758311-645-281185712316387/AnsiballZ_file.py'
Jan 21 23:28:24 compute-1 sudo[99662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:24 compute-1 python3.9[99664]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:24 compute-1 sudo[99662]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:24 compute-1 sudo[99814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiskfwskzpvdhlmwteosguvquxhqczfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038104.5343285-645-223355793564754/AnsiballZ_stat.py'
Jan 21 23:28:24 compute-1 sudo[99814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:25 compute-1 python3.9[99816]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:25 compute-1 sudo[99814]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:25 compute-1 sudo[99892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tozuekmxfibesdzeqjtqmmbpdpjxwxit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038104.5343285-645-223355793564754/AnsiballZ_file.py'
Jan 21 23:28:25 compute-1 sudo[99892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:25 compute-1 python3.9[99894]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:25 compute-1 sudo[99892]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:26 compute-1 sudo[100044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxmmzftgypuywxfsvzodqxztybnwixof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038105.847763-714-216580631169384/AnsiballZ_file.py'
Jan 21 23:28:26 compute-1 sudo[100044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:26 compute-1 python3.9[100046]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:26 compute-1 sudo[100044]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:26 compute-1 sudo[100196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnxxstxdhnfgoseqmmehxkgqahcsrcbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038106.5989647-738-54703169411679/AnsiballZ_stat.py'
Jan 21 23:28:26 compute-1 sudo[100196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:27 compute-1 python3.9[100198]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:27 compute-1 sudo[100196]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:27 compute-1 sudo[100274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clgtxalbxfmrsdxqdshwikoddhdrinnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038106.5989647-738-54703169411679/AnsiballZ_file.py'
Jan 21 23:28:27 compute-1 sudo[100274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:27 compute-1 python3.9[100276]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:27 compute-1 sudo[100274]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:28 compute-1 sudo[100426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iebrrlkxdykpiyzojhhkhtwzynkimevb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038107.8608873-774-277696446464933/AnsiballZ_stat.py'
Jan 21 23:28:28 compute-1 sudo[100426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:28 compute-1 python3.9[100428]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:28 compute-1 sudo[100426]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:28 compute-1 sudo[100504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bicwxkrxouxsdwoigvcitbplnlcoodyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038107.8608873-774-277696446464933/AnsiballZ_file.py'
Jan 21 23:28:28 compute-1 sudo[100504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:28 compute-1 python3.9[100506]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:28 compute-1 sudo[100504]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:29 compute-1 sudo[100656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkslojjyikovfphkepicazwzsykpqhgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038109.103992-810-260847474811144/AnsiballZ_systemd.py'
Jan 21 23:28:29 compute-1 sudo[100656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:30 compute-1 python3.9[100658]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:28:30 compute-1 systemd[1]: Reloading.
Jan 21 23:28:30 compute-1 systemd-rc-local-generator[100689]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:28:30 compute-1 systemd-sysv-generator[100693]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:28:30 compute-1 sudo[100656]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:30 compute-1 sudo[100846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcyhoffoovgvefgxosoyljuevroejvhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038110.5842452-834-190268173840863/AnsiballZ_stat.py'
Jan 21 23:28:30 compute-1 sudo[100846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:31 compute-1 python3.9[100848]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:31 compute-1 sudo[100846]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:31 compute-1 sudo[100924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kafoxhevsdmpvgpstsbbmnsqzzardtpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038110.5842452-834-190268173840863/AnsiballZ_file.py'
Jan 21 23:28:31 compute-1 sudo[100924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:31 compute-1 python3.9[100926]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:31 compute-1 sudo[100924]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:32 compute-1 sudo[101076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfegalihtkarbcqkfapfqpcvlewasyrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038111.8124983-870-232757752334809/AnsiballZ_stat.py'
Jan 21 23:28:32 compute-1 sudo[101076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:32 compute-1 python3.9[101078]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:32 compute-1 sudo[101076]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:32 compute-1 sudo[101154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeqotgoobjlulqtioaftvukiunndjzug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038111.8124983-870-232757752334809/AnsiballZ_file.py'
Jan 21 23:28:32 compute-1 sudo[101154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:32 compute-1 python3.9[101156]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:32 compute-1 sudo[101154]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:33 compute-1 sudo[101306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiprybwwglkcncjkqgmnlnkslphurdvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038113.1172893-906-29908167914713/AnsiballZ_systemd.py'
Jan 21 23:28:33 compute-1 sudo[101306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:33 compute-1 python3.9[101308]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:28:33 compute-1 systemd[1]: Reloading.
Jan 21 23:28:33 compute-1 systemd-sysv-generator[101333]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:28:33 compute-1 systemd-rc-local-generator[101327]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:28:34 compute-1 systemd[1]: Starting Create netns directory...
Jan 21 23:28:34 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 23:28:34 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 23:28:34 compute-1 systemd[1]: Finished Create netns directory.
Jan 21 23:28:34 compute-1 sudo[101306]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:34 compute-1 sudo[101501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-titcoirblmefvjdyjsrhmxultcfikbqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038114.458741-936-18506371880883/AnsiballZ_file.py'
Jan 21 23:28:34 compute-1 sudo[101501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:35 compute-1 python3.9[101503]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:35 compute-1 sudo[101501]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:35 compute-1 sudo[101653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdmjitdqjjpnzkrymfhemxqqpuzpxxrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038115.2909653-960-120744105822591/AnsiballZ_stat.py'
Jan 21 23:28:35 compute-1 sudo[101653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:35 compute-1 python3.9[101655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:35 compute-1 sudo[101653]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:36 compute-1 sudo[101776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izfngcepptidpimozfrcnowfnazzecka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038115.2909653-960-120744105822591/AnsiballZ_copy.py'
Jan 21 23:28:36 compute-1 sudo[101776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:36 compute-1 python3.9[101778]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038115.2909653-960-120744105822591/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:36 compute-1 sudo[101776]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:37 compute-1 sudo[101928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcrnuujrvtbpdlbhizsjwhklyksvvazc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038116.8930457-1011-42770104905979/AnsiballZ_file.py'
Jan 21 23:28:37 compute-1 sudo[101928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:37 compute-1 python3.9[101930]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:37 compute-1 sudo[101928]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:37 compute-1 sudo[102080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-devdlxrzqzwxffaycsqamdcroedbsdky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038117.5735905-1035-95294990210116/AnsiballZ_file.py'
Jan 21 23:28:37 compute-1 sudo[102080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:38 compute-1 python3.9[102082]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:38 compute-1 sudo[102080]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:38 compute-1 sudo[102232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbfuaezeclmaemtlcwgnxtmdxehkxyxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038118.3216312-1059-126951299816065/AnsiballZ_stat.py'
Jan 21 23:28:38 compute-1 sudo[102232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:38 compute-1 python3.9[102234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:38 compute-1 sudo[102232]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:39 compute-1 sudo[102355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpxyvpnnotafyqbqadjlpbhybjwwzfpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038118.3216312-1059-126951299816065/AnsiballZ_copy.py'
Jan 21 23:28:39 compute-1 sudo[102355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:39 compute-1 python3.9[102357]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038118.3216312-1059-126951299816065/.source.json _original_basename=.k_s4k7s8 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:39 compute-1 sudo[102355]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:40 compute-1 python3.9[102507]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:42 compute-1 sudo[102928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffdlyvzotffeumnbzznyotipsyztvgix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038121.8934486-1179-103716949184603/AnsiballZ_container_config_data.py'
Jan 21 23:28:42 compute-1 sudo[102928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:42 compute-1 python3.9[102930]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 21 23:28:42 compute-1 sudo[102928]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:43 compute-1 sudo[103080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irncsxrymcrgumlfdvwtjqljjxyzkhqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038123.0173478-1212-188964526161552/AnsiballZ_container_config_hash.py'
Jan 21 23:28:43 compute-1 sudo[103080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:43 compute-1 python3.9[103082]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:28:44 compute-1 sudo[103080]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:44 compute-1 sudo[103232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzidwqpfnbptvgafeaixychybazogtup ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038124.3953564-1242-113618669659423/AnsiballZ_edpm_container_manage.py'
Jan 21 23:28:44 compute-1 sudo[103232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:45 compute-1 python3[103234]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:28:51 compute-1 podman[103291]: 2026-01-21 23:28:51.237069706 +0000 UTC m=+2.706027419 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 21 23:28:53 compute-1 podman[103247]: 2026-01-21 23:28:53.071768375 +0000 UTC m=+7.805811025 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:28:53 compute-1 podman[103368]: 2026-01-21 23:28:53.289998945 +0000 UTC m=+0.087677193 container create af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 23:28:53 compute-1 podman[103368]: 2026-01-21 23:28:53.250266297 +0000 UTC m=+0.047944575 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:28:53 compute-1 python3[103234]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:28:53 compute-1 sudo[103232]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:56 compute-1 sudo[103556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcxpcnhcpxcrvqnslmqgyyvtxsrnbduc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038136.3208714-1266-87106919521580/AnsiballZ_stat.py'
Jan 21 23:28:56 compute-1 sudo[103556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:56 compute-1 python3.9[103558]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:28:56 compute-1 sudo[103556]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:57 compute-1 sudo[103710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfgtxmkbtgovnnsurnwdnuyiyqjkxwxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038137.1833239-1293-15235328699817/AnsiballZ_file.py'
Jan 21 23:28:57 compute-1 sudo[103710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:57 compute-1 python3.9[103712]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:57 compute-1 sudo[103710]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:57 compute-1 sudo[103786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zccbtezhqikzlxdgzasodivlvioxnzzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038137.1833239-1293-15235328699817/AnsiballZ_stat.py'
Jan 21 23:28:57 compute-1 sudo[103786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:58 compute-1 python3.9[103788]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:28:58 compute-1 sudo[103786]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:58 compute-1 sudo[103937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaxwbactrwneukmtydakrxrdrvpfbsgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038138.2848253-1293-205112121200340/AnsiballZ_copy.py'
Jan 21 23:28:58 compute-1 sudo[103937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:59 compute-1 python3.9[103939]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038138.2848253-1293-205112121200340/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:59 compute-1 sudo[103937]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:59 compute-1 sudo[104013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvbehnzbdktztndpznmfrpxiojwihzli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038138.2848253-1293-205112121200340/AnsiballZ_systemd.py'
Jan 21 23:28:59 compute-1 sudo[104013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:59 compute-1 python3.9[104015]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:28:59 compute-1 systemd[1]: Reloading.
Jan 21 23:28:59 compute-1 systemd-rc-local-generator[104041]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:28:59 compute-1 systemd-sysv-generator[104044]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:28:59 compute-1 sudo[104013]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:00 compute-1 sudo[104123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sggmthsvkeenlggtkwwerfprsfomvvdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038138.2848253-1293-205112121200340/AnsiballZ_systemd.py'
Jan 21 23:29:00 compute-1 sudo[104123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:00 compute-1 python3.9[104125]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:00 compute-1 systemd[1]: Reloading.
Jan 21 23:29:00 compute-1 systemd-rc-local-generator[104152]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:29:00 compute-1 systemd-sysv-generator[104156]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:29:00 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Jan 21 23:29:00 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:29:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3ea5ce2fd097349e3ff929e500726a54d77039fb462a4c74fd93994452a6088/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 21 23:29:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3ea5ce2fd097349e3ff929e500726a54d77039fb462a4c74fd93994452a6088/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:29:00 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284.
Jan 21 23:29:00 compute-1 podman[104166]: 2026-01-21 23:29:00.969092829 +0000 UTC m=+0.157510179 container init af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 23:29:00 compute-1 ovn_metadata_agent[104179]: + sudo -E kolla_set_configs
Jan 21 23:29:01 compute-1 podman[104166]: 2026-01-21 23:29:01.00701641 +0000 UTC m=+0.195433720 container start af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:29:01 compute-1 edpm-start-podman-container[104166]: ovn_metadata_agent
Jan 21 23:29:01 compute-1 edpm-start-podman-container[104165]: Creating additional drop-in dependency for "ovn_metadata_agent" (af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284)
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Validating config file
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Copying service configuration files
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 21 23:29:01 compute-1 podman[104186]: 2026-01-21 23:29:01.092072001 +0000 UTC m=+0.068008150 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Writing out command to execute
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 21 23:29:01 compute-1 systemd[1]: Reloading.
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: ++ cat /run_command
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: + CMD=neutron-ovn-metadata-agent
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: + ARGS=
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: + sudo kolla_copy_cacerts
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: + [[ ! -n '' ]]
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: + . kolla_extend_start
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: Running command: 'neutron-ovn-metadata-agent'
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: + umask 0022
Jan 21 23:29:01 compute-1 ovn_metadata_agent[104179]: + exec neutron-ovn-metadata-agent
Jan 21 23:29:01 compute-1 systemd-rc-local-generator[104258]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:29:01 compute-1 systemd-sysv-generator[104261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:29:01 compute-1 systemd[1]: Started ovn_metadata_agent container.
Jan 21 23:29:01 compute-1 sudo[104123]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:02 compute-1 python3.9[104418]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.916 104184 INFO neutron.common.config [-] Logging enabled!
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.918 104184 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.918 104184 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.919 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.919 104184 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.919 104184 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.919 104184 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.920 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.920 104184 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.920 104184 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.920 104184 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.920 104184 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.920 104184 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.921 104184 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.921 104184 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.921 104184 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.921 104184 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.921 104184 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.921 104184 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.922 104184 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.922 104184 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.922 104184 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.922 104184 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.922 104184 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.922 104184 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.922 104184 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.922 104184 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.923 104184 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.923 104184 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.923 104184 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.923 104184 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.923 104184 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.923 104184 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.924 104184 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.924 104184 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.924 104184 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.924 104184 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.924 104184 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.924 104184 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.925 104184 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.925 104184 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.925 104184 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.925 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.925 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.926 104184 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.926 104184 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.926 104184 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.926 104184 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.926 104184 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.926 104184 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.927 104184 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.927 104184 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.927 104184 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.927 104184 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.927 104184 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.928 104184 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.928 104184 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.928 104184 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.928 104184 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.929 104184 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.929 104184 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.929 104184 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.929 104184 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.929 104184 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.930 104184 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.930 104184 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.930 104184 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.930 104184 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.930 104184 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.931 104184 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.931 104184 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.931 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.931 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.932 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.932 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.932 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.932 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.933 104184 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.933 104184 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.933 104184 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.933 104184 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.933 104184 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.933 104184 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.934 104184 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.934 104184 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.934 104184 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.934 104184 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.935 104184 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.935 104184 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.935 104184 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.935 104184 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.935 104184 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.935 104184 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.936 104184 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.936 104184 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.936 104184 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.936 104184 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.936 104184 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.937 104184 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.937 104184 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.937 104184 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.937 104184 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.937 104184 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.938 104184 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.938 104184 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.938 104184 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.938 104184 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.938 104184 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.939 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.939 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.939 104184 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.939 104184 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.940 104184 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.940 104184 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.940 104184 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.940 104184 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.940 104184 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.941 104184 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.941 104184 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.941 104184 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.941 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.941 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.942 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.942 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.942 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.942 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.942 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.943 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.943 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.943 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.943 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.943 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.944 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.944 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.944 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.944 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.944 104184 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.945 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.945 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.945 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.945 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.945 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.946 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.946 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.946 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.946 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.946 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.947 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.947 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.947 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.947 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.947 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.947 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.948 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.948 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.948 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.948 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.949 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.949 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.949 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.949 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.949 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.950 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.950 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.950 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.950 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.950 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.951 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.951 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.951 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.951 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.951 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.951 104184 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.952 104184 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.952 104184 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.952 104184 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.952 104184 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.953 104184 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.953 104184 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.953 104184 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.953 104184 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.953 104184 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.954 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.954 104184 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.954 104184 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.954 104184 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.954 104184 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.955 104184 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.955 104184 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.955 104184 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.955 104184 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.955 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.956 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.956 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.956 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.956 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.956 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.957 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.957 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.957 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.957 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.957 104184 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.957 104184 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.957 104184 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.958 104184 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.958 104184 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.958 104184 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.958 104184 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.958 104184 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.958 104184 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.959 104184 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.959 104184 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.959 104184 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.959 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.959 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.959 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.959 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.960 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.960 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.960 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.960 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.960 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.960 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.960 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.961 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.961 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.961 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.961 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.961 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.961 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.961 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.962 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.962 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.962 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.962 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.962 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.962 104184 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.962 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.963 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.963 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.963 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.963 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.963 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.963 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.963 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.964 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.964 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.964 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.964 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.964 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.964 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.964 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.965 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.965 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.965 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.965 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.965 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.965 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.966 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.966 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.966 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.966 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.966 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.966 104184 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.966 104184 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.967 104184 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.967 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.967 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.967 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.967 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.967 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.968 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.968 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.968 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.968 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.968 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.968 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.968 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.969 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.969 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.969 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.969 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.969 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.969 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.970 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.970 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.970 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.970 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.970 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.970 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.970 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.971 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.971 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.971 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.971 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.971 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.971 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.971 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.972 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.972 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.972 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.972 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.972 104184 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.972 104184 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.985 104184 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.985 104184 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.986 104184 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.986 104184 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 21 23:29:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:02.986 104184 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.004 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 74526b6d-b1ca-423f-9094-b845f8b97526 (UUID: 74526b6d-b1ca-423f-9094-b845f8b97526) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.033 104184 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.033 104184 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.033 104184 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.034 104184 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.037 104184 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.043 104184 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.050 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '74526b6d-b1ca-423f-9094-b845f8b97526'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], external_ids={}, name=74526b6d-b1ca-423f-9094-b845f8b97526, nb_cfg_timestamp=1769038076332, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.051 104184 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7efce36ebb80>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.052 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.052 104184 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.052 104184 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.053 104184 INFO oslo_service.service [-] Starting 1 workers
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.058 104184 DEBUG oslo_service.service [-] Started child 104443 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.062 104184 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp9keqn1j6/privsep.sock']
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.063 104443 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-1029008'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.097 104443 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.098 104443 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.099 104443 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.103 104443 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.111 104443 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.120 104443 INFO eventlet.wsgi.server [-] (104443) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 21 23:29:03 compute-1 sudo[104573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joxhcmyuafclknetbqiddbetazswbwkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038143.1285548-1428-85876761396399/AnsiballZ_stat.py'
Jan 21 23:29:03 compute-1 sudo[104573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:03 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 21 23:29:03 compute-1 python3.9[104575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:29:03 compute-1 sudo[104573]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.851 104184 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.851 104184 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp9keqn1j6/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.610 104576 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.618 104576 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.626 104576 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.626 104576 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104576
Jan 21 23:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:03.854 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[2d566cd7-8ef0-4d42-b31c-a4c98fc96832]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:29:04 compute-1 sudo[104703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swlypxtdrdldnmarwiwpbsbkvkgwnbax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038143.1285548-1428-85876761396399/AnsiballZ_copy.py'
Jan 21 23:29:04 compute-1 sudo[104703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:04 compute-1 python3.9[104705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038143.1285548-1428-85876761396399/.source.yaml _original_basename=.nrcvdb7v follow=False checksum=3944cba7eabd17aea9c4028b478ec4257c60bf08 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:04 compute-1 sudo[104703]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:04.562 104576 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:29:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:04.562 104576 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:29:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:04.562 104576 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:29:05 compute-1 sshd-session[95870]: Connection closed by 192.168.122.30 port 42850
Jan 21 23:29:05 compute-1 sshd-session[95867]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:29:05 compute-1 systemd[1]: session-22.scope: Deactivated successfully.
Jan 21 23:29:05 compute-1 systemd[1]: session-22.scope: Consumed 57.657s CPU time.
Jan 21 23:29:05 compute-1 systemd-logind[796]: Session 22 logged out. Waiting for processes to exit.
Jan 21 23:29:05 compute-1 systemd-logind[796]: Removed session 22.
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.174 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[599b34bb-b80d-4d23-b631-28342062e601]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.177 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, column=external_ids, values=({'neutron:ovn-metadata-id': 'f6c96c8b-5340-5463-b270-675a0981d821'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.203 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.210 104184 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.210 104184 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.211 104184 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.211 104184 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.211 104184 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.211 104184 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.211 104184 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.212 104184 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.212 104184 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.212 104184 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.212 104184 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.213 104184 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.213 104184 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.213 104184 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.213 104184 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.214 104184 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.214 104184 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.214 104184 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.214 104184 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.214 104184 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.215 104184 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.215 104184 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.215 104184 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.215 104184 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.216 104184 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.216 104184 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.216 104184 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.216 104184 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.217 104184 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.217 104184 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.217 104184 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.217 104184 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.218 104184 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.218 104184 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.218 104184 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.218 104184 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.219 104184 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.219 104184 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.219 104184 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.220 104184 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.220 104184 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.220 104184 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.220 104184 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.220 104184 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.221 104184 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.221 104184 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.221 104184 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.222 104184 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.222 104184 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.222 104184 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.223 104184 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.223 104184 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.223 104184 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.223 104184 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.223 104184 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.224 104184 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.224 104184 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.224 104184 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.224 104184 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.224 104184 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.225 104184 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.225 104184 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.225 104184 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.225 104184 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.226 104184 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.226 104184 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.226 104184 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.226 104184 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.227 104184 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.227 104184 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.227 104184 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.227 104184 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.227 104184 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.228 104184 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.228 104184 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.228 104184 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.228 104184 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.229 104184 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.229 104184 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.229 104184 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.229 104184 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.230 104184 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.230 104184 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.230 104184 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.230 104184 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.230 104184 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.231 104184 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.231 104184 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.231 104184 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.231 104184 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.231 104184 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.232 104184 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.232 104184 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.232 104184 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.232 104184 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.232 104184 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.233 104184 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.233 104184 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.233 104184 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.233 104184 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.234 104184 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.234 104184 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.234 104184 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.234 104184 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.234 104184 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.235 104184 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.235 104184 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.235 104184 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.235 104184 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.236 104184 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.236 104184 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.236 104184 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.236 104184 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.237 104184 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.237 104184 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.237 104184 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.237 104184 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.237 104184 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.238 104184 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.238 104184 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.238 104184 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.238 104184 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.239 104184 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.239 104184 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.239 104184 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.240 104184 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.240 104184 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.240 104184 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.240 104184 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.241 104184 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.241 104184 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.241 104184 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.241 104184 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.242 104184 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.242 104184 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.242 104184 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.242 104184 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.243 104184 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.243 104184 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.243 104184 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.243 104184 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.243 104184 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.244 104184 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.244 104184 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.244 104184 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.244 104184 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.244 104184 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.244 104184 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.244 104184 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.245 104184 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.245 104184 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.245 104184 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.245 104184 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.245 104184 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.245 104184 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.246 104184 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.246 104184 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.246 104184 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.246 104184 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.246 104184 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.246 104184 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.247 104184 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.247 104184 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.247 104184 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.247 104184 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.247 104184 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.247 104184 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.248 104184 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.248 104184 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.248 104184 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.248 104184 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.248 104184 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.249 104184 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.249 104184 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.249 104184 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.249 104184 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.249 104184 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.249 104184 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.250 104184 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.250 104184 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.250 104184 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.250 104184 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.250 104184 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.251 104184 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.251 104184 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.251 104184 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.252 104184 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.252 104184 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.252 104184 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.252 104184 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.252 104184 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.253 104184 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.253 104184 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.253 104184 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.253 104184 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.253 104184 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.254 104184 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.254 104184 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.254 104184 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.254 104184 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.254 104184 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.255 104184 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.255 104184 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.255 104184 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.255 104184 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.255 104184 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.255 104184 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.256 104184 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.256 104184 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.256 104184 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.256 104184 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.256 104184 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.256 104184 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.256 104184 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.257 104184 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.257 104184 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.257 104184 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.257 104184 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.257 104184 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.257 104184 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.258 104184 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.258 104184 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.258 104184 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.258 104184 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.258 104184 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.258 104184 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.258 104184 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.259 104184 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.259 104184 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.259 104184 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.259 104184 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.259 104184 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.259 104184 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.260 104184 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.260 104184 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.260 104184 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.260 104184 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.260 104184 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.260 104184 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.261 104184 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.261 104184 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.261 104184 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.261 104184 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.261 104184 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.261 104184 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.261 104184 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.262 104184 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.262 104184 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.262 104184 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.262 104184 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.262 104184 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.262 104184 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.263 104184 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.263 104184 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.263 104184 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.263 104184 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.263 104184 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.263 104184 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.263 104184 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.264 104184 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.264 104184 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.264 104184 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.264 104184 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.264 104184 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.264 104184 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.264 104184 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.265 104184 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.265 104184 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.265 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.265 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.265 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.265 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.266 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.266 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.266 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.266 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.266 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.266 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.267 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.267 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.267 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.267 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.267 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.267 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.267 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.268 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.268 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.268 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.268 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.268 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.268 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.268 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.269 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.269 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.269 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.269 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.269 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.269 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.270 104184 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.270 104184 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.270 104184 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.270 104184 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.270 104184 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:29:05.270 104184 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:29:10 compute-1 sshd-session[104731]: Accepted publickey for zuul from 192.168.122.30 port 53150 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:29:10 compute-1 systemd-logind[796]: New session 23 of user zuul.
Jan 21 23:29:10 compute-1 systemd[1]: Started Session 23 of User zuul.
Jan 21 23:29:10 compute-1 sshd-session[104731]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:29:11 compute-1 python3.9[104884]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:29:12 compute-1 sudo[105067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwiwxiytprokfklxylzhzpnswcqjryyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038152.3887901-63-92083853303224/AnsiballZ_command.py'
Jan 21 23:29:12 compute-1 sudo[105067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:13 compute-1 python3.9[105069]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:13 compute-1 sudo[105067]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:15 compute-1 sudo[105231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhlkyxxbdchfoshanywxtmwwnsoshnxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038154.3806794-96-232867745695737/AnsiballZ_systemd_service.py'
Jan 21 23:29:15 compute-1 sudo[105231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:15 compute-1 python3.9[105233]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:29:15 compute-1 systemd[1]: Reloading.
Jan 21 23:29:15 compute-1 systemd-sysv-generator[105263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:29:15 compute-1 systemd-rc-local-generator[105260]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:29:16 compute-1 sudo[105231]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:16 compute-1 python3.9[105417]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:29:16 compute-1 network[105434]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:29:16 compute-1 network[105435]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:29:16 compute-1 network[105436]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:29:21 compute-1 sudo[105695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-socrgikqvjvrhocoteshyllvqcongheh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038161.1620815-153-279400247907613/AnsiballZ_systemd_service.py'
Jan 21 23:29:21 compute-1 sudo[105695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:21 compute-1 python3.9[105697]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:21 compute-1 sudo[105695]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:22 compute-1 sudo[105848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmctfxeznzmptgedzogybvrtcwvyirft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038162.0681057-153-128755203690608/AnsiballZ_systemd_service.py'
Jan 21 23:29:22 compute-1 sudo[105848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:22 compute-1 python3.9[105850]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:22 compute-1 sudo[105848]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:22 compute-1 podman[105852]: 2026-01-21 23:29:22.83932897 +0000 UTC m=+0.109608487 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 23:29:23 compute-1 sudo[106027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxmuzwhcenbljppgepsqkbonylshdmlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038162.8979347-153-158148029397574/AnsiballZ_systemd_service.py'
Jan 21 23:29:23 compute-1 sudo[106027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:23 compute-1 python3.9[106029]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:24 compute-1 sudo[106027]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:25 compute-1 sudo[106180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lairzwqmcjkrsykruvglvmcauehgjqwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038164.7741714-153-89255087308070/AnsiballZ_systemd_service.py'
Jan 21 23:29:25 compute-1 sudo[106180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:25 compute-1 python3.9[106182]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:25 compute-1 sudo[106180]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:25 compute-1 sudo[106333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czavlkoikstziaqkmfbgdyjywatuivqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038165.635731-153-209069132759236/AnsiballZ_systemd_service.py'
Jan 21 23:29:25 compute-1 sudo[106333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:26 compute-1 python3.9[106335]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:26 compute-1 sudo[106333]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:26 compute-1 sudo[106486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcfpkzzdedpmccmnrciixlplciryfejy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038166.4690952-153-85312624699453/AnsiballZ_systemd_service.py'
Jan 21 23:29:26 compute-1 sudo[106486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:27 compute-1 python3.9[106488]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:27 compute-1 sudo[106486]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:27 compute-1 sudo[106639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgwsotyamgqwtupalyemcvmoszuutbym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038167.2865167-153-173215507164778/AnsiballZ_systemd_service.py'
Jan 21 23:29:27 compute-1 sudo[106639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:27 compute-1 python3.9[106641]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:27 compute-1 sudo[106639]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:29 compute-1 sudo[106792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eojpktfjupfasuokijmqcgyfiqxaleto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038169.2432184-309-70868777011040/AnsiballZ_file.py'
Jan 21 23:29:29 compute-1 sudo[106792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:29 compute-1 python3.9[106794]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:29 compute-1 sudo[106792]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:30 compute-1 sudo[106944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdzclklmlrhavlmawzdumejrksxrxwdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038170.094316-309-2932676734203/AnsiballZ_file.py'
Jan 21 23:29:30 compute-1 sudo[106944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:30 compute-1 python3.9[106946]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:30 compute-1 sudo[106944]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:31 compute-1 sudo[107096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlyfntgpbowemrvvmzwepehmoahghjzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038170.75329-309-84863730013628/AnsiballZ_file.py'
Jan 21 23:29:31 compute-1 sudo[107096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:31 compute-1 python3.9[107098]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:31 compute-1 sudo[107096]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:31 compute-1 podman[107165]: 2026-01-21 23:29:31.593878974 +0000 UTC m=+0.078240050 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 23:29:31 compute-1 sudo[107267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljuibljfmsqjmrdrmpzuesavkynfonoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038171.4369445-309-6932191622780/AnsiballZ_file.py'
Jan 21 23:29:31 compute-1 sudo[107267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:32 compute-1 python3.9[107269]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:32 compute-1 sudo[107267]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:32 compute-1 sudo[107419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlzfcvjylrkxulwdzipdabrrkxpbbeut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038172.226196-309-183637111568686/AnsiballZ_file.py'
Jan 21 23:29:32 compute-1 sudo[107419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:32 compute-1 python3.9[107421]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:32 compute-1 sudo[107419]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:33 compute-1 sudo[107571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrmgieqjfuffsuyitoprnlgjtuzftjmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038172.9255598-309-199466217565596/AnsiballZ_file.py'
Jan 21 23:29:33 compute-1 sudo[107571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:33 compute-1 python3.9[107573]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:33 compute-1 sudo[107571]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:34 compute-1 sudo[107723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kilgucudljgvmqamydvbqgugptzdfzqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038173.6590273-309-19345890675708/AnsiballZ_file.py'
Jan 21 23:29:34 compute-1 sudo[107723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:34 compute-1 python3.9[107725]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:34 compute-1 sudo[107723]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:35 compute-1 sudo[107875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nptphgbsljorpqjbnoxwpssixlrimzhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038174.770998-459-267938362333964/AnsiballZ_file.py'
Jan 21 23:29:35 compute-1 sudo[107875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:35 compute-1 python3.9[107877]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:35 compute-1 sudo[107875]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:35 compute-1 sudo[108027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wttvvyoyetprmojlrlmypfghejtwfjpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038175.4408205-459-131230529180284/AnsiballZ_file.py'
Jan 21 23:29:35 compute-1 sudo[108027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:35 compute-1 python3.9[108029]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:35 compute-1 sudo[108027]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:36 compute-1 sudo[108179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueqlaidbaybpjtyocxlcurjtgsummszl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038176.1584287-459-18910757874946/AnsiballZ_file.py'
Jan 21 23:29:36 compute-1 sudo[108179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:36 compute-1 python3.9[108181]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:36 compute-1 sudo[108179]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:37 compute-1 sudo[108331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nymlvpzjfnvxulaxduusabyujkcojrrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038176.8556972-459-57836198165759/AnsiballZ_file.py'
Jan 21 23:29:37 compute-1 sudo[108331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:37 compute-1 python3.9[108333]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:37 compute-1 sudo[108331]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:37 compute-1 sudo[108483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwfisjhgxegniiaqmrzhomygqsupyzzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038177.5339828-459-171304395333255/AnsiballZ_file.py'
Jan 21 23:29:37 compute-1 sudo[108483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:38 compute-1 python3.9[108485]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:38 compute-1 sudo[108483]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:38 compute-1 sudo[108635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjccduavkpxdrujnywwraghfadjsrgcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038178.1921947-459-208297715336270/AnsiballZ_file.py'
Jan 21 23:29:38 compute-1 sudo[108635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:38 compute-1 python3.9[108637]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:38 compute-1 sudo[108635]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:39 compute-1 sudo[108787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptbnokczofuaoidyzmfncqxycweahwgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038178.9303613-459-121846399399463/AnsiballZ_file.py'
Jan 21 23:29:39 compute-1 sudo[108787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:39 compute-1 python3.9[108789]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:39 compute-1 sudo[108787]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:40 compute-1 sudo[108939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aypbtjodltetsvzzunxjjksnttjvvdot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038180.034485-612-93051872639478/AnsiballZ_command.py'
Jan 21 23:29:40 compute-1 sudo[108939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:40 compute-1 python3.9[108941]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:40 compute-1 sudo[108939]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:41 compute-1 python3.9[109093]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:29:42 compute-1 sudo[109243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isgesudpxqdtbyrwvgtblosbajgepnky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038182.1159878-666-55033275982412/AnsiballZ_systemd_service.py'
Jan 21 23:29:42 compute-1 sudo[109243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:42 compute-1 python3.9[109245]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:29:42 compute-1 systemd[1]: Reloading.
Jan 21 23:29:42 compute-1 systemd-rc-local-generator[109267]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:29:42 compute-1 systemd-sysv-generator[109273]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:29:43 compute-1 sudo[109243]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:43 compute-1 sudo[109430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmllarfmtwabudsqhutliefpkmnzqbas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038183.3214118-690-74231396763340/AnsiballZ_command.py'
Jan 21 23:29:43 compute-1 sudo[109430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:43 compute-1 python3.9[109432]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:43 compute-1 sudo[109430]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:44 compute-1 sudo[109583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cekxzhwxepkepcwogonnqsecsamapwid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038184.0649016-690-33014775047910/AnsiballZ_command.py'
Jan 21 23:29:44 compute-1 sudo[109583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:44 compute-1 python3.9[109585]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:44 compute-1 sudo[109583]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:45 compute-1 sudo[109736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbcvsivpwnvescxnsinkcdhyrihjltpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038184.7558284-690-115217182508951/AnsiballZ_command.py'
Jan 21 23:29:45 compute-1 sudo[109736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:45 compute-1 python3.9[109738]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:45 compute-1 sudo[109736]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:45 compute-1 sudo[109889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqunbyikgdbwttzqkuphlkqhpjppnvyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038185.4227867-690-74906505535738/AnsiballZ_command.py'
Jan 21 23:29:45 compute-1 sudo[109889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:45 compute-1 python3.9[109891]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:45 compute-1 sudo[109889]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:46 compute-1 sudo[110042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqzokhhoktonaxtwrrgfvahizjfscpfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038186.320909-690-163198090952996/AnsiballZ_command.py'
Jan 21 23:29:46 compute-1 sudo[110042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:46 compute-1 python3.9[110044]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:46 compute-1 sudo[110042]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:47 compute-1 sudo[110195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygjqxvhfnczldqrbufoqxkrscqnqysqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038186.973556-690-174964578989949/AnsiballZ_command.py'
Jan 21 23:29:47 compute-1 sudo[110195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:47 compute-1 python3.9[110197]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:47 compute-1 sudo[110195]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:48 compute-1 sudo[110348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndlvbkesabgdgaprvsaxyaeqqneefiuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038187.7394614-690-88934454269689/AnsiballZ_command.py'
Jan 21 23:29:48 compute-1 sudo[110348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:48 compute-1 python3.9[110350]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:48 compute-1 sudo[110348]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:50 compute-1 sudo[110501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoiioetsgsaitfryqwrtxqsqvvsbqenw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038189.525161-852-230815634237121/AnsiballZ_getent.py'
Jan 21 23:29:50 compute-1 sudo[110501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:50 compute-1 python3.9[110503]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 21 23:29:50 compute-1 sudo[110501]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:50 compute-1 sudo[110654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eonwagnqpfmqlazgwkqfpygeliwuqfbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038190.487697-876-185872243191449/AnsiballZ_group.py'
Jan 21 23:29:50 compute-1 sudo[110654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:51 compute-1 python3.9[110656]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:29:51 compute-1 groupadd[110657]: group added to /etc/group: name=libvirt, GID=42473
Jan 21 23:29:51 compute-1 groupadd[110657]: group added to /etc/gshadow: name=libvirt
Jan 21 23:29:51 compute-1 groupadd[110657]: new group: name=libvirt, GID=42473
Jan 21 23:29:51 compute-1 sudo[110654]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:52 compute-1 sudo[110812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uougvhzkiqjsitxrpzltlymvnlxljttw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038191.6861203-900-88506973449093/AnsiballZ_user.py'
Jan 21 23:29:52 compute-1 sudo[110812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:52 compute-1 python3.9[110814]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 23:29:52 compute-1 useradd[110816]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 23:29:52 compute-1 sudo[110812]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:53 compute-1 sudo[110983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phnzzqvndsttpwomjnueyfntcmrmtlsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038193.0480368-933-280565334870326/AnsiballZ_setup.py'
Jan 21 23:29:53 compute-1 sudo[110983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:53 compute-1 podman[110946]: 2026-01-21 23:29:53.454345623 +0000 UTC m=+0.115796988 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 23:29:53 compute-1 python3.9[110989]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:29:53 compute-1 sudo[110983]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:54 compute-1 sudo[111082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhhvyxmfcrtsfmwnvklrwgopkohycutl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038193.0480368-933-280565334870326/AnsiballZ_dnf.py'
Jan 21 23:29:54 compute-1 sudo[111082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:54 compute-1 python3.9[111084]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:30:02 compute-1 podman[111096]: 2026-01-21 23:30:02.558052792 +0000 UTC m=+0.056630880 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:30:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:30:02.975 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:30:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:30:02.976 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:30:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:30:02.977 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:30:23 compute-1 podman[111296]: 2026-01-21 23:30:23.63175576 +0000 UTC m=+0.121961472 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:30:24 compute-1 kernel: SELinux:  Converting 2763 SID table entries...
Jan 21 23:30:24 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:30:24 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:30:24 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:30:24 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:30:24 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:30:24 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:30:24 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:30:33 compute-1 dbus-broker-launch[770]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 21 23:30:33 compute-1 podman[111330]: 2026-01-21 23:30:33.632024217 +0000 UTC m=+0.101346155 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:30:33 compute-1 kernel: SELinux:  Converting 2763 SID table entries...
Jan 21 23:30:33 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:30:33 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:30:33 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:30:33 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:30:33 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:30:33 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:30:33 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:30:54 compute-1 dbus-broker-launch[770]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 21 23:30:54 compute-1 podman[115825]: 2026-01-21 23:30:54.662221918 +0000 UTC m=+0.126411091 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:31:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:31:02.976 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:31:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:31:02.977 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:31:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:31:02.977 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:31:04 compute-1 podman[121494]: 2026-01-21 23:31:04.600710552 +0000 UTC m=+0.077841601 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 23:31:25 compute-1 podman[128260]: 2026-01-21 23:31:25.65986374 +0000 UTC m=+0.140369364 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 21 23:31:31 compute-1 kernel: SELinux:  Converting 2764 SID table entries...
Jan 21 23:31:31 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:31:31 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:31:31 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:31:31 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:31:31 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:31:31 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:31:31 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:31:32 compute-1 groupadd[128298]: group added to /etc/group: name=dnsmasq, GID=993
Jan 21 23:31:32 compute-1 groupadd[128298]: group added to /etc/gshadow: name=dnsmasq
Jan 21 23:31:32 compute-1 groupadd[128298]: new group: name=dnsmasq, GID=993
Jan 21 23:31:32 compute-1 useradd[128305]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 21 23:31:32 compute-1 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 21 23:31:32 compute-1 dbus-broker-launch[770]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 21 23:31:32 compute-1 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 21 23:31:33 compute-1 groupadd[128318]: group added to /etc/group: name=clevis, GID=992
Jan 21 23:31:33 compute-1 groupadd[128318]: group added to /etc/gshadow: name=clevis
Jan 21 23:31:33 compute-1 groupadd[128318]: new group: name=clevis, GID=992
Jan 21 23:31:33 compute-1 useradd[128325]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 21 23:31:33 compute-1 usermod[128335]: add 'clevis' to group 'tss'
Jan 21 23:31:33 compute-1 usermod[128335]: add 'clevis' to shadow group 'tss'
Jan 21 23:31:35 compute-1 podman[128359]: 2026-01-21 23:31:35.630625807 +0000 UTC m=+0.099546826 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:31:35 compute-1 polkitd[43406]: Reloading rules
Jan 21 23:31:35 compute-1 polkitd[43406]: Collecting garbage unconditionally...
Jan 21 23:31:35 compute-1 polkitd[43406]: Loading rules from directory /etc/polkit-1/rules.d
Jan 21 23:31:35 compute-1 polkitd[43406]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 21 23:31:35 compute-1 polkitd[43406]: Finished loading, compiling and executing 3 rules
Jan 21 23:31:35 compute-1 polkitd[43406]: Reloading rules
Jan 21 23:31:35 compute-1 polkitd[43406]: Collecting garbage unconditionally...
Jan 21 23:31:35 compute-1 polkitd[43406]: Loading rules from directory /etc/polkit-1/rules.d
Jan 21 23:31:35 compute-1 polkitd[43406]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 21 23:31:35 compute-1 polkitd[43406]: Finished loading, compiling and executing 3 rules
Jan 21 23:31:37 compute-1 groupadd[128546]: group added to /etc/group: name=ceph, GID=167
Jan 21 23:31:37 compute-1 groupadd[128546]: group added to /etc/gshadow: name=ceph
Jan 21 23:31:37 compute-1 groupadd[128546]: new group: name=ceph, GID=167
Jan 21 23:31:37 compute-1 useradd[128552]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 21 23:31:40 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Jan 21 23:31:40 compute-1 sshd[1004]: Received signal 15; terminating.
Jan 21 23:31:40 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Jan 21 23:31:40 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Jan 21 23:31:40 compute-1 systemd[1]: sshd.service: Consumed 1.518s CPU time, read 32.0K from disk, written 0B to disk.
Jan 21 23:31:40 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Jan 21 23:31:40 compute-1 systemd[1]: Stopping sshd-keygen.target...
Jan 21 23:31:40 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 23:31:40 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 23:31:40 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 23:31:40 compute-1 systemd[1]: Reached target sshd-keygen.target.
Jan 21 23:31:40 compute-1 systemd[1]: Starting OpenSSH server daemon...
Jan 21 23:31:40 compute-1 sshd[129071]: Server listening on 0.0.0.0 port 22.
Jan 21 23:31:40 compute-1 sshd[129071]: Server listening on :: port 22.
Jan 21 23:31:40 compute-1 systemd[1]: Started OpenSSH server daemon.
Jan 21 23:31:42 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:31:42 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:31:42 compute-1 systemd[1]: Reloading.
Jan 21 23:31:42 compute-1 systemd-rc-local-generator[129333]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:31:42 compute-1 systemd-sysv-generator[129336]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:31:43 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:31:46 compute-1 sudo[111082]: pam_unix(sudo:session): session closed for user root
Jan 21 23:31:51 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:31:51 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:31:51 compute-1 systemd[1]: man-db-cache-update.service: Consumed 11.767s CPU time.
Jan 21 23:31:51 compute-1 systemd[1]: run-rafbdd89b15c14bcdab14515165a2595d.service: Deactivated successfully.
Jan 21 23:31:56 compute-1 podman[137734]: 2026-01-21 23:31:56.633149367 +0000 UTC m=+0.117842505 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:32:00 compute-1 sudo[137886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujebbpmonsjrwlyoumbyleeckwlrsune ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038320.0106647-969-180946045860268/AnsiballZ_systemd.py'
Jan 21 23:32:00 compute-1 sudo[137886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:00 compute-1 python3.9[137888]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:32:01 compute-1 systemd[1]: Reloading.
Jan 21 23:32:01 compute-1 systemd-sysv-generator[137916]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:01 compute-1 systemd-rc-local-generator[137912]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:01 compute-1 sudo[137886]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:01 compute-1 sudo[138076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhyfbrbppaiczpawrkgakvjzppkqkucu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038321.4860916-969-146075757749687/AnsiballZ_systemd.py'
Jan 21 23:32:01 compute-1 sudo[138076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:02 compute-1 python3.9[138078]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:32:02 compute-1 systemd[1]: Reloading.
Jan 21 23:32:02 compute-1 systemd-rc-local-generator[138105]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:02 compute-1 systemd-sysv-generator[138112]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:02 compute-1 sudo[138076]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:32:02.977 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:32:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:32:02.978 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:32:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:32:02.978 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:32:03 compute-1 sudo[138266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfcnbdkwicoyedfpaxgqmenunaoqorhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038322.6735818-969-121158227646826/AnsiballZ_systemd.py'
Jan 21 23:32:03 compute-1 sudo[138266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:03 compute-1 python3.9[138268]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:32:03 compute-1 systemd[1]: Reloading.
Jan 21 23:32:03 compute-1 systemd-sysv-generator[138301]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:03 compute-1 systemd-rc-local-generator[138297]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:03 compute-1 sudo[138266]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:04 compute-1 sudo[138456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idmsxufeoipnlzattwkyyrkxlnqoxtek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038323.8704395-969-89066520500299/AnsiballZ_systemd.py'
Jan 21 23:32:04 compute-1 sudo[138456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:04 compute-1 python3.9[138458]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:32:05 compute-1 systemd[1]: Reloading.
Jan 21 23:32:05 compute-1 podman[138461]: 2026-01-21 23:32:05.78432521 +0000 UTC m=+0.078808174 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 23:32:05 compute-1 systemd-rc-local-generator[138505]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:05 compute-1 systemd-sysv-generator[138510]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:06 compute-1 sudo[138456]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:06 compute-1 sudo[138664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddbojypkwmqwlmizjpokwfzneeoymsus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038326.2539587-1056-221025607318695/AnsiballZ_systemd.py'
Jan 21 23:32:06 compute-1 sudo[138664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:06 compute-1 python3.9[138666]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:07 compute-1 systemd[1]: Reloading.
Jan 21 23:32:07 compute-1 systemd-rc-local-generator[138697]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:07 compute-1 systemd-sysv-generator[138700]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:07 compute-1 sudo[138664]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:08 compute-1 sudo[138853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jngpahqgvgejfdjopbnsblavmzxiuhqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038327.6458085-1056-259507074234223/AnsiballZ_systemd.py'
Jan 21 23:32:08 compute-1 sudo[138853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:08 compute-1 python3.9[138855]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:08 compute-1 systemd[1]: Reloading.
Jan 21 23:32:08 compute-1 systemd-rc-local-generator[138882]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:08 compute-1 systemd-sysv-generator[138887]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:08 compute-1 sudo[138853]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:09 compute-1 sudo[139043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deoofmrwqlvoarjysmocyvnhmjekujdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038328.9945815-1056-261908969029071/AnsiballZ_systemd.py'
Jan 21 23:32:09 compute-1 sudo[139043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:09 compute-1 python3.9[139045]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:09 compute-1 systemd[1]: Reloading.
Jan 21 23:32:09 compute-1 systemd-sysv-generator[139079]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:09 compute-1 systemd-rc-local-generator[139076]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:10 compute-1 sudo[139043]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:10 compute-1 sudo[139233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moqgrwdtwssolujremzzemrdbpukuesy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038330.2360327-1056-195591742957529/AnsiballZ_systemd.py'
Jan 21 23:32:10 compute-1 sudo[139233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:10 compute-1 python3.9[139235]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:11 compute-1 sudo[139233]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:11 compute-1 sudo[139388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwxazspmnuayrajalyjkuckonwlippuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038331.2658587-1056-242761504977509/AnsiballZ_systemd.py'
Jan 21 23:32:11 compute-1 sudo[139388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:11 compute-1 python3.9[139390]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:12 compute-1 systemd[1]: Reloading.
Jan 21 23:32:12 compute-1 systemd-rc-local-generator[139420]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:12 compute-1 systemd-sysv-generator[139426]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:12 compute-1 sudo[139388]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:13 compute-1 sudo[139578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohvivjwvtzokfbzsqpuquseqznpojcpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038333.5414474-1164-192972939586547/AnsiballZ_systemd.py'
Jan 21 23:32:13 compute-1 sudo[139578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:14 compute-1 python3.9[139580]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:32:14 compute-1 systemd[1]: Reloading.
Jan 21 23:32:14 compute-1 systemd-rc-local-generator[139612]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:14 compute-1 systemd-sysv-generator[139616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:14 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 21 23:32:14 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 21 23:32:14 compute-1 sudo[139578]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:15 compute-1 sudo[139772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fabkrijfhlztyitbnoyabomnjalypegx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038334.9672625-1188-189705307845585/AnsiballZ_systemd.py'
Jan 21 23:32:15 compute-1 sudo[139772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:15 compute-1 python3.9[139774]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:15 compute-1 sudo[139772]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:16 compute-1 sudo[139927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-surcfmqxyycpcieawoirnitgmfevhoyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038336.0340137-1188-219741002326200/AnsiballZ_systemd.py'
Jan 21 23:32:16 compute-1 sudo[139927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:16 compute-1 python3.9[139929]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:16 compute-1 sudo[139927]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:17 compute-1 sudo[140082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyenkumcdkscqpnxvkfdegnalrlfrlxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038336.9522574-1188-199916837302647/AnsiballZ_systemd.py'
Jan 21 23:32:17 compute-1 sudo[140082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:17 compute-1 python3.9[140084]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:17 compute-1 sudo[140082]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:18 compute-1 sudo[140237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nejlsjnjrxfgovivjowbrmxtqarewcjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038337.841449-1188-80868911254405/AnsiballZ_systemd.py'
Jan 21 23:32:18 compute-1 sudo[140237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:18 compute-1 python3.9[140239]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:18 compute-1 sudo[140237]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:19 compute-1 sudo[140392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uteprbazqaywxgnzqbaidxtwiewemewu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038338.7675352-1188-227455596879972/AnsiballZ_systemd.py'
Jan 21 23:32:19 compute-1 sudo[140392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:19 compute-1 python3.9[140394]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:19 compute-1 sudo[140392]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:20 compute-1 sudo[140547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gctekzicwxqkhiwrquopynyjvnvuoaam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038339.7734115-1188-25945111223201/AnsiballZ_systemd.py'
Jan 21 23:32:20 compute-1 sudo[140547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:20 compute-1 python3.9[140549]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:20 compute-1 sudo[140547]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:21 compute-1 sudo[140702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpzsyalkgqsnjlfucgndftqickgugxjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038340.6614037-1188-111303009712869/AnsiballZ_systemd.py'
Jan 21 23:32:21 compute-1 sudo[140702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:21 compute-1 python3.9[140704]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:21 compute-1 sudo[140702]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:21 compute-1 sudo[140857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqycqvrgbxrrjatcotrpkzfvjpggjyub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038341.601001-1188-152306678337723/AnsiballZ_systemd.py'
Jan 21 23:32:21 compute-1 sudo[140857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:22 compute-1 python3.9[140859]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:22 compute-1 sudo[140857]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:22 compute-1 sudo[141012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgabfyhymcvkgnrifakdpxekghswpyur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038342.5838985-1188-21588953784896/AnsiballZ_systemd.py'
Jan 21 23:32:22 compute-1 sudo[141012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:23 compute-1 python3.9[141014]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:23 compute-1 sudo[141012]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:23 compute-1 sudo[141167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnlivsfaljmnowryjgksveqliiqgidvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038343.5045485-1188-228283336579241/AnsiballZ_systemd.py'
Jan 21 23:32:23 compute-1 sudo[141167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:24 compute-1 python3.9[141169]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:24 compute-1 sudo[141167]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:24 compute-1 sudo[141322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfyswcfoakulmxciapjeazotayfsktav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038344.5015166-1188-195445332629594/AnsiballZ_systemd.py'
Jan 21 23:32:24 compute-1 sudo[141322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:25 compute-1 python3.9[141324]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:26 compute-1 sudo[141322]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:26 compute-1 sudo[141487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcibacwmewnuidkdcktkinlcxhurgnwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038346.4358394-1188-192880278969586/AnsiballZ_systemd.py'
Jan 21 23:32:26 compute-1 sudo[141487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:26 compute-1 podman[141451]: 2026-01-21 23:32:26.918833446 +0000 UTC m=+0.126424687 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 23:32:27 compute-1 python3.9[141491]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:27 compute-1 sudo[141487]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:27 compute-1 sudo[141655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfxlvnrvurplhvvanukjgplmupzhsuos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038347.3775-1188-258665724000821/AnsiballZ_systemd.py'
Jan 21 23:32:27 compute-1 sudo[141655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:27 compute-1 python3.9[141657]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:28 compute-1 sudo[141655]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:28 compute-1 sudo[141810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnduwxivnyejcuesppuoakuxqcdsnehu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038348.1991262-1188-51302447570416/AnsiballZ_systemd.py'
Jan 21 23:32:28 compute-1 sudo[141810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:28 compute-1 python3.9[141812]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:29 compute-1 sudo[141810]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:29 compute-1 sudo[141965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhcbaqkygbzmbjlwfhajkmskrekqslkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038349.599088-1494-129765365019287/AnsiballZ_file.py'
Jan 21 23:32:29 compute-1 sudo[141965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:30 compute-1 python3.9[141967]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:30 compute-1 sudo[141965]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:30 compute-1 sudo[142117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmfdpghcaxfcbykonmcxansrmtiaebrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038350.435004-1494-237520831847939/AnsiballZ_file.py'
Jan 21 23:32:30 compute-1 sudo[142117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:30 compute-1 python3.9[142119]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:30 compute-1 sudo[142117]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:31 compute-1 sudo[142269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnrbjvcgfvimtbzmmjeounytizmplwme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038351.1637478-1494-104394926298407/AnsiballZ_file.py'
Jan 21 23:32:31 compute-1 sudo[142269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:31 compute-1 python3.9[142271]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:31 compute-1 sudo[142269]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:32 compute-1 sudo[142421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqagpmsnupnhrhdyzxizjjlcnwuihlmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038351.9442117-1494-143774272515163/AnsiballZ_file.py'
Jan 21 23:32:32 compute-1 sudo[142421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:32 compute-1 python3.9[142423]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:32 compute-1 sudo[142421]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:32 compute-1 sudo[142573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mktkwdqstoeudynsjfqrsjejaxwzscns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038352.6708825-1494-105448632653038/AnsiballZ_file.py'
Jan 21 23:32:32 compute-1 sudo[142573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:33 compute-1 python3.9[142575]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:33 compute-1 sudo[142573]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:33 compute-1 sudo[142725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unxyjyzlpterbavndxxakwilxszpvdgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038353.344369-1494-107634573136364/AnsiballZ_file.py'
Jan 21 23:32:33 compute-1 sudo[142725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:33 compute-1 python3.9[142727]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:33 compute-1 sudo[142725]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:34 compute-1 python3.9[142877]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:32:35 compute-1 sudo[143027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crodvbkkpkduczpgpisvyntulafozicg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038355.1761265-1647-35339263583893/AnsiballZ_stat.py'
Jan 21 23:32:35 compute-1 sudo[143027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:35 compute-1 python3.9[143029]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:35 compute-1 sudo[143027]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:36 compute-1 podman[143120]: 2026-01-21 23:32:36.611299008 +0000 UTC m=+0.090964397 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 23:32:36 compute-1 sudo[143170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytdhwaosewrayotbbqxhurdjvgslhhnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038355.1761265-1647-35339263583893/AnsiballZ_copy.py'
Jan 21 23:32:36 compute-1 sudo[143170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:36 compute-1 python3.9[143172]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038355.1761265-1647-35339263583893/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:36 compute-1 sudo[143170]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:37 compute-1 sudo[143323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqtxslpqkwmnkxlwsstpezszxixezidx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038357.0320284-1647-118869376598379/AnsiballZ_stat.py'
Jan 21 23:32:37 compute-1 sudo[143323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:37 compute-1 python3.9[143325]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:37 compute-1 sudo[143323]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:38 compute-1 sudo[143448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enomyfdztpsnfccstrrdznpypyaeghqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038357.0320284-1647-118869376598379/AnsiballZ_copy.py'
Jan 21 23:32:38 compute-1 sudo[143448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:38 compute-1 python3.9[143450]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038357.0320284-1647-118869376598379/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:38 compute-1 sudo[143448]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:38 compute-1 sudo[143600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwbintwrdlbkxoykfedrtebgiecdgcjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038358.4380393-1647-268332376779365/AnsiballZ_stat.py'
Jan 21 23:32:38 compute-1 sudo[143600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:39 compute-1 python3.9[143602]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:39 compute-1 sudo[143600]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:39 compute-1 sudo[143726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqsfgjwxjpdszjkhcarqnlkxeakdoocd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038358.4380393-1647-268332376779365/AnsiballZ_copy.py'
Jan 21 23:32:39 compute-1 sudo[143726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:39 compute-1 python3.9[143729]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038358.4380393-1647-268332376779365/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:39 compute-1 sudo[143726]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:40 compute-1 sudo[143879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhuecoohtqyignldcbevvzgwadirtbso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038359.9547515-1647-97828545978469/AnsiballZ_stat.py'
Jan 21 23:32:40 compute-1 sudo[143879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:40 compute-1 python3.9[143881]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:40 compute-1 sudo[143879]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:40 compute-1 sudo[144004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxpzlnaqhlvliglsrhhofhojqbonbrvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038359.9547515-1647-97828545978469/AnsiballZ_copy.py'
Jan 21 23:32:40 compute-1 sudo[144004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:41 compute-1 python3.9[144006]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038359.9547515-1647-97828545978469/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:41 compute-1 sudo[144004]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:41 compute-1 sshd-session[143725]: Received disconnect from 203.83.238.251 port 43844:11:  [preauth]
Jan 21 23:32:41 compute-1 sshd-session[143725]: Disconnected from authenticating user root 203.83.238.251 port 43844 [preauth]
Jan 21 23:32:41 compute-1 sudo[144156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfzpwlrcnnatusksktbnwfdypsupncty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038361.3002234-1647-27216204509822/AnsiballZ_stat.py'
Jan 21 23:32:41 compute-1 sudo[144156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:41 compute-1 python3.9[144158]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:41 compute-1 sudo[144156]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:42 compute-1 sudo[144281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brlpwrotqkknsgqqensqcrcydbznywig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038361.3002234-1647-27216204509822/AnsiballZ_copy.py'
Jan 21 23:32:42 compute-1 sudo[144281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:42 compute-1 python3.9[144283]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038361.3002234-1647-27216204509822/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:42 compute-1 sudo[144281]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:42 compute-1 sudo[144433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heinmstbuvvavqiigfdvbyajpmfxwplj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038362.6467469-1647-111471648175984/AnsiballZ_stat.py'
Jan 21 23:32:42 compute-1 sudo[144433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:43 compute-1 python3.9[144435]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:43 compute-1 sudo[144433]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:43 compute-1 sudo[144558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nekqjoxumamkujxlzfsqijzlkewxmmms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038362.6467469-1647-111471648175984/AnsiballZ_copy.py'
Jan 21 23:32:43 compute-1 sudo[144558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:43 compute-1 python3.9[144560]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038362.6467469-1647-111471648175984/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:43 compute-1 sudo[144558]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:44 compute-1 sudo[144710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqxaokdrplritsqejrudqtmabacgysyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038363.949034-1647-136894929923119/AnsiballZ_stat.py'
Jan 21 23:32:44 compute-1 sudo[144710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:44 compute-1 python3.9[144712]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:44 compute-1 sudo[144710]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:44 compute-1 sudo[144833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqguepxeexsikdfpgxxfzbgtyeqsodjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038363.949034-1647-136894929923119/AnsiballZ_copy.py'
Jan 21 23:32:44 compute-1 sudo[144833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:45 compute-1 python3.9[144835]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038363.949034-1647-136894929923119/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:45 compute-1 sudo[144833]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:45 compute-1 sudo[144985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rguvypkarizjabszcydkaykdmmyxbspv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038365.3544788-1647-232430673598924/AnsiballZ_stat.py'
Jan 21 23:32:45 compute-1 sudo[144985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:45 compute-1 python3.9[144987]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:45 compute-1 sudo[144985]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:46 compute-1 sudo[145110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bypdrsouhppekzilhmrowdcydujllluu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038365.3544788-1647-232430673598924/AnsiballZ_copy.py'
Jan 21 23:32:46 compute-1 sudo[145110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:46 compute-1 python3.9[145112]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038365.3544788-1647-232430673598924/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:46 compute-1 sudo[145110]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:47 compute-1 sudo[145262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zepgocfemmiapggcqdpzasvturonlwif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038366.8439023-1986-113136109605566/AnsiballZ_command.py'
Jan 21 23:32:47 compute-1 sudo[145262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:47 compute-1 python3.9[145264]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 21 23:32:47 compute-1 sudo[145262]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:48 compute-1 sudo[145415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntmobggvihrbtnrjwqkzvpftpxhkdrgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038367.7158322-2013-111797066611145/AnsiballZ_file.py'
Jan 21 23:32:48 compute-1 sudo[145415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:48 compute-1 python3.9[145417]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:48 compute-1 sudo[145415]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:48 compute-1 sudo[145567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvqpbvuymtvvssbjmvzcnijsjgfdqtlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038368.4589093-2013-93645252680471/AnsiballZ_file.py'
Jan 21 23:32:48 compute-1 sudo[145567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:49 compute-1 python3.9[145569]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:49 compute-1 sudo[145567]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:49 compute-1 sudo[145719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icvlfwockkiujhofsgzscgsszfbeiriu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038369.1730964-2013-247339242788036/AnsiballZ_file.py'
Jan 21 23:32:49 compute-1 sudo[145719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:49 compute-1 python3.9[145721]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:49 compute-1 sudo[145719]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:50 compute-1 sudo[145871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvzqkwuxduvpqtbktltlhxavqgjsmmwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038369.9696608-2013-176839212724421/AnsiballZ_file.py'
Jan 21 23:32:50 compute-1 sudo[145871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:50 compute-1 python3.9[145873]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:50 compute-1 sudo[145871]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:51 compute-1 sudo[146023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bexnslgiduamphizgvhyuqkdczauniwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038370.7377102-2013-154305737083134/AnsiballZ_file.py'
Jan 21 23:32:51 compute-1 sudo[146023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:51 compute-1 python3.9[146025]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:51 compute-1 sudo[146023]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:51 compute-1 sudo[146175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqexznytrmkkimwckblubhnevounztwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038371.4923306-2013-95867598646387/AnsiballZ_file.py'
Jan 21 23:32:51 compute-1 sudo[146175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:51 compute-1 python3.9[146177]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:52 compute-1 sudo[146175]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:52 compute-1 sudo[146327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmumprcktdynsgvyvnwfpgbaitcdrhum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038372.158907-2013-73946203259690/AnsiballZ_file.py'
Jan 21 23:32:52 compute-1 sudo[146327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:52 compute-1 python3.9[146329]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:52 compute-1 sudo[146327]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:53 compute-1 sudo[146479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ronyqrgnefedhjhqhbcdkftcvavnknnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038372.910065-2013-11043880090290/AnsiballZ_file.py'
Jan 21 23:32:53 compute-1 sudo[146479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:53 compute-1 python3.9[146481]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:53 compute-1 sudo[146479]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:54 compute-1 sudo[146631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xukbghtdlycrsrocvqgddbfriwrsrlov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038373.669703-2013-17750989337880/AnsiballZ_file.py'
Jan 21 23:32:54 compute-1 sudo[146631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:54 compute-1 python3.9[146633]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:54 compute-1 sudo[146631]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:54 compute-1 sudo[146783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwnnrbdvfdqbsuorrpkichbomrciuawm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038374.438254-2013-64694746579959/AnsiballZ_file.py'
Jan 21 23:32:54 compute-1 sudo[146783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:55 compute-1 python3.9[146785]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:55 compute-1 sudo[146783]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:55 compute-1 sudo[146935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvlijvaaatniqgaushosuseyhuybjief ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038375.1852818-2013-187591537967593/AnsiballZ_file.py'
Jan 21 23:32:55 compute-1 sudo[146935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:55 compute-1 python3.9[146937]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:55 compute-1 sudo[146935]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:56 compute-1 sudo[147087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbsdhlgflqarxqzdoopqezeztmbnijns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038375.9728377-2013-35598143252903/AnsiballZ_file.py'
Jan 21 23:32:56 compute-1 sudo[147087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:56 compute-1 python3.9[147089]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:56 compute-1 sudo[147087]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:57 compute-1 sudo[147250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksdmdtikejtryxigfvvetplenbwuimey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038376.671467-2013-56930160149221/AnsiballZ_file.py'
Jan 21 23:32:57 compute-1 sudo[147250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:57 compute-1 podman[147213]: 2026-01-21 23:32:57.089007717 +0000 UTC m=+0.115921304 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 23:32:57 compute-1 python3.9[147260]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:57 compute-1 sudo[147250]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:57 compute-1 sudo[147418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxuowvvgqidhedswxucosmrcvxpqcdvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038377.421681-2013-131892194357160/AnsiballZ_file.py'
Jan 21 23:32:57 compute-1 sudo[147418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:58 compute-1 python3.9[147420]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:58 compute-1 sudo[147418]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:58 compute-1 sudo[147570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzldoxrykfnvqnjhtoefwmchpbjpqqfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038378.6959956-2310-168713429315760/AnsiballZ_stat.py'
Jan 21 23:32:58 compute-1 sudo[147570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:59 compute-1 python3.9[147572]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:59 compute-1 sudo[147570]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:59 compute-1 sudo[147693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buajfujlfcrypxhlfpvmxzrjzwusshvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038378.6959956-2310-168713429315760/AnsiballZ_copy.py'
Jan 21 23:32:59 compute-1 sudo[147693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:59 compute-1 python3.9[147695]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038378.6959956-2310-168713429315760/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:59 compute-1 sudo[147693]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:00 compute-1 sudo[147845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pznlujgyqdpbcciursksetuaepfjscal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038379.97293-2310-237330530392930/AnsiballZ_stat.py'
Jan 21 23:33:00 compute-1 sudo[147845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:00 compute-1 python3.9[147847]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:00 compute-1 sudo[147845]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:00 compute-1 sudo[147968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkmbdihmtthxprlgnmgrlsdwictxyjxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038379.97293-2310-237330530392930/AnsiballZ_copy.py'
Jan 21 23:33:00 compute-1 sudo[147968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:01 compute-1 python3.9[147970]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038379.97293-2310-237330530392930/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:01 compute-1 sudo[147968]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:01 compute-1 sudo[148120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwhfkqyztgzrdvbcylsmrpajcuowpjro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038381.2485614-2310-93100611668926/AnsiballZ_stat.py'
Jan 21 23:33:01 compute-1 sudo[148120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:01 compute-1 python3.9[148122]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:01 compute-1 sudo[148120]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:02 compute-1 sudo[148243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttwzfyqazbrziamjozgmjnnitndhzhdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038381.2485614-2310-93100611668926/AnsiballZ_copy.py'
Jan 21 23:33:02 compute-1 sudo[148243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:02 compute-1 python3.9[148245]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038381.2485614-2310-93100611668926/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:02 compute-1 sudo[148243]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:33:02.978 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:33:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:33:02.978 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:33:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:33:02.979 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:33:02 compute-1 sudo[148395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txumzqrlbembbrghpjzocrlhhckkjiiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038382.70008-2310-86446347350188/AnsiballZ_stat.py'
Jan 21 23:33:02 compute-1 sudo[148395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:03 compute-1 python3.9[148397]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:03 compute-1 sudo[148395]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:03 compute-1 sudo[148518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvzmvehpdkgjsugusfzuheakwilnxblo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038382.70008-2310-86446347350188/AnsiballZ_copy.py'
Jan 21 23:33:03 compute-1 sudo[148518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:03 compute-1 python3.9[148520]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038382.70008-2310-86446347350188/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:03 compute-1 sudo[148518]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:04 compute-1 sudo[148670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqwxlxyvknsdhgkoxvbitzmmkzqjalqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038384.071793-2310-36722108564611/AnsiballZ_stat.py'
Jan 21 23:33:04 compute-1 sudo[148670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:04 compute-1 python3.9[148672]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:04 compute-1 sudo[148670]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:05 compute-1 sudo[148793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfyhkmucnbiiyvbukzsckmktctpnlfvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038384.071793-2310-36722108564611/AnsiballZ_copy.py'
Jan 21 23:33:05 compute-1 sudo[148793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:05 compute-1 python3.9[148795]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038384.071793-2310-36722108564611/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:05 compute-1 sudo[148793]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:05 compute-1 sudo[148945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbxzotynmmmpsqrbofrzsugrhiojhnhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038385.5154908-2310-194298351877436/AnsiballZ_stat.py'
Jan 21 23:33:05 compute-1 sudo[148945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:06 compute-1 python3.9[148947]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:06 compute-1 sudo[148945]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:06 compute-1 sudo[149068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-somzgeyesgmtugajcslmzxzytaglhrfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038385.5154908-2310-194298351877436/AnsiballZ_copy.py'
Jan 21 23:33:06 compute-1 sudo[149068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:06 compute-1 python3.9[149070]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038385.5154908-2310-194298351877436/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:06 compute-1 sudo[149068]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:07 compute-1 sudo[149237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dheixdtsyghexoiuvoatepnbzsowwyrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038386.868646-2310-141565162352166/AnsiballZ_stat.py'
Jan 21 23:33:07 compute-1 sudo[149237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:07 compute-1 podman[149194]: 2026-01-21 23:33:07.211383841 +0000 UTC m=+0.061203293 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:33:07 compute-1 python3.9[149241]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:07 compute-1 sudo[149237]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:07 compute-1 sudo[149363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auzczqmsdhwtsmqtrmoybahobhlpazjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038386.868646-2310-141565162352166/AnsiballZ_copy.py'
Jan 21 23:33:07 compute-1 sudo[149363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:08 compute-1 python3.9[149365]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038386.868646-2310-141565162352166/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:08 compute-1 sudo[149363]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:08 compute-1 sudo[149515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lykkzdmtxyczyjhznnjbvnbwywpymwxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038388.2279735-2310-56996415060108/AnsiballZ_stat.py'
Jan 21 23:33:08 compute-1 sudo[149515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:08 compute-1 python3.9[149517]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:08 compute-1 sudo[149515]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:09 compute-1 sudo[149638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqjyrohnszcbyvslobsninysajitswfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038388.2279735-2310-56996415060108/AnsiballZ_copy.py'
Jan 21 23:33:09 compute-1 sudo[149638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:09 compute-1 python3.9[149640]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038388.2279735-2310-56996415060108/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:09 compute-1 sudo[149638]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:09 compute-1 sudo[149790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpciqpssonfaufjrznounyqbrygyaelm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038389.5073235-2310-224345085049002/AnsiballZ_stat.py'
Jan 21 23:33:09 compute-1 sudo[149790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:10 compute-1 python3.9[149792]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:10 compute-1 sudo[149790]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:10 compute-1 sudo[149913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alguzegxomqvbekuczrymkanoapywdir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038389.5073235-2310-224345085049002/AnsiballZ_copy.py'
Jan 21 23:33:10 compute-1 sudo[149913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:10 compute-1 python3.9[149915]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038389.5073235-2310-224345085049002/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:10 compute-1 sudo[149913]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:11 compute-1 sudo[150065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzfgxtbphdfywfbcwodpfrhpcnitjday ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038390.8820562-2310-135327537744153/AnsiballZ_stat.py'
Jan 21 23:33:11 compute-1 sudo[150065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:11 compute-1 python3.9[150067]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:11 compute-1 sudo[150065]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:11 compute-1 sudo[150188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkcgqbnqargsnawlpscirhcpuygycjiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038390.8820562-2310-135327537744153/AnsiballZ_copy.py'
Jan 21 23:33:11 compute-1 sudo[150188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:12 compute-1 python3.9[150190]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038390.8820562-2310-135327537744153/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:12 compute-1 sudo[150188]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:12 compute-1 sudo[150340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mugnavpxsfdlscmpdatvpnjsuabkopbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038392.26501-2310-185326754465904/AnsiballZ_stat.py'
Jan 21 23:33:12 compute-1 sudo[150340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:12 compute-1 python3.9[150342]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:12 compute-1 sudo[150340]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:13 compute-1 sudo[150463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaevryaqzphabjpjdgmzsnbvpyfyxypq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038392.26501-2310-185326754465904/AnsiballZ_copy.py'
Jan 21 23:33:13 compute-1 sudo[150463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:13 compute-1 python3.9[150465]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038392.26501-2310-185326754465904/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:13 compute-1 sudo[150463]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:14 compute-1 sudo[150615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxobocbmlgdrtysdzaqxssrzuplkwnft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038393.7553403-2310-206013300075544/AnsiballZ_stat.py'
Jan 21 23:33:14 compute-1 sudo[150615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:14 compute-1 python3.9[150617]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:14 compute-1 sudo[150615]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:14 compute-1 sudo[150738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siadsycgazaqlawrptzscyrzzkfnfidt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038393.7553403-2310-206013300075544/AnsiballZ_copy.py'
Jan 21 23:33:14 compute-1 sudo[150738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:14 compute-1 python3.9[150740]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038393.7553403-2310-206013300075544/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:14 compute-1 sudo[150738]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:15 compute-1 sudo[150890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mprcmfuxjvwkmkppwlttblrecayjvwsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038395.0744226-2310-122850506335618/AnsiballZ_stat.py'
Jan 21 23:33:15 compute-1 sudo[150890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:15 compute-1 python3.9[150892]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:15 compute-1 sudo[150890]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:15 compute-1 sudo[151013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqnlqowliknqegbtsrtfjtafdubbafhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038395.0744226-2310-122850506335618/AnsiballZ_copy.py'
Jan 21 23:33:15 compute-1 sudo[151013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:16 compute-1 python3.9[151015]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038395.0744226-2310-122850506335618/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:16 compute-1 sudo[151013]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:16 compute-1 sudo[151165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwhcbgqqavbvqykjjtchdkklzrteketa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038396.40073-2310-36920463396328/AnsiballZ_stat.py'
Jan 21 23:33:16 compute-1 sudo[151165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:16 compute-1 python3.9[151167]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:16 compute-1 sudo[151165]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:17 compute-1 sudo[151288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofymjsyozbgepngpppconkzcyxhjzpmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038396.40073-2310-36920463396328/AnsiballZ_copy.py'
Jan 21 23:33:17 compute-1 sudo[151288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:17 compute-1 python3.9[151290]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038396.40073-2310-36920463396328/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:17 compute-1 sudo[151288]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:18 compute-1 python3.9[151440]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:33:19 compute-1 sudo[151593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xusrrizwerkdkhwwpwnsfbbrylhiqfss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038398.740399-2928-279500719286482/AnsiballZ_seboolean.py'
Jan 21 23:33:19 compute-1 sudo[151593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:19 compute-1 python3.9[151595]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 21 23:33:20 compute-1 sudo[151593]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:21 compute-1 sudo[151749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojztkolyepkwezbscxfuldorkjbztmnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038401.1301894-2952-234659901358556/AnsiballZ_copy.py'
Jan 21 23:33:21 compute-1 dbus-broker-launch[770]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 21 23:33:21 compute-1 sudo[151749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:21 compute-1 python3.9[151751]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:21 compute-1 sudo[151749]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:22 compute-1 sudo[151901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhhcgqbfpuxcqzyyuyaoeubrbywnknah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038401.8948693-2952-275491839398359/AnsiballZ_copy.py'
Jan 21 23:33:22 compute-1 sudo[151901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:22 compute-1 python3.9[151903]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:22 compute-1 sudo[151901]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:22 compute-1 sudo[152053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqcykfnqhyteiwtsyhpbxfzrnkyeitsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038402.5850933-2952-31769806751923/AnsiballZ_copy.py'
Jan 21 23:33:22 compute-1 sudo[152053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:23 compute-1 python3.9[152055]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:23 compute-1 sudo[152053]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:23 compute-1 sudo[152205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjzowubfulzgupozfblujtgmgnuzqqjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038403.2637842-2952-254461862318671/AnsiballZ_copy.py'
Jan 21 23:33:23 compute-1 sudo[152205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:23 compute-1 python3.9[152207]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:23 compute-1 sudo[152205]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:24 compute-1 sudo[152357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tniqbnizgasqjggnjzoiaftfgfqopkzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038404.0768306-2952-218570529731791/AnsiballZ_copy.py'
Jan 21 23:33:24 compute-1 sudo[152357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:24 compute-1 python3.9[152359]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:24 compute-1 sudo[152357]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:25 compute-1 sudo[152509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvraczohrmyvdgejfpgjlmxoxokrutkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038404.8670056-3060-133404020099257/AnsiballZ_copy.py'
Jan 21 23:33:25 compute-1 sudo[152509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:25 compute-1 python3.9[152511]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:25 compute-1 sudo[152509]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:25 compute-1 sudo[152661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvjspqbqwqfsgtwhipzitbsdcyfbqzea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038405.5715475-3060-193954559962099/AnsiballZ_copy.py'
Jan 21 23:33:25 compute-1 sudo[152661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:26 compute-1 python3.9[152663]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:26 compute-1 sudo[152661]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:26 compute-1 sudo[152813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkuqsuzhixboufcnucgselhghqziyrhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038406.3050728-3060-181845630361706/AnsiballZ_copy.py'
Jan 21 23:33:26 compute-1 sudo[152813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:26 compute-1 python3.9[152815]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:26 compute-1 sudo[152813]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:27 compute-1 sudo[152981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lphcncyyvmefyodreimuzcayhlildbsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038407.0003533-3060-107717126862451/AnsiballZ_copy.py'
Jan 21 23:33:27 compute-1 sudo[152981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:27 compute-1 podman[152939]: 2026-01-21 23:33:27.404922263 +0000 UTC m=+0.112574510 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 23:33:27 compute-1 python3.9[152987]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:27 compute-1 sudo[152981]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:28 compute-1 sudo[153143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jouwgupgmyeunhqpkgquydwskmdbmlhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038407.739676-3060-245230574134200/AnsiballZ_copy.py'
Jan 21 23:33:28 compute-1 sudo[153143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:28 compute-1 python3.9[153145]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:28 compute-1 sudo[153143]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:28 compute-1 sudo[153295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfysjdhvdiijeazmlxfzvsmbnyxbcraj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038408.517968-3168-255749963887093/AnsiballZ_systemd.py'
Jan 21 23:33:28 compute-1 sudo[153295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:29 compute-1 python3.9[153297]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:33:29 compute-1 systemd[1]: Reloading.
Jan 21 23:33:29 compute-1 systemd-rc-local-generator[153327]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:33:29 compute-1 systemd-sysv-generator[153330]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:33:29 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Jan 21 23:33:29 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Jan 21 23:33:29 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 21 23:33:29 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 21 23:33:29 compute-1 systemd[1]: Starting libvirt logging daemon...
Jan 21 23:33:29 compute-1 systemd[1]: Started libvirt logging daemon.
Jan 21 23:33:29 compute-1 sudo[153295]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:30 compute-1 sudo[153489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrdxgmhkhyrnmfmocnknawlprbpqhigw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038409.8989697-3168-174223761048459/AnsiballZ_systemd.py'
Jan 21 23:33:30 compute-1 sudo[153489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:30 compute-1 python3.9[153491]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:33:30 compute-1 systemd[1]: Reloading.
Jan 21 23:33:30 compute-1 systemd-rc-local-generator[153515]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:33:30 compute-1 systemd-sysv-generator[153518]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:33:30 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 21 23:33:30 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 21 23:33:30 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 21 23:33:30 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 21 23:33:30 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 21 23:33:30 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 21 23:33:30 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 21 23:33:30 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 21 23:33:30 compute-1 sudo[153489]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:31 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 21 23:33:31 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 21 23:33:31 compute-1 sudo[153708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwvinnjweolzidyphsvpiwccvhlinadt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038411.0822027-3168-1758887939392/AnsiballZ_systemd.py'
Jan 21 23:33:31 compute-1 sudo[153708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:31 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 21 23:33:31 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 21 23:33:31 compute-1 python3.9[153711]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:33:31 compute-1 systemd[1]: Reloading.
Jan 21 23:33:31 compute-1 systemd-rc-local-generator[153746]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:33:31 compute-1 systemd-sysv-generator[153749]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:33:32 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 21 23:33:32 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 21 23:33:32 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 21 23:33:32 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 21 23:33:32 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 21 23:33:32 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 21 23:33:32 compute-1 sudo[153708]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:32 compute-1 setroubleshoot[153581]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 53b2a74a-25ce-415e-8ccf-6329f258e4a2
Jan 21 23:33:32 compute-1 setroubleshoot[153581]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 21 23:33:32 compute-1 sudo[153929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzfhdfvklossorfjogbfxuismthalztw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038412.3621767-3168-252487051564805/AnsiballZ_systemd.py'
Jan 21 23:33:32 compute-1 sudo[153929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:33 compute-1 python3.9[153931]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:33:33 compute-1 systemd[1]: Reloading.
Jan 21 23:33:33 compute-1 systemd-rc-local-generator[153958]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:33:33 compute-1 systemd-sysv-generator[153962]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:33:33 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Jan 21 23:33:33 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 21 23:33:33 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 21 23:33:33 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 21 23:33:33 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 21 23:33:33 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 21 23:33:33 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 21 23:33:33 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 21 23:33:33 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 21 23:33:33 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 21 23:33:33 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 21 23:33:33 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 21 23:33:33 compute-1 sudo[153929]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:34 compute-1 sudo[154143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrtaehjkgnzxvugedodyxeaaegmlmoaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038413.666704-3168-20980720836815/AnsiballZ_systemd.py'
Jan 21 23:33:34 compute-1 sudo[154143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:34 compute-1 python3.9[154145]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:33:34 compute-1 systemd[1]: Reloading.
Jan 21 23:33:34 compute-1 systemd-rc-local-generator[154170]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:33:34 compute-1 systemd-sysv-generator[154173]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:33:34 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Jan 21 23:33:34 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Jan 21 23:33:34 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 21 23:33:34 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 21 23:33:34 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 21 23:33:34 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 21 23:33:34 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 21 23:33:34 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 21 23:33:34 compute-1 sudo[154143]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:35 compute-1 sudo[154355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-largmcastmdanfwmhzvxcidyktdhqbgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038415.1765552-3280-14016660411920/AnsiballZ_file.py'
Jan 21 23:33:35 compute-1 sudo[154355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:35 compute-1 python3.9[154357]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:35 compute-1 sudo[154355]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:36 compute-1 sudo[154507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vujjmhjsyqctrjcwrnghytftbwwfaflv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038415.9960032-3303-152888539862028/AnsiballZ_find.py'
Jan 21 23:33:36 compute-1 sudo[154507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:36 compute-1 python3.9[154509]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:33:36 compute-1 sudo[154507]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:37 compute-1 podman[154609]: 2026-01-21 23:33:37.606181874 +0000 UTC m=+0.096817793 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:33:37 compute-1 sudo[154676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abekzrspsjepkzytmcwglwjkmkgktmem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038417.3224335-3345-237224574823277/AnsiballZ_stat.py'
Jan 21 23:33:37 compute-1 sudo[154676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:37 compute-1 python3.9[154678]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:37 compute-1 sudo[154676]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:38 compute-1 sudo[154800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaomcujvtsroviscghingpykduiemefl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038417.3224335-3345-237224574823277/AnsiballZ_copy.py'
Jan 21 23:33:38 compute-1 sudo[154800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:38 compute-1 python3.9[154802]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038417.3224335-3345-237224574823277/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:38 compute-1 sudo[154800]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:39 compute-1 sudo[154952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuvvroduywzpahlekhjowaepadajpwxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038419.0427787-3393-159334053194837/AnsiballZ_file.py'
Jan 21 23:33:39 compute-1 sudo[154952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:39 compute-1 python3.9[154954]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:39 compute-1 sudo[154952]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:40 compute-1 sudo[155104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esmjrmcgmjtqacbpfljqiejyhvtnzbjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038419.8272018-3417-121579177666907/AnsiballZ_stat.py'
Jan 21 23:33:40 compute-1 sudo[155104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:40 compute-1 python3.9[155106]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:40 compute-1 sudo[155104]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:40 compute-1 sudo[155182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icnjbjmrbjzrhkvdplsccivodaadabdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038419.8272018-3417-121579177666907/AnsiballZ_file.py'
Jan 21 23:33:40 compute-1 sudo[155182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:40 compute-1 python3.9[155184]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:40 compute-1 sudo[155182]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:41 compute-1 sudo[155334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxhjsskbfmhvfetdzdlolxuupegtzppj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038421.1129823-3453-11734583095002/AnsiballZ_stat.py'
Jan 21 23:33:41 compute-1 sudo[155334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:41 compute-1 python3.9[155336]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:41 compute-1 sudo[155334]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:42 compute-1 sudo[155412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atglvazyqzohhaxzqaajktwobtamskkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038421.1129823-3453-11734583095002/AnsiballZ_file.py'
Jan 21 23:33:42 compute-1 sudo[155412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:42 compute-1 python3.9[155414]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.gjg5jzn9 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:42 compute-1 sudo[155412]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:42 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 21 23:33:42 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.017s CPU time.
Jan 21 23:33:42 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 21 23:33:42 compute-1 sudo[155564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqcyevvaeodktcouaymareaxrrokuxqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038422.4073405-3489-228514661876118/AnsiballZ_stat.py'
Jan 21 23:33:42 compute-1 sudo[155564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:42 compute-1 python3.9[155566]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:42 compute-1 sudo[155564]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:43 compute-1 sudo[155642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmksyrgacpmtdkfyouvvqizxuvcvtypg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038422.4073405-3489-228514661876118/AnsiballZ_file.py'
Jan 21 23:33:43 compute-1 sudo[155642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:43 compute-1 python3.9[155644]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:43 compute-1 sudo[155642]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:44 compute-1 sudo[155794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxrkpilbydxxnkvsjnqrbvwvvspqpnxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038423.7353675-3529-265322834303912/AnsiballZ_command.py'
Jan 21 23:33:44 compute-1 sudo[155794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:44 compute-1 python3.9[155796]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:33:44 compute-1 sudo[155794]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:45 compute-1 sudo[155947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvftfozkhdsuugrzrmycjnixgxurwbzz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038424.542405-3552-229802376058695/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 23:33:45 compute-1 sudo[155947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:45 compute-1 python3[155949]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 23:33:45 compute-1 sudo[155947]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:45 compute-1 sudo[156099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnarofnpfkrdjsodlakojryshnxpzpgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038425.448806-3576-251325206236996/AnsiballZ_stat.py'
Jan 21 23:33:45 compute-1 sudo[156099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:46 compute-1 python3.9[156101]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:46 compute-1 sudo[156099]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:46 compute-1 sudo[156177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duefuxuatyzkuiwhhirqohewpbuaxqaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038425.448806-3576-251325206236996/AnsiballZ_file.py'
Jan 21 23:33:46 compute-1 sudo[156177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:46 compute-1 python3.9[156179]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:46 compute-1 sudo[156177]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:47 compute-1 sudo[156329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mocffsteizhobdrykdbdphxprgnytejw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038426.7734773-3612-139609042772484/AnsiballZ_stat.py'
Jan 21 23:33:47 compute-1 sudo[156329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:47 compute-1 python3.9[156331]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:47 compute-1 sudo[156329]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:47 compute-1 sudo[156454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwmkkgzociugbyixllpeysjhcuhzpidf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038426.7734773-3612-139609042772484/AnsiballZ_copy.py'
Jan 21 23:33:47 compute-1 sudo[156454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:47 compute-1 python3.9[156456]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038426.7734773-3612-139609042772484/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:48 compute-1 sudo[156454]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:48 compute-1 sudo[156606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbzejuvxvchawwchmwpoumlaifsdtzis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038428.2642796-3657-110962127658546/AnsiballZ_stat.py'
Jan 21 23:33:48 compute-1 sudo[156606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:48 compute-1 python3.9[156608]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:48 compute-1 sudo[156606]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:49 compute-1 sudo[156684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgqppubxnshpphetwkvfesywvfjfloeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038428.2642796-3657-110962127658546/AnsiballZ_file.py'
Jan 21 23:33:49 compute-1 sudo[156684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:49 compute-1 python3.9[156686]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:49 compute-1 sudo[156684]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:49 compute-1 sudo[156836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugpwzwnnxfvwdliqehcyrzbgdahmtonw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038429.6257064-3693-265205552560253/AnsiballZ_stat.py'
Jan 21 23:33:49 compute-1 sudo[156836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:50 compute-1 python3.9[156838]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:50 compute-1 sudo[156836]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:50 compute-1 sudo[156914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvafqjmxqmczkspmlgtwyaanvlovqqbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038429.6257064-3693-265205552560253/AnsiballZ_file.py'
Jan 21 23:33:50 compute-1 sudo[156914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:50 compute-1 python3.9[156916]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:50 compute-1 sudo[156914]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:51 compute-1 sudo[157066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tujendrflissclnrwnnpwtwpjlaklkfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038430.9578974-3729-158788495882015/AnsiballZ_stat.py'
Jan 21 23:33:51 compute-1 sudo[157066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:51 compute-1 python3.9[157068]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:51 compute-1 sudo[157066]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:52 compute-1 sudo[157191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnllucbryvynicjxvhwfkbqljouvxvve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038430.9578974-3729-158788495882015/AnsiballZ_copy.py'
Jan 21 23:33:52 compute-1 sudo[157191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:52 compute-1 python3.9[157193]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038430.9578974-3729-158788495882015/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:52 compute-1 sudo[157191]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:52 compute-1 sudo[157343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpkvlsmxmyarasgolketbekexsgekksw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038432.6266341-3774-134454905196637/AnsiballZ_file.py'
Jan 21 23:33:52 compute-1 sudo[157343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:53 compute-1 python3.9[157345]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:53 compute-1 sudo[157343]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:53 compute-1 sudo[157495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zekjycombxlvuklbgmpzrbvqvtfcbimt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038433.4311411-3798-228284814895100/AnsiballZ_command.py'
Jan 21 23:33:53 compute-1 sudo[157495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:53 compute-1 python3.9[157497]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:33:54 compute-1 sudo[157495]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:54 compute-1 sudo[157650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzmfmoypwgxhmaggpfgepdbychxnpgfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038434.2423964-3822-197954469022237/AnsiballZ_blockinfile.py'
Jan 21 23:33:54 compute-1 sudo[157650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:54 compute-1 python3.9[157652]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:54 compute-1 sudo[157650]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:55 compute-1 sudo[157802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmsijoqpbwoxqbujvgehpqsuroxayonu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038435.2875283-3849-95989937769590/AnsiballZ_command.py'
Jan 21 23:33:55 compute-1 sudo[157802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:55 compute-1 python3.9[157804]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:33:55 compute-1 sudo[157802]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:56 compute-1 sudo[157955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkefkkotlwamdphyyoolfdjhhcfekedh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038436.116376-3873-50641657615302/AnsiballZ_stat.py'
Jan 21 23:33:56 compute-1 sudo[157955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:56 compute-1 python3.9[157957]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:33:56 compute-1 sudo[157955]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:57 compute-1 sudo[158109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmrytynzfaekmauwybkxczmhremudfit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038436.9078844-3898-98061241187764/AnsiballZ_command.py'
Jan 21 23:33:57 compute-1 sudo[158109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:57 compute-1 python3.9[158111]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:33:57 compute-1 sudo[158109]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:57 compute-1 podman[158115]: 2026-01-21 23:33:57.658274847 +0000 UTC m=+0.138037989 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 21 23:33:58 compute-1 sudo[158291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axgcpcvypvpuybxycseqrrretjgotiyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038437.7299762-3921-40766215307841/AnsiballZ_file.py'
Jan 21 23:33:58 compute-1 sudo[158291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:58 compute-1 python3.9[158293]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:58 compute-1 sudo[158291]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:58 compute-1 sudo[158443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igflqicsovtlherilcxjsfunyjrpdmcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038438.6151392-3946-182146526742347/AnsiballZ_stat.py'
Jan 21 23:33:58 compute-1 sudo[158443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:59 compute-1 python3.9[158445]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:59 compute-1 sudo[158443]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:59 compute-1 sudo[158566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgsfisiwlmgejsspyoqxodkspeghdnuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038438.6151392-3946-182146526742347/AnsiballZ_copy.py'
Jan 21 23:33:59 compute-1 sudo[158566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:59 compute-1 python3.9[158568]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038438.6151392-3946-182146526742347/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:59 compute-1 sudo[158566]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:00 compute-1 sudo[158718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flwnatktzmhdrywzzoodpbsszelcxstn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038440.1188402-3990-240759456216866/AnsiballZ_stat.py'
Jan 21 23:34:00 compute-1 sudo[158718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:00 compute-1 python3.9[158720]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:34:00 compute-1 sudo[158718]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:01 compute-1 sudo[158841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soouhwouunjjopuyomzkwdfyovlnmeli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038440.1188402-3990-240759456216866/AnsiballZ_copy.py'
Jan 21 23:34:01 compute-1 sudo[158841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:01 compute-1 python3.9[158843]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038440.1188402-3990-240759456216866/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:01 compute-1 sudo[158841]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:01 compute-1 sudo[158993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijimkizcrtmudbrakovwyxffxzvuoebg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038441.5177858-4035-5668563313331/AnsiballZ_stat.py'
Jan 21 23:34:01 compute-1 sudo[158993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:02 compute-1 python3.9[158995]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:34:02 compute-1 sudo[158993]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:02 compute-1 sudo[159116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yntouxsgbynoudkiaviaihcrlpgkkwhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038441.5177858-4035-5668563313331/AnsiballZ_copy.py'
Jan 21 23:34:02 compute-1 sudo[159116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:02 compute-1 python3.9[159118]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038441.5177858-4035-5668563313331/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:02 compute-1 sudo[159116]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:34:02.979 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:34:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:34:02.980 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:34:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:34:02.980 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:34:03 compute-1 sudo[159268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uccmyoruqeciatfftpelqiudqsuvheer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038442.954566-4080-173634732360092/AnsiballZ_systemd.py'
Jan 21 23:34:03 compute-1 sudo[159268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:03 compute-1 python3.9[159270]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:34:03 compute-1 systemd[1]: Reloading.
Jan 21 23:34:03 compute-1 systemd-rc-local-generator[159292]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:34:03 compute-1 systemd-sysv-generator[159295]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:34:03 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Jan 21 23:34:03 compute-1 sudo[159268]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:04 compute-1 sudo[159460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmypgvjwoymozfmoswpxhzhowqsodsoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038444.3115578-4104-183431618927996/AnsiballZ_systemd.py'
Jan 21 23:34:04 compute-1 sudo[159460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:05 compute-1 python3.9[159462]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 21 23:34:05 compute-1 systemd[1]: Reloading.
Jan 21 23:34:05 compute-1 systemd-rc-local-generator[159489]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:34:05 compute-1 systemd-sysv-generator[159493]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:34:05 compute-1 systemd[1]: Reloading.
Jan 21 23:34:05 compute-1 systemd-rc-local-generator[159521]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:34:05 compute-1 systemd-sysv-generator[159525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:34:05 compute-1 sudo[159460]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:06 compute-1 sshd-session[104734]: Connection closed by 192.168.122.30 port 53150
Jan 21 23:34:06 compute-1 sshd-session[104731]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:34:06 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Jan 21 23:34:06 compute-1 systemd[1]: session-23.scope: Consumed 3min 46.094s CPU time.
Jan 21 23:34:06 compute-1 systemd-logind[796]: Session 23 logged out. Waiting for processes to exit.
Jan 21 23:34:06 compute-1 systemd-logind[796]: Removed session 23.
Jan 21 23:34:08 compute-1 podman[159559]: 2026-01-21 23:34:08.622717391 +0000 UTC m=+0.109637000 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 23:34:11 compute-1 sshd-session[159579]: Accepted publickey for zuul from 192.168.122.30 port 50614 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:34:11 compute-1 systemd-logind[796]: New session 24 of user zuul.
Jan 21 23:34:11 compute-1 systemd[1]: Started Session 24 of User zuul.
Jan 21 23:34:11 compute-1 sshd-session[159579]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:34:12 compute-1 python3.9[159732]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:34:14 compute-1 python3.9[159886]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:34:14 compute-1 network[159903]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:34:14 compute-1 network[159904]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:34:14 compute-1 network[159905]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:34:19 compute-1 sudo[160174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lllvyqpcekjbymflpdimghxpdepxrzrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038459.2409334-102-110299197896825/AnsiballZ_setup.py'
Jan 21 23:34:19 compute-1 sudo[160174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:19 compute-1 python3.9[160176]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:34:20 compute-1 sudo[160174]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:20 compute-1 sudo[160258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjqceeonmbbdlkjdstfucejhpgsgzxnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038459.2409334-102-110299197896825/AnsiballZ_dnf.py'
Jan 21 23:34:20 compute-1 sudo[160258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:20 compute-1 python3.9[160260]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:34:25 compute-1 sudo[160258]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:27 compute-1 sudo[160411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdinbhrmzjcenbjduwabuhfdzwkqrzto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038466.8206887-138-119421759640488/AnsiballZ_stat.py'
Jan 21 23:34:27 compute-1 sudo[160411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:27 compute-1 python3.9[160413]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:34:27 compute-1 sudo[160411]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:28 compute-1 sudo[160569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrfruhujenuawrkotusjkekoemffxyxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038467.8686407-168-230654753982910/AnsiballZ_command.py'
Jan 21 23:34:28 compute-1 sudo[160569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:28 compute-1 podman[160537]: 2026-01-21 23:34:28.660626518 +0000 UTC m=+0.134795038 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 23:34:28 compute-1 python3.9[160577]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:34:28 compute-1 sudo[160569]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:29 compute-1 sudo[160743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rssekwodrqlktaneboosmwlpqflplztu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038469.2098646-198-56570006379903/AnsiballZ_stat.py'
Jan 21 23:34:29 compute-1 sudo[160743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:29 compute-1 python3.9[160745]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:34:29 compute-1 sudo[160743]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:30 compute-1 sudo[160895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukzenapgokdnsprzehpfgnycgztpires ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038470.1234791-222-113951699379932/AnsiballZ_command.py'
Jan 21 23:34:30 compute-1 sudo[160895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:30 compute-1 python3.9[160897]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:34:30 compute-1 sudo[160895]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:31 compute-1 sudo[161048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxyvnwelnyderwbyzvcilhmxqhozkpqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038470.9316468-246-40745776810941/AnsiballZ_stat.py'
Jan 21 23:34:31 compute-1 sudo[161048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:31 compute-1 python3.9[161050]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:34:31 compute-1 sudo[161048]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:31 compute-1 sudo[161171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gavsfnfyusjetrroixzgokfhftudkqcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038470.9316468-246-40745776810941/AnsiballZ_copy.py'
Jan 21 23:34:31 compute-1 sudo[161171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:32 compute-1 python3.9[161173]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038470.9316468-246-40745776810941/.source.iscsi _original_basename=.hyj6ml0d follow=False checksum=811054b74dccc6739547e913756aae5865e34af4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:32 compute-1 sudo[161171]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:32 compute-1 sudo[161323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyomjgziwudsvgvcrwqczbvetqlyswpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038472.4614086-291-230137613791873/AnsiballZ_file.py'
Jan 21 23:34:32 compute-1 sudo[161323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:33 compute-1 python3.9[161325]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:33 compute-1 sudo[161323]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:33 compute-1 sudo[161475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhahubqvihzelogzmtwwbyojpgainaic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038473.4232802-315-113769799320061/AnsiballZ_lineinfile.py'
Jan 21 23:34:33 compute-1 sudo[161475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:34 compute-1 python3.9[161477]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:34 compute-1 sudo[161475]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:35 compute-1 sudo[161627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcdovamfgbrzjwclxhgzudjxetoxghbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038474.4502592-342-50273657516422/AnsiballZ_systemd_service.py'
Jan 21 23:34:35 compute-1 sudo[161627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:35 compute-1 python3.9[161629]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:34:35 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 21 23:34:35 compute-1 sudo[161627]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:36 compute-1 sudo[161783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlyqblqemfskwyvucbqsvnoajqawbtwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038475.8658917-366-166700079133428/AnsiballZ_systemd_service.py'
Jan 21 23:34:36 compute-1 sudo[161783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:36 compute-1 python3.9[161785]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:34:36 compute-1 systemd[1]: Reloading.
Jan 21 23:34:36 compute-1 systemd-rc-local-generator[161815]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:34:36 compute-1 systemd-sysv-generator[161819]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:34:36 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 21 23:34:36 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 21 23:34:36 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Jan 21 23:34:36 compute-1 systemd[1]: Started Open-iSCSI.
Jan 21 23:34:36 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 21 23:34:36 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 21 23:34:36 compute-1 sudo[161783]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:38 compute-1 python3.9[161985]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:34:38 compute-1 network[162002]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:34:38 compute-1 network[162003]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:34:38 compute-1 network[162004]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:34:39 compute-1 podman[162011]: 2026-01-21 23:34:39.332967733 +0000 UTC m=+0.087023822 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 23:34:44 compute-1 sudo[162291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plrqwallgadknpkwkthmnqjjyzwvibql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038483.7145836-435-58891362610086/AnsiballZ_dnf.py'
Jan 21 23:34:44 compute-1 sudo[162291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:44 compute-1 python3.9[162293]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:34:46 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:34:46 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:34:46 compute-1 systemd[1]: Reloading.
Jan 21 23:34:46 compute-1 systemd-rc-local-generator[162333]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:34:46 compute-1 systemd-sysv-generator[162340]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:34:47 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:34:47 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:34:47 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:34:47 compute-1 systemd[1]: run-r95e3a83b4e104fe3aea1e1465f5efac8.service: Deactivated successfully.
Jan 21 23:34:47 compute-1 sudo[162291]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:48 compute-1 sudo[162608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ticmozplavzgryutntzgstgyphtynrbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038488.3526568-462-174918983016157/AnsiballZ_file.py'
Jan 21 23:34:48 compute-1 sudo[162608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:48 compute-1 python3.9[162610]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 21 23:34:48 compute-1 sudo[162608]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:49 compute-1 sudo[162760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfmmqqgrdlljkmlbrjivdwquevsxuoml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038489.207686-486-115579685434820/AnsiballZ_modprobe.py'
Jan 21 23:34:49 compute-1 sudo[162760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:49 compute-1 python3.9[162762]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 21 23:34:49 compute-1 sudo[162760]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:50 compute-1 sudo[162916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fssmpestdbyedowoxxdtzjcjdjwflxgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038490.132087-510-32881111426603/AnsiballZ_stat.py'
Jan 21 23:34:50 compute-1 sudo[162916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:50 compute-1 python3.9[162918]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:34:50 compute-1 sudo[162916]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:51 compute-1 sudo[163039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmiinooechhrbiyecjqjarrnidehngtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038490.132087-510-32881111426603/AnsiballZ_copy.py'
Jan 21 23:34:51 compute-1 sudo[163039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:51 compute-1 python3.9[163041]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038490.132087-510-32881111426603/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:51 compute-1 sudo[163039]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:52 compute-1 sudo[163191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiclhecyydrppvsbybwfddhwdsmdppop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038491.7297873-558-240329383967987/AnsiballZ_lineinfile.py'
Jan 21 23:34:52 compute-1 sudo[163191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:52 compute-1 python3.9[163193]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:52 compute-1 sudo[163191]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:53 compute-1 sudo[163343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qibkumsxyprikqnltrbufgnutrvodiuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038492.5822875-582-58287881700556/AnsiballZ_systemd.py'
Jan 21 23:34:53 compute-1 sudo[163343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:53 compute-1 python3.9[163345]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:34:53 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 21 23:34:53 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 21 23:34:53 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 21 23:34:53 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 21 23:34:53 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 21 23:34:53 compute-1 sudo[163343]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:55 compute-1 sudo[163499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-malvqwxuxnthjzutlqlqgfecouwqoskf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038494.8341038-606-104878082083102/AnsiballZ_command.py'
Jan 21 23:34:55 compute-1 sudo[163499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:55 compute-1 python3.9[163501]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:34:55 compute-1 sudo[163499]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:56 compute-1 sudo[163652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukaisivpwqpcbqrdakenymdjyasfezcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038495.8329906-636-255934784406384/AnsiballZ_stat.py'
Jan 21 23:34:56 compute-1 sudo[163652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:56 compute-1 python3.9[163654]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:34:56 compute-1 sudo[163652]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:57 compute-1 sudo[163804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jccswclovwqgiklkjjukiiaiklxusmrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038496.717895-663-213206958504675/AnsiballZ_stat.py'
Jan 21 23:34:57 compute-1 sudo[163804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:57 compute-1 python3.9[163806]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:34:57 compute-1 sudo[163804]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:57 compute-1 sudo[163927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfcwnisijeufoamazzxobrzhtakiwmdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038496.717895-663-213206958504675/AnsiballZ_copy.py'
Jan 21 23:34:57 compute-1 sudo[163927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:57 compute-1 python3.9[163929]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038496.717895-663-213206958504675/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:57 compute-1 sudo[163927]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:58 compute-1 sudo[164079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhbadvntnrfymhrtcbxutvczvlakuedb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038498.3715088-708-2718505804068/AnsiballZ_command.py'
Jan 21 23:34:58 compute-1 sudo[164079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:58 compute-1 podman[164081]: 2026-01-21 23:34:58.919541012 +0000 UTC m=+0.131982930 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 21 23:34:59 compute-1 python3.9[164082]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:34:59 compute-1 sudo[164079]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:59 compute-1 sudo[164259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqpowxjrpoifoggnsyameocrsaqsmoqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038499.483382-732-144833364109714/AnsiballZ_lineinfile.py'
Jan 21 23:34:59 compute-1 sudo[164259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:59 compute-1 python3.9[164261]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:00 compute-1 sudo[164259]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:00 compute-1 sudo[164411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-schgggkrrrimldgjiknkvpbdlzgczujv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038500.3193212-756-215077174587900/AnsiballZ_replace.py'
Jan 21 23:35:00 compute-1 sudo[164411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:00 compute-1 python3.9[164413]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:00 compute-1 sudo[164411]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:01 compute-1 sudo[164563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqlnonigwkxgzctxrbngaxokwmdlmaqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038501.2117295-780-215127708003821/AnsiballZ_replace.py'
Jan 21 23:35:01 compute-1 sudo[164563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:01 compute-1 python3.9[164565]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:01 compute-1 sudo[164563]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:02 compute-1 sudo[164715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfcksbkfmthkqsmsbzbrmffitdkcuczf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038502.1708524-807-269063217267934/AnsiballZ_lineinfile.py'
Jan 21 23:35:02 compute-1 sudo[164715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:02 compute-1 python3.9[164717]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:02 compute-1 sudo[164715]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:35:02.980 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:35:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:35:02.981 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:35:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:35:02.982 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:35:03 compute-1 sudo[164867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfpbgghrmhgzomjeizkhuwcqtpftxbtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038502.9544349-807-268776639361612/AnsiballZ_lineinfile.py'
Jan 21 23:35:03 compute-1 sudo[164867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:03 compute-1 python3.9[164869]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:03 compute-1 sudo[164867]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:03 compute-1 sudo[165019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoynsawxpjsiddcjzyubkwarxkualbog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038503.6067164-807-96741066118963/AnsiballZ_lineinfile.py'
Jan 21 23:35:03 compute-1 sudo[165019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:04 compute-1 python3.9[165021]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:04 compute-1 sudo[165019]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:04 compute-1 sudo[165171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnolrdyoforkrclbqlgguvswdwgmxeqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038504.4422956-807-252590573451284/AnsiballZ_lineinfile.py'
Jan 21 23:35:04 compute-1 sudo[165171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:04 compute-1 python3.9[165173]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:04 compute-1 sudo[165171]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:05 compute-1 sudo[165323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agibpqbyozqgqvyjyjboncdrbovlpeof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038505.4707093-894-167042656227506/AnsiballZ_stat.py'
Jan 21 23:35:05 compute-1 sudo[165323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:06 compute-1 python3.9[165325]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:35:06 compute-1 sudo[165323]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:06 compute-1 sudo[165477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-admfbvnyweuvqajtetcqsawwxrayyhcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038506.3511195-918-15107479921483/AnsiballZ_command.py'
Jan 21 23:35:06 compute-1 sudo[165477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:06 compute-1 python3.9[165479]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:35:06 compute-1 sudo[165477]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:07 compute-1 sudo[165630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyjxecvesxagcunecmgvzynoylhrtbfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038507.3176422-945-60331211979951/AnsiballZ_systemd_service.py'
Jan 21 23:35:07 compute-1 sudo[165630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:08 compute-1 python3.9[165632]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:08 compute-1 systemd[1]: Listening on multipathd control socket.
Jan 21 23:35:08 compute-1 sudo[165630]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:09 compute-1 podman[165661]: 2026-01-21 23:35:09.60824355 +0000 UTC m=+0.092536555 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 21 23:35:10 compute-1 sudo[165805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fygqiuneukmnbdigrlkiqcnuxqncmhlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038509.7355032-969-121750901428299/AnsiballZ_systemd_service.py'
Jan 21 23:35:10 compute-1 sudo[165805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:10 compute-1 python3.9[165807]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:10 compute-1 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 21 23:35:10 compute-1 udevadm[165812]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 21 23:35:10 compute-1 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 21 23:35:10 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 21 23:35:10 compute-1 multipathd[165815]: --------start up--------
Jan 21 23:35:10 compute-1 multipathd[165815]: read /etc/multipath.conf
Jan 21 23:35:10 compute-1 multipathd[165815]: path checkers start up
Jan 21 23:35:10 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 21 23:35:10 compute-1 sudo[165805]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:11 compute-1 sudo[165972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuehyjxpgyhhlcqpmipgbchfvklfotrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038511.2962132-1005-184231577371108/AnsiballZ_file.py'
Jan 21 23:35:11 compute-1 sudo[165972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:11 compute-1 python3.9[165974]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 21 23:35:11 compute-1 sudo[165972]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:12 compute-1 sudo[166124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pprbdeyzcijrxldlgumnnjxegbykhynw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038512.280687-1029-278267418547819/AnsiballZ_modprobe.py'
Jan 21 23:35:12 compute-1 sudo[166124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:12 compute-1 python3.9[166126]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 21 23:35:12 compute-1 kernel: Key type psk registered
Jan 21 23:35:12 compute-1 sudo[166124]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:13 compute-1 sudo[166285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfmnohwatqeztjzqrsommxdannykajmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038513.2530146-1053-216454288703917/AnsiballZ_stat.py'
Jan 21 23:35:13 compute-1 sudo[166285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:13 compute-1 python3.9[166287]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:35:13 compute-1 sudo[166285]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:14 compute-1 sudo[166408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzdkizivuvkibgodcqfrhnhijgkknbpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038513.2530146-1053-216454288703917/AnsiballZ_copy.py'
Jan 21 23:35:14 compute-1 sudo[166408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:14 compute-1 python3.9[166410]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038513.2530146-1053-216454288703917/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:14 compute-1 sudo[166408]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:15 compute-1 sudo[166560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daqliruqwpfjwmwjilbdsjoavjdjpepu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038514.8973198-1101-135350211405757/AnsiballZ_lineinfile.py'
Jan 21 23:35:15 compute-1 sudo[166560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:15 compute-1 python3.9[166562]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:15 compute-1 sudo[166560]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:16 compute-1 sudo[166712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjikypugjziwuxexmjutpgjeaoprclfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038515.7797449-1125-85647015261573/AnsiballZ_systemd.py'
Jan 21 23:35:16 compute-1 sudo[166712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:16 compute-1 python3.9[166714]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:35:16 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 21 23:35:16 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 21 23:35:16 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 21 23:35:16 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 21 23:35:16 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 21 23:35:16 compute-1 sudo[166712]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:17 compute-1 sudo[166868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgkpgdqkzjwpoiiknbkhidbirhgnwfam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038517.2307696-1149-116401695758385/AnsiballZ_dnf.py'
Jan 21 23:35:17 compute-1 sudo[166868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:17 compute-1 python3.9[166870]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:35:20 compute-1 systemd[1]: Reloading.
Jan 21 23:35:20 compute-1 systemd-rc-local-generator[166903]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:35:20 compute-1 systemd-sysv-generator[166906]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:35:20 compute-1 systemd[1]: Reloading.
Jan 21 23:35:20 compute-1 systemd-rc-local-generator[166937]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:35:20 compute-1 systemd-sysv-generator[166941]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:35:21 compute-1 systemd-logind[796]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 21 23:35:21 compute-1 systemd-logind[796]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 21 23:35:21 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:35:21 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:35:21 compute-1 systemd[1]: Reloading.
Jan 21 23:35:21 compute-1 systemd-rc-local-generator[167032]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:35:21 compute-1 systemd-sysv-generator[167038]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:35:21 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:35:22 compute-1 sudo[166868]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:22 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:35:22 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:35:22 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.673s CPU time.
Jan 21 23:35:22 compute-1 systemd[1]: run-r9c1f0ede7dd14002b21486898940e6ac.service: Deactivated successfully.
Jan 21 23:35:22 compute-1 sudo[168334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrpvoxhvijclfqzrhcprcnzpglymyvko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038522.4975584-1173-253759960124209/AnsiballZ_systemd_service.py'
Jan 21 23:35:22 compute-1 sudo[168334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:23 compute-1 python3.9[168336]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:35:23 compute-1 systemd[1]: Stopping Open-iSCSI...
Jan 21 23:35:23 compute-1 iscsid[161826]: iscsid shutting down.
Jan 21 23:35:23 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Jan 21 23:35:23 compute-1 systemd[1]: Stopped Open-iSCSI.
Jan 21 23:35:23 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 21 23:35:23 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 21 23:35:23 compute-1 systemd[1]: Started Open-iSCSI.
Jan 21 23:35:23 compute-1 sudo[168334]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:23 compute-1 sudo[168491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mopaxospctvwmpozrjhnjrhcoybvdqgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038523.58376-1197-63565524712618/AnsiballZ_systemd_service.py'
Jan 21 23:35:23 compute-1 sudo[168491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:24 compute-1 python3.9[168493]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:35:24 compute-1 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 21 23:35:24 compute-1 multipathd[165815]: exit (signal)
Jan 21 23:35:24 compute-1 multipathd[165815]: --------shut down-------
Jan 21 23:35:24 compute-1 systemd[1]: multipathd.service: Deactivated successfully.
Jan 21 23:35:24 compute-1 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 21 23:35:24 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 21 23:35:24 compute-1 multipathd[168500]: --------start up--------
Jan 21 23:35:24 compute-1 multipathd[168500]: read /etc/multipath.conf
Jan 21 23:35:24 compute-1 multipathd[168500]: path checkers start up
Jan 21 23:35:24 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 21 23:35:24 compute-1 sudo[168491]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:25 compute-1 python3.9[168657]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:35:26 compute-1 sudo[168811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjzvfmhstymbjerjokwuoylfuxhqygvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038526.2165673-1249-220670755989990/AnsiballZ_file.py'
Jan 21 23:35:26 compute-1 sudo[168811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:26 compute-1 python3.9[168813]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:26 compute-1 sudo[168811]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:27 compute-1 sudo[168963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rarhcdynlkghpvxciojlmuvtejejblfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038527.5280726-1282-197280589317197/AnsiballZ_systemd_service.py'
Jan 21 23:35:27 compute-1 sudo[168963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:28 compute-1 python3.9[168965]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:35:28 compute-1 systemd[1]: Reloading.
Jan 21 23:35:28 compute-1 systemd-sysv-generator[168995]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:35:28 compute-1 systemd-rc-local-generator[168992]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:35:28 compute-1 sudo[168963]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:29 compute-1 podman[169123]: 2026-01-21 23:35:29.474050614 +0000 UTC m=+0.164066325 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 23:35:29 compute-1 python3.9[169163]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:35:29 compute-1 network[169193]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:35:29 compute-1 network[169194]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:35:29 compute-1 network[169195]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:35:30 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 21 23:35:32 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 21 23:35:33 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 21 23:35:34 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 21 23:35:38 compute-1 sudo[169470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nirpoohamshnyjnucxlouoeajlmcohpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038537.7095366-1339-162462013997780/AnsiballZ_systemd_service.py'
Jan 21 23:35:38 compute-1 sudo[169470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:38 compute-1 python3.9[169472]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:38 compute-1 sudo[169470]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:39 compute-1 sudo[169623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osppvubnztcqdjdgxtvpazwuczzlcgfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038538.6128662-1339-157482498745859/AnsiballZ_systemd_service.py'
Jan 21 23:35:39 compute-1 sudo[169623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:39 compute-1 python3.9[169625]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:39 compute-1 sudo[169623]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:39 compute-1 sudo[169787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiyrphlkiopxslfqqluqsrbcibbjqpjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038539.550278-1339-129206002601199/AnsiballZ_systemd_service.py'
Jan 21 23:35:39 compute-1 sudo[169787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:40 compute-1 podman[169750]: 2026-01-21 23:35:40.005023066 +0000 UTC m=+0.096538946 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 23:35:40 compute-1 python3.9[169797]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:40 compute-1 sudo[169787]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:40 compute-1 sudo[169949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyecqlptyzxovjomeqadqotvdlbrlkse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038540.518257-1339-116189058128803/AnsiballZ_systemd_service.py'
Jan 21 23:35:40 compute-1 sudo[169949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:41 compute-1 python3.9[169951]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:41 compute-1 sudo[169949]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:42 compute-1 sudo[170102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnmcihitjrxpcfdwjkheapfkpitibfbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038541.4203448-1339-27304249859217/AnsiballZ_systemd_service.py'
Jan 21 23:35:42 compute-1 sudo[170102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:42 compute-1 python3.9[170104]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:42 compute-1 sudo[170102]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:43 compute-1 sudo[170255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqhdxwwacfmcxwbxixjcrqxukiizccdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038542.7052684-1339-10014901398427/AnsiballZ_systemd_service.py'
Jan 21 23:35:43 compute-1 sudo[170255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:43 compute-1 python3.9[170257]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:43 compute-1 sudo[170255]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:44 compute-1 sudo[170408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehalhpkurhdgohcjgirfviuoueihjrrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038543.6331062-1339-133644119890599/AnsiballZ_systemd_service.py'
Jan 21 23:35:44 compute-1 sudo[170408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:44 compute-1 python3.9[170410]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:44 compute-1 sudo[170408]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:44 compute-1 sudo[170561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spkfagcahopcjhgnzbhhvwvrbpwbzifc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038544.5424953-1339-160242457616175/AnsiballZ_systemd_service.py'
Jan 21 23:35:44 compute-1 sudo[170561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:45 compute-1 python3.9[170563]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:45 compute-1 sudo[170561]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:46 compute-1 sudo[170714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqizzkkyazamrxivkzenoqnfkllnfiil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038546.018863-1516-78446584885677/AnsiballZ_file.py'
Jan 21 23:35:46 compute-1 sudo[170714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:46 compute-1 python3.9[170716]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:46 compute-1 sudo[170714]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:47 compute-1 sudo[170866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cviotbdxtgbvzppohqifkiwdatbrncam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038546.9857414-1516-9398078886489/AnsiballZ_file.py'
Jan 21 23:35:47 compute-1 sudo[170866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:47 compute-1 python3.9[170868]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:47 compute-1 sudo[170866]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:48 compute-1 sudo[171018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvmmcghkgbbvfilivapmuyxkkjclapgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038547.776295-1516-150292191458441/AnsiballZ_file.py'
Jan 21 23:35:48 compute-1 sudo[171018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:48 compute-1 python3.9[171020]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:48 compute-1 sudo[171018]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:49 compute-1 sudo[171170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmezynkvqwvlajiphkfsfybgukyzyecg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038548.6223664-1516-172287847520892/AnsiballZ_file.py'
Jan 21 23:35:49 compute-1 sudo[171170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:49 compute-1 python3.9[171172]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:49 compute-1 sudo[171170]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:49 compute-1 sudo[171322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzlirzpmdsltakjuaxdhgrouhfnzudbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038549.4374695-1516-103590252247142/AnsiballZ_file.py'
Jan 21 23:35:49 compute-1 sudo[171322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:50 compute-1 python3.9[171324]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:50 compute-1 sudo[171322]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:50 compute-1 sudo[171474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jypcuceyilmgvnriolgqsmwiujbquony ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038550.2152514-1516-55084669666945/AnsiballZ_file.py'
Jan 21 23:35:50 compute-1 sudo[171474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:50 compute-1 python3.9[171476]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:50 compute-1 sudo[171474]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:51 compute-1 sudo[171626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfmwfnzciycxvtdrfvjyhyvommkikwmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038550.9067357-1516-77643587743429/AnsiballZ_file.py'
Jan 21 23:35:51 compute-1 sudo[171626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:51 compute-1 python3.9[171628]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:51 compute-1 sudo[171626]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:52 compute-1 sudo[171778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aypsgdefbfbcacwhdqllarqsfdntveag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038551.9209485-1516-151551165916577/AnsiballZ_file.py'
Jan 21 23:35:52 compute-1 sudo[171778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:52 compute-1 python3.9[171780]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:52 compute-1 sudo[171778]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:54 compute-1 sudo[171930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlwsyvaeozbirzkktaothquocbriasgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038553.8731587-1687-24988016477212/AnsiballZ_file.py'
Jan 21 23:35:54 compute-1 sudo[171930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:54 compute-1 python3.9[171932]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:54 compute-1 sudo[171930]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:54 compute-1 sudo[172082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcfrezdmaqdcwdphvtjltddhybmzpkwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038554.5844204-1687-170074020132239/AnsiballZ_file.py'
Jan 21 23:35:54 compute-1 sudo[172082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:55 compute-1 python3.9[172084]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:55 compute-1 sudo[172082]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:55 compute-1 sudo[172234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxwnqkiiqyceokgqsgphsidawajmrweh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038555.3263228-1687-219066353660666/AnsiballZ_file.py'
Jan 21 23:35:55 compute-1 sudo[172234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:55 compute-1 python3.9[172236]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:55 compute-1 sudo[172234]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:56 compute-1 sudo[172386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndeflhhossvchffxmfxfrfysefishfad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038556.3102272-1687-100182681458116/AnsiballZ_file.py'
Jan 21 23:35:56 compute-1 sudo[172386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:56 compute-1 python3.9[172388]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:56 compute-1 sudo[172386]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:57 compute-1 sudo[172538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktfzzmrbrdiehnbfvgkedojdnjvucacv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038557.0910578-1687-29891612417687/AnsiballZ_file.py'
Jan 21 23:35:57 compute-1 sudo[172538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:57 compute-1 python3.9[172540]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:57 compute-1 sudo[172538]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:58 compute-1 sudo[172690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okqdcpcmsjkxefstyietqumhlbuaizev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038557.8204153-1687-233698641373562/AnsiballZ_file.py'
Jan 21 23:35:58 compute-1 sudo[172690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:58 compute-1 python3.9[172692]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:58 compute-1 sudo[172690]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:58 compute-1 sudo[172842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaxcqjfitrzfqpczgtiqtwvsljtqzkbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038558.6434314-1687-158164511186605/AnsiballZ_file.py'
Jan 21 23:35:58 compute-1 sudo[172842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:59 compute-1 python3.9[172844]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:59 compute-1 sudo[172842]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:59 compute-1 sudo[173007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gblfllpbvkziuqwxdzydwrvrtrqzwlkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038559.359342-1687-256525062834218/AnsiballZ_file.py'
Jan 21 23:35:59 compute-1 sudo[173007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:59 compute-1 podman[172968]: 2026-01-21 23:35:59.787153825 +0000 UTC m=+0.134648066 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:35:59 compute-1 python3.9[173015]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:59 compute-1 sudo[173007]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:01 compute-1 anacron[30859]: Job `cron.daily' started
Jan 21 23:36:01 compute-1 anacron[30859]: Job `cron.daily' terminated
Jan 21 23:36:01 compute-1 sudo[173174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlyrmwlrjkuwsdupzhhpdftkxbxcqjxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038560.764872-1861-34411426444969/AnsiballZ_command.py'
Jan 21 23:36:01 compute-1 sudo[173174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:01 compute-1 python3.9[173176]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:01 compute-1 sudo[173174]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:02 compute-1 python3.9[173328]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:36:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:36:02.981 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:36:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:36:02.983 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:36:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:36:02.983 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:36:03 compute-1 sudo[173478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixgnugczbqkahcabrlgrrqzksjlpcpet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038563.2055633-1915-237120789492149/AnsiballZ_systemd_service.py'
Jan 21 23:36:03 compute-1 sudo[173478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:04 compute-1 python3.9[173480]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:36:04 compute-1 systemd[1]: Reloading.
Jan 21 23:36:04 compute-1 systemd-sysv-generator[173513]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:36:04 compute-1 systemd-rc-local-generator[173509]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:36:04 compute-1 sudo[173478]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:05 compute-1 sudo[173666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydckgckveqtlhhkuntfaduunzcygdlpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038565.0995007-1939-256273822639398/AnsiballZ_command.py'
Jan 21 23:36:05 compute-1 sudo[173666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:05 compute-1 python3.9[173668]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:05 compute-1 sudo[173666]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:06 compute-1 sudo[173819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vymdiyopdlvcyrbwuhdplquorlrytztu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038566.0497375-1939-196435743183501/AnsiballZ_command.py'
Jan 21 23:36:06 compute-1 sudo[173819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:06 compute-1 python3.9[173821]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:06 compute-1 sudo[173819]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:07 compute-1 sudo[173972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efuayfegkxkxgfgquiweebojsougqryb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038566.8741672-1939-166639751301529/AnsiballZ_command.py'
Jan 21 23:36:07 compute-1 sudo[173972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:07 compute-1 python3.9[173974]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:07 compute-1 sudo[173972]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:08 compute-1 sudo[174125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emntmxrznnqfzzkwcxuovvedtoutxmug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038567.7072713-1939-94066946899577/AnsiballZ_command.py'
Jan 21 23:36:08 compute-1 sudo[174125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:08 compute-1 python3.9[174127]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:08 compute-1 sudo[174125]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:08 compute-1 sudo[174278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvvritipmksxbgybljiuqrncoknafkqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038568.4981775-1939-212153860011127/AnsiballZ_command.py'
Jan 21 23:36:08 compute-1 sudo[174278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:09 compute-1 python3.9[174280]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:09 compute-1 sudo[174278]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:09 compute-1 sudo[174431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arfbemafeksbotaszsmderalfnylemtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038569.3674793-1939-7775738325166/AnsiballZ_command.py'
Jan 21 23:36:09 compute-1 sudo[174431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:09 compute-1 python3.9[174433]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:09 compute-1 sudo[174431]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:10 compute-1 sudo[174598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gptsryturmdkficoxrotlddtlnivyrmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038570.1500814-1939-87327356587919/AnsiballZ_command.py'
Jan 21 23:36:10 compute-1 sudo[174598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:10 compute-1 podman[174558]: 2026-01-21 23:36:10.523458824 +0000 UTC m=+0.100422919 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 21 23:36:10 compute-1 python3.9[174604]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:10 compute-1 sudo[174598]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:11 compute-1 sudo[174755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyumdqysdqggvdlzkudhsxtagyedfubd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038570.8851402-1939-224393348831033/AnsiballZ_command.py'
Jan 21 23:36:11 compute-1 sudo[174755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:11 compute-1 python3.9[174757]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:11 compute-1 sudo[174755]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:13 compute-1 sudo[174908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmqvbhlqpbqowrqxrfxiiuvqvbypnazl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038572.9129283-2146-153219199430495/AnsiballZ_file.py'
Jan 21 23:36:13 compute-1 sudo[174908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:13 compute-1 python3.9[174910]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:13 compute-1 sudo[174908]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:13 compute-1 sudo[175060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atwxrpulytppeztacqmqosceezuekcgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038573.588202-2146-279880397151491/AnsiballZ_file.py'
Jan 21 23:36:13 compute-1 sudo[175060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:14 compute-1 python3.9[175062]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:14 compute-1 sudo[175060]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:14 compute-1 sudo[175212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usopivsvytakgbtvvmvbfahfyppeipbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038574.3413887-2146-22989757796430/AnsiballZ_file.py'
Jan 21 23:36:14 compute-1 sudo[175212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:14 compute-1 python3.9[175214]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:14 compute-1 sudo[175212]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:15 compute-1 sudo[175364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdkrqsnrixrwqgfpjlzmknkeruiszwdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038575.3994107-2212-214587493102777/AnsiballZ_file.py'
Jan 21 23:36:15 compute-1 sudo[175364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:15 compute-1 python3.9[175366]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:15 compute-1 sudo[175364]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:16 compute-1 sudo[175516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnesyhcfzqmmxqbmdegimmhmpewzbbhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038576.1672943-2212-128579003540187/AnsiballZ_file.py'
Jan 21 23:36:16 compute-1 sudo[175516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:16 compute-1 python3.9[175518]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:16 compute-1 sudo[175516]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:17 compute-1 sudo[175668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsdlhcsnorycipparmzbazmwqirtrgcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038576.878701-2212-248648727944946/AnsiballZ_file.py'
Jan 21 23:36:17 compute-1 sudo[175668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:17 compute-1 python3.9[175670]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:17 compute-1 sudo[175668]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:18 compute-1 sudo[175820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcltuksiqkaldpppmtzeaivryjaprfzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038577.6907868-2212-78156475326807/AnsiballZ_file.py'
Jan 21 23:36:18 compute-1 sudo[175820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:18 compute-1 python3.9[175822]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:18 compute-1 sudo[175820]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:18 compute-1 sudo[175973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsbpcuvathdzsxsxjbtguyggeeuyaxcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038578.4379535-2212-62595371253636/AnsiballZ_file.py'
Jan 21 23:36:18 compute-1 sudo[175973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:18 compute-1 python3.9[175975]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:19 compute-1 sudo[175973]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:19 compute-1 sudo[176126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pktjucrigowatmluednnsyyucruulmwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038579.1877313-2212-48774863345295/AnsiballZ_file.py'
Jan 21 23:36:19 compute-1 sudo[176126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:19 compute-1 python3.9[176128]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:19 compute-1 sudo[176126]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:20 compute-1 sudo[176278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgmvmszpjtbpgwpmhenhvcjafevmtxip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038579.979941-2212-238322086480148/AnsiballZ_file.py'
Jan 21 23:36:20 compute-1 sudo[176278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:20 compute-1 python3.9[176280]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:20 compute-1 sudo[176278]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:23 compute-1 sshd-session[175823]: Connection reset by 198.235.24.42 port 60658 [preauth]
Jan 21 23:36:26 compute-1 sudo[176430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shzleuzsdbeacimfndlrxvuxgmtdkzei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038585.435043-2517-207000319842282/AnsiballZ_getent.py'
Jan 21 23:36:26 compute-1 sudo[176430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:26 compute-1 python3.9[176432]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 21 23:36:26 compute-1 sudo[176430]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:27 compute-1 sudo[176583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqjbcvmqzmgmklssjrnggckxtfcbjpqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038586.6043081-2541-235397328976393/AnsiballZ_group.py'
Jan 21 23:36:27 compute-1 sudo[176583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:27 compute-1 python3.9[176585]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:36:27 compute-1 groupadd[176586]: group added to /etc/group: name=nova, GID=42436
Jan 21 23:36:27 compute-1 groupadd[176586]: group added to /etc/gshadow: name=nova
Jan 21 23:36:27 compute-1 groupadd[176586]: new group: name=nova, GID=42436
Jan 21 23:36:27 compute-1 sudo[176583]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:28 compute-1 sudo[176741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwlhdocjynvgrbibrxfncnyzojemlhef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038587.777145-2565-263279677987648/AnsiballZ_user.py'
Jan 21 23:36:28 compute-1 sudo[176741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:28 compute-1 python3.9[176743]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 23:36:28 compute-1 useradd[176745]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 21 23:36:28 compute-1 useradd[176745]: add 'nova' to group 'libvirt'
Jan 21 23:36:28 compute-1 useradd[176745]: add 'nova' to shadow group 'libvirt'
Jan 21 23:36:28 compute-1 sudo[176741]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:29 compute-1 sshd-session[176776]: Accepted publickey for zuul from 192.168.122.30 port 36650 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:36:29 compute-1 systemd-logind[796]: New session 25 of user zuul.
Jan 21 23:36:29 compute-1 systemd[1]: Started Session 25 of User zuul.
Jan 21 23:36:29 compute-1 sshd-session[176776]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:36:30 compute-1 podman[176778]: 2026-01-21 23:36:30.081986478 +0000 UTC m=+0.129495438 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 23:36:30 compute-1 sshd-session[176780]: Received disconnect from 192.168.122.30 port 36650:11: disconnected by user
Jan 21 23:36:30 compute-1 sshd-session[176780]: Disconnected from user zuul 192.168.122.30 port 36650
Jan 21 23:36:30 compute-1 sshd-session[176776]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:36:30 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Jan 21 23:36:30 compute-1 systemd-logind[796]: Session 25 logged out. Waiting for processes to exit.
Jan 21 23:36:30 compute-1 systemd-logind[796]: Removed session 25.
Jan 21 23:36:30 compute-1 python3.9[176956]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:31 compute-1 python3.9[177077]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038590.3314428-2640-112988360196035/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:32 compute-1 python3.9[177227]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:32 compute-1 python3.9[177303]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:33 compute-1 python3.9[177453]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:34 compute-1 python3.9[177574]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038592.8922977-2640-269101797802844/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:34 compute-1 python3.9[177724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:35 compute-1 python3.9[177845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038594.182072-2640-131856852249335/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:36 compute-1 python3.9[177995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:36 compute-1 python3.9[178116]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038595.5286582-2640-27783719098644/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:37 compute-1 python3.9[178266]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:37 compute-1 python3.9[178387]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038596.917257-2640-220105456888025/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:39 compute-1 sudo[178537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxspuhxpxecxvwmbidgwapcyungxrpqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038599.3871193-2889-9617714160401/AnsiballZ_file.py'
Jan 21 23:36:39 compute-1 sudo[178537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:39 compute-1 python3.9[178539]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:36:39 compute-1 sudo[178537]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:40 compute-1 sudo[178701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niuxagfjezusutkhepmqpqpkjhotruvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038600.3335335-2913-40659512230930/AnsiballZ_copy.py'
Jan 21 23:36:40 compute-1 sudo[178701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:40 compute-1 podman[178663]: 2026-01-21 23:36:40.731240343 +0000 UTC m=+0.063070962 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 23:36:40 compute-1 python3.9[178707]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:36:40 compute-1 sudo[178701]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:41 compute-1 sudo[178859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqiqzwguxpbufljtazjvxgwoqpqenbmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038601.1785698-2937-115707387382544/AnsiballZ_stat.py'
Jan 21 23:36:41 compute-1 sudo[178859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:41 compute-1 python3.9[178861]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:36:41 compute-1 sudo[178859]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:42 compute-1 sudo[179011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttgnsfghqfctprkskpxrheaykkjmputh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038602.109764-2962-19227287390218/AnsiballZ_stat.py'
Jan 21 23:36:42 compute-1 sudo[179011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:42 compute-1 python3.9[179013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:42 compute-1 sudo[179011]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:43 compute-1 sudo[179134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tboaqzuijxczktmbynadlucrbyidnuke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038602.109764-2962-19227287390218/AnsiballZ_copy.py'
Jan 21 23:36:43 compute-1 sudo[179134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:43 compute-1 python3.9[179136]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769038602.109764-2962-19227287390218/.source _original_basename=.vgq94t6n follow=False checksum=f928c25f1282d5da0d44318f0c4ecbcf6a8a3d23 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 21 23:36:43 compute-1 sudo[179134]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:44 compute-1 python3.9[179288]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:36:45 compute-1 python3.9[179440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:46 compute-1 python3.9[179561]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038604.898537-3039-46721805339491/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:46 compute-1 python3.9[179711]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:47 compute-1 python3.9[179832]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038606.3329222-3084-192406339573247/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:48 compute-1 sudo[179982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlczjaswjzgapjlcxyrwvxsoppxlldvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038607.9780188-3135-270906815718831/AnsiballZ_container_config_data.py'
Jan 21 23:36:48 compute-1 sudo[179982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:48 compute-1 python3.9[179984]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 21 23:36:48 compute-1 sudo[179982]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:49 compute-1 sudo[180134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvcooevgyyijeqdhviivqgsolhuzotbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038609.20892-3168-232527173073937/AnsiballZ_container_config_hash.py'
Jan 21 23:36:49 compute-1 sudo[180134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:49 compute-1 python3.9[180136]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:36:49 compute-1 sudo[180134]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:51 compute-1 sudo[180286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccghhbigsjqlkwzfzfghksplstnortnp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038610.4819748-3198-113975863930787/AnsiballZ_edpm_container_manage.py'
Jan 21 23:36:51 compute-1 sudo[180286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:51 compute-1 python3[180288]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:36:51 compute-1 podman[180323]: 2026-01-21 23:36:51.593186821 +0000 UTC m=+0.066748227 container create 18ba026f131b5cd8a9218bcabb3bd2b4a1a18f9ece3e9847168dc50aa2842ebb (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=nova_compute_init)
Jan 21 23:36:51 compute-1 podman[180323]: 2026-01-21 23:36:51.558781715 +0000 UTC m=+0.032343131 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 23:36:51 compute-1 python3[180288]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 21 23:36:51 compute-1 sudo[180286]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:52 compute-1 sudo[180511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbvjgtadukapphpvpqtngjykmgjieepz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038612.1131928-3222-220674653419813/AnsiballZ_stat.py'
Jan 21 23:36:52 compute-1 sudo[180511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:52 compute-1 python3.9[180513]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:36:52 compute-1 sudo[180511]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:53 compute-1 sudo[180665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feintxtyxymyhxbgxqtubhczsujaecus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038613.4262118-3258-182133215321209/AnsiballZ_container_config_data.py'
Jan 21 23:36:53 compute-1 sudo[180665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:53 compute-1 python3.9[180667]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 21 23:36:53 compute-1 sudo[180665]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:54 compute-1 sudo[180817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uugnzvuifuagtfunkmglfkqmwbqlwksm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038614.5611687-3291-136825572827317/AnsiballZ_container_config_hash.py'
Jan 21 23:36:54 compute-1 sudo[180817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:55 compute-1 python3.9[180819]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:36:55 compute-1 sudo[180817]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:56 compute-1 sudo[180969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soclntujjltrrfbbscwujrrvfepasheu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038615.7076516-3321-184338341006808/AnsiballZ_edpm_container_manage.py'
Jan 21 23:36:56 compute-1 sudo[180969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:56 compute-1 python3[180971]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:36:56 compute-1 podman[181006]: 2026-01-21 23:36:56.480807918 +0000 UTC m=+0.055712439 container create 56133df44cab3ff30ffc71c1039d65c8d91acff011fc376dcdcff1ab86939515 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=edpm, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:36:56 compute-1 podman[181006]: 2026-01-21 23:36:56.457082408 +0000 UTC m=+0.031986949 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 23:36:56 compute-1 python3[180971]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 21 23:36:56 compute-1 sudo[180969]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:57 compute-1 sudo[181194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdjmopgozkvockczhhasrgxualkprjqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038617.0488493-3345-53673899887799/AnsiballZ_stat.py'
Jan 21 23:36:57 compute-1 sudo[181194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:57 compute-1 python3.9[181196]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:36:57 compute-1 sudo[181194]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:58 compute-1 sudo[181348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rapwunwadfsurcxxyabwmatnmyqanrhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038618.096983-3372-225403033380763/AnsiballZ_file.py'
Jan 21 23:36:58 compute-1 sudo[181348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:58 compute-1 python3.9[181350]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:36:58 compute-1 sudo[181348]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:59 compute-1 sudo[181499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rltxwubpttgwjoqvzylljgtyyxymdgas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038618.746974-3372-46228512138702/AnsiballZ_copy.py'
Jan 21 23:36:59 compute-1 sudo[181499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:59 compute-1 python3.9[181501]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038618.746974-3372-46228512138702/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:36:59 compute-1 sudo[181499]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:59 compute-1 sudo[181575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uttkzhcnsfuwfkzetpcurvnwfdpyhvdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038618.746974-3372-46228512138702/AnsiballZ_systemd.py'
Jan 21 23:36:59 compute-1 sudo[181575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:00 compute-1 python3.9[181577]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:37:00 compute-1 systemd[1]: Reloading.
Jan 21 23:37:00 compute-1 systemd-rc-local-generator[181624]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:37:00 compute-1 systemd-sysv-generator[181627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:37:00 compute-1 podman[181579]: 2026-01-21 23:37:00.32095028 +0000 UTC m=+0.173832017 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:37:00 compute-1 sudo[181575]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:00 compute-1 sudo[181713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxbxzdgxrooqqhmajjhglmncvqlcgtws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038618.746974-3372-46228512138702/AnsiballZ_systemd.py'
Jan 21 23:37:00 compute-1 sudo[181713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:01 compute-1 python3.9[181715]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:37:01 compute-1 systemd[1]: Reloading.
Jan 21 23:37:01 compute-1 systemd-sysv-generator[181749]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:37:01 compute-1 systemd-rc-local-generator[181745]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:37:01 compute-1 systemd[1]: Starting nova_compute container...
Jan 21 23:37:01 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:37:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cbf94059a10e47153a96e5d5e2cc6bef0cf17dc738e59d7938dae576afd3d29/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cbf94059a10e47153a96e5d5e2cc6bef0cf17dc738e59d7938dae576afd3d29/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cbf94059a10e47153a96e5d5e2cc6bef0cf17dc738e59d7938dae576afd3d29/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cbf94059a10e47153a96e5d5e2cc6bef0cf17dc738e59d7938dae576afd3d29/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cbf94059a10e47153a96e5d5e2cc6bef0cf17dc738e59d7938dae576afd3d29/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:01 compute-1 podman[181755]: 2026-01-21 23:37:01.59422397 +0000 UTC m=+0.130797978 container init 56133df44cab3ff30ffc71c1039d65c8d91acff011fc376dcdcff1ab86939515 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 23:37:01 compute-1 podman[181755]: 2026-01-21 23:37:01.606168847 +0000 UTC m=+0.142742795 container start 56133df44cab3ff30ffc71c1039d65c8d91acff011fc376dcdcff1ab86939515 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:37:01 compute-1 podman[181755]: nova_compute
Jan 21 23:37:01 compute-1 nova_compute[181770]: + sudo -E kolla_set_configs
Jan 21 23:37:01 compute-1 systemd[1]: Started nova_compute container.
Jan 21 23:37:01 compute-1 sudo[181713]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Validating config file
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Copying service configuration files
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Deleting /etc/ceph
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Creating directory /etc/ceph
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /etc/ceph
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Writing out command to execute
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:01 compute-1 nova_compute[181770]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 23:37:01 compute-1 nova_compute[181770]: ++ cat /run_command
Jan 21 23:37:01 compute-1 nova_compute[181770]: + CMD=nova-compute
Jan 21 23:37:01 compute-1 nova_compute[181770]: + ARGS=
Jan 21 23:37:01 compute-1 nova_compute[181770]: + sudo kolla_copy_cacerts
Jan 21 23:37:01 compute-1 nova_compute[181770]: + [[ ! -n '' ]]
Jan 21 23:37:01 compute-1 nova_compute[181770]: + . kolla_extend_start
Jan 21 23:37:01 compute-1 nova_compute[181770]: Running command: 'nova-compute'
Jan 21 23:37:01 compute-1 nova_compute[181770]: + echo 'Running command: '\''nova-compute'\'''
Jan 21 23:37:01 compute-1 nova_compute[181770]: + umask 0022
Jan 21 23:37:01 compute-1 nova_compute[181770]: + exec nova-compute
Jan 21 23:37:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:37:02.982 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:37:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:37:02.983 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:37:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:37:02.983 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:37:03 compute-1 python3.9[181932]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:03 compute-1 nova_compute[181770]: 2026-01-21 23:37:03.824 181774 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:03 compute-1 nova_compute[181770]: 2026-01-21 23:37:03.824 181774 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:03 compute-1 nova_compute[181770]: 2026-01-21 23:37:03.825 181774 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:03 compute-1 nova_compute[181770]: 2026-01-21 23:37:03.825 181774 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 21 23:37:03 compute-1 nova_compute[181770]: 2026-01-21 23:37:03.973 181774 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:37:04 compute-1 nova_compute[181770]: 2026-01-21 23:37:04.004 181774 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:37:04 compute-1 nova_compute[181770]: 2026-01-21 23:37:04.004 181774 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 23:37:04 compute-1 python3.9[182086]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:04 compute-1 nova_compute[181770]: 2026-01-21 23:37:04.867 181774 INFO nova.virt.driver [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.015 181774 INFO nova.compute.provider_config [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.029 181774 DEBUG oslo_concurrency.lockutils [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.030 181774 DEBUG oslo_concurrency.lockutils [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.030 181774 DEBUG oslo_concurrency.lockutils [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.030 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.030 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.031 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.031 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.031 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.031 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.031 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.031 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.032 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.032 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.032 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.032 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.032 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.032 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.032 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.033 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.033 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.033 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.033 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.033 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.033 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.033 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.034 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.034 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.034 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.034 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.034 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.034 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.034 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.035 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.035 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.035 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.035 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.035 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.035 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.036 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.036 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.036 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.036 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.036 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.037 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.037 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.037 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.037 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.037 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.037 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.038 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.038 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.038 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.038 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.038 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.038 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.039 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.039 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.039 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.039 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.039 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.040 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.040 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.040 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.040 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.040 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.041 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.041 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.041 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.041 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.041 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.042 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.042 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.042 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.042 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.042 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.042 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.043 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.043 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.043 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.043 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.043 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.044 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.044 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.044 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.044 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.044 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.045 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.045 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.045 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.045 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.045 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.046 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.046 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.046 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.046 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.046 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.047 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.047 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.047 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.047 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.047 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.047 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.047 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.048 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.048 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.048 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.048 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.048 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.048 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.048 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.049 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.049 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.049 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.049 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.049 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.049 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.049 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.050 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.050 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.050 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.050 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.050 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.050 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.050 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.051 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.051 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.051 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.051 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.051 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.051 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.051 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.051 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.052 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.052 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.052 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.052 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.052 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.053 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.053 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.053 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.053 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.054 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.054 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.054 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.054 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.054 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.055 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.055 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.055 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.055 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.055 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.055 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.055 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.056 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.056 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.056 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.056 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.056 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.056 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.057 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.057 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.057 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.057 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.057 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.057 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.057 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.058 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.058 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.058 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.058 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.058 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.058 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.058 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.059 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.059 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.059 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.059 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.059 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.059 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.059 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.060 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.060 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.060 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.060 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.060 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.060 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.061 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.061 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.061 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.061 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.061 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.061 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.061 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.062 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.062 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.062 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.062 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.062 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.062 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.063 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.063 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.063 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.063 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.063 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.063 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.063 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.064 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.064 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.064 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.064 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.064 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.064 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.064 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.065 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.065 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.065 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.065 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.065 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.065 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.065 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.066 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.066 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.066 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.066 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.066 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.066 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.066 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.067 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.067 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.067 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.067 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.067 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.067 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.067 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.068 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.068 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.068 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.068 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.068 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.068 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.068 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.069 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.069 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.069 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.069 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.069 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.069 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.070 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.070 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.070 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.070 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.070 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.070 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.070 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.070 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.071 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.071 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.071 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.071 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.071 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.071 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.072 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.072 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.072 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.072 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.072 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.072 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.072 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.073 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.073 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.073 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.073 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.073 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.073 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.073 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.074 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.074 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.074 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.074 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.074 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.074 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.074 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.075 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.075 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.075 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.075 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.075 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.075 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.075 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.076 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.076 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.076 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.076 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.076 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.076 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.077 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.077 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.077 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.077 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.077 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.077 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.077 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.078 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.078 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.078 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.078 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.078 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.079 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.079 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.079 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.079 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.079 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.079 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.079 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.080 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.080 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.080 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.080 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.080 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.080 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.080 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.081 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.081 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.081 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.081 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.081 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.081 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.081 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.082 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.082 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.082 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.082 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.082 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.082 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.082 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.083 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.083 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.083 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.083 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.083 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.083 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.083 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.084 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.084 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.084 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.084 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.084 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.084 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.085 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.085 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.085 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.085 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.085 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.085 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.085 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.086 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.086 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.086 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.086 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.086 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.087 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.087 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.087 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.087 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.087 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.087 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.087 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.088 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.088 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.088 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.088 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.088 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.088 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.088 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.089 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.089 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.089 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.089 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.089 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.089 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.089 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.089 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.090 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.090 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.090 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.090 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.090 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.090 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.090 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.091 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.091 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.091 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.091 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.091 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.091 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.092 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.092 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.092 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.092 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.092 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.092 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.092 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.093 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.093 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.093 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.093 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.093 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.093 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.093 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.094 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.094 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.094 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.094 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.094 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.094 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.094 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.094 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.095 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.095 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.095 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.095 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.095 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.095 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.095 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.096 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.096 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.096 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.096 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.096 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.096 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.096 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.097 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.097 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.097 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.097 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.097 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.097 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.097 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.098 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.098 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.098 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.098 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.098 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.098 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.098 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.099 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.099 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.099 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.099 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.099 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.099 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.100 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.100 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.100 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.100 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.100 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.101 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.101 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.101 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.101 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.101 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.101 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.102 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.102 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.102 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.102 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.102 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.102 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.102 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.103 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.103 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.103 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.103 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.103 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.103 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.103 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.104 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.104 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.104 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.104 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.104 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.104 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.104 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.105 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.105 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.105 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.105 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.105 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.105 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.105 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.105 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.106 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.106 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.106 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.106 181774 WARNING oslo_config.cfg [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 21 23:37:05 compute-1 nova_compute[181770]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 21 23:37:05 compute-1 nova_compute[181770]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 21 23:37:05 compute-1 nova_compute[181770]: and ``live_migration_inbound_addr`` respectively.
Jan 21 23:37:05 compute-1 nova_compute[181770]: ).  Its value may be silently ignored in the future.
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.106 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.106 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.107 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.107 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.107 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.107 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.107 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.107 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.108 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.108 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.108 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.108 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.108 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.108 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.108 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.109 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.109 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.109 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.109 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.109 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.109 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.109 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.110 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.110 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.110 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.110 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.110 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.110 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.110 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.111 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.111 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.111 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.111 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.111 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.112 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.112 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.112 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.112 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.112 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.112 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.112 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.113 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.113 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.113 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.113 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.113 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.113 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.113 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.113 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.114 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.114 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.114 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.114 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.114 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.114 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.114 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.115 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.115 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.115 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.115 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.115 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.115 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.115 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.116 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.116 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.116 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.116 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.116 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.116 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.116 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.117 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.117 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.117 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.117 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.117 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.117 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.118 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.118 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.118 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.118 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.118 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.118 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.118 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.119 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.119 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.119 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.119 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.119 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.119 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.119 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.119 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.120 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.120 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.120 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.120 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.120 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.120 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.121 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.121 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.121 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.121 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.121 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.121 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.121 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.121 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.122 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.122 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.122 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.122 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.122 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.122 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.123 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.123 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.123 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.123 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.123 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.123 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.124 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.124 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.124 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.124 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.124 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.124 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.124 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.125 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.125 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.125 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.125 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.125 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.125 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.126 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.126 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.126 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.126 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.126 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.126 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.126 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.127 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.127 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.127 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.127 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.127 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.127 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.128 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.128 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.128 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.128 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.128 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.128 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.128 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.129 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.129 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.129 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.129 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.129 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.129 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.129 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.130 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.130 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.130 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.130 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.130 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.130 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.130 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.131 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.131 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.131 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.131 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.131 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.131 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.131 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.131 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.132 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.132 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.132 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.132 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.132 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.132 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.132 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.133 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.133 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.133 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.133 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.133 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.133 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.134 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.134 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.134 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.134 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.134 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.134 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.134 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.134 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.135 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.135 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.135 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.135 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.135 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.135 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.136 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.136 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.136 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.136 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.136 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.136 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.136 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.137 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.137 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.137 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.137 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.137 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.137 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.137 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.138 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.138 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.138 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.138 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.138 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.138 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.138 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.139 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.139 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.139 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.139 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.139 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.139 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.139 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.140 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.140 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.140 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.140 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.140 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.140 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.141 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.141 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.141 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.141 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.141 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.141 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.142 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.142 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.142 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.142 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.142 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.142 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.142 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.143 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.143 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.143 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.143 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.143 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.144 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.144 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.144 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.144 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.144 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.144 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.145 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.145 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.145 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.145 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.145 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.145 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.145 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.146 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.146 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.146 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.146 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.146 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.146 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.146 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.147 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.147 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.147 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.147 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.147 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.147 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.147 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.148 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.148 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.148 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.148 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.148 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.148 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.148 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.149 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.149 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.149 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.149 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.149 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.149 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.149 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.150 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.150 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.150 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.150 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.150 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.150 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.151 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.151 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.151 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.151 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.151 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.151 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.151 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.152 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.152 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.152 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.152 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.152 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.152 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.152 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.153 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.153 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.153 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.153 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.153 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.153 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.153 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.154 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.154 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.154 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.154 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.154 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.154 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.154 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.155 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.155 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.155 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.155 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.155 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.155 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.155 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.155 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.156 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.156 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.156 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.156 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.156 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.156 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.157 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.157 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.157 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.157 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.157 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.157 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.157 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.158 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.158 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.158 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.158 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.158 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.158 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.158 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.158 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.159 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.159 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.159 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.159 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.159 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.159 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.159 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.160 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.160 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.160 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.160 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.160 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.160 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.160 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.160 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.161 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.161 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.161 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.161 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.161 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.161 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.161 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.162 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.162 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.162 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.162 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.162 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.162 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.162 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.163 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.163 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.163 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.163 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.163 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.163 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.163 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.164 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.164 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.164 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.164 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.164 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.164 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.164 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.165 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.165 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.165 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.165 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.165 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.165 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.165 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.165 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.166 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.166 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.166 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.166 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.166 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.166 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.166 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.167 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.167 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.167 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.167 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.167 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.167 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.167 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.168 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.168 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.168 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.168 181774 DEBUG oslo_service.service [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.169 181774 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.230 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.230 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.231 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.231 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 21 23:37:05 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 21 23:37:05 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.296 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f2c03d60e20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.299 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f2c03d60e20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.300 181774 INFO nova.virt.libvirt.driver [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Connection event '1' reason 'None'
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.342 181774 WARNING nova.virt.libvirt.driver [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 21 23:37:05 compute-1 nova_compute[181770]: 2026-01-21 23:37:05.343 181774 DEBUG nova.virt.libvirt.volume.mount [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 21 23:37:05 compute-1 python3.9[182258]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.219 181774 INFO nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Libvirt host capabilities <capabilities>
Jan 21 23:37:06 compute-1 nova_compute[181770]: 
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <host>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <uuid>d7c2924b-8ca5-4f75-9376-1023950dbf90</uuid>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <cpu>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <arch>x86_64</arch>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model>EPYC-Rome-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <vendor>AMD</vendor>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <microcode version='16777317'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <signature family='23' model='49' stepping='0'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='x2apic'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='tsc-deadline'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='osxsave'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='hypervisor'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='tsc_adjust'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='spec-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='stibp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='arch-capabilities'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='cmp_legacy'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='topoext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='virt-ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='lbrv'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='tsc-scale'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='vmcb-clean'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='pause-filter'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='pfthreshold'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='svme-addr-chk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='rdctl-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='skip-l1dfl-vmentry'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='mds-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature name='pschange-mc-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <pages unit='KiB' size='4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <pages unit='KiB' size='2048'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <pages unit='KiB' size='1048576'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </cpu>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <power_management>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <suspend_mem/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <suspend_disk/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <suspend_hybrid/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </power_management>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <iommu support='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <migration_features>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <live/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <uri_transports>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <uri_transport>tcp</uri_transport>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <uri_transport>rdma</uri_transport>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </uri_transports>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </migration_features>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <topology>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <cells num='1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <cell id='0'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:           <memory unit='KiB'>7864316</memory>
Jan 21 23:37:06 compute-1 nova_compute[181770]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 21 23:37:06 compute-1 nova_compute[181770]:           <pages unit='KiB' size='2048'>0</pages>
Jan 21 23:37:06 compute-1 nova_compute[181770]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 21 23:37:06 compute-1 nova_compute[181770]:           <distances>
Jan 21 23:37:06 compute-1 nova_compute[181770]:             <sibling id='0' value='10'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:           </distances>
Jan 21 23:37:06 compute-1 nova_compute[181770]:           <cpus num='8'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:           </cpus>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         </cell>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </cells>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </topology>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <cache>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </cache>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <secmodel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model>selinux</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <doi>0</doi>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </secmodel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <secmodel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model>dac</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <doi>0</doi>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </secmodel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </host>
Jan 21 23:37:06 compute-1 nova_compute[181770]: 
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <guest>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <os_type>hvm</os_type>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <arch name='i686'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <wordsize>32</wordsize>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <domain type='qemu'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <domain type='kvm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </arch>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <features>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <pae/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <nonpae/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <acpi default='on' toggle='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <apic default='on' toggle='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <cpuselection/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <deviceboot/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <disksnapshot default='on' toggle='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <externalSnapshot/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </features>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </guest>
Jan 21 23:37:06 compute-1 nova_compute[181770]: 
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <guest>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <os_type>hvm</os_type>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <arch name='x86_64'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <wordsize>64</wordsize>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <domain type='qemu'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <domain type='kvm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </arch>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <features>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <acpi default='on' toggle='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <apic default='on' toggle='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <cpuselection/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <deviceboot/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <disksnapshot default='on' toggle='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <externalSnapshot/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </features>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </guest>
Jan 21 23:37:06 compute-1 nova_compute[181770]: 
Jan 21 23:37:06 compute-1 nova_compute[181770]: </capabilities>
Jan 21 23:37:06 compute-1 nova_compute[181770]: 
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.227 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.257 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 21 23:37:06 compute-1 nova_compute[181770]: <domainCapabilities>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <domain>kvm</domain>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <arch>i686</arch>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <vcpu max='240'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <iothreads supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <os supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <enum name='firmware'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <loader supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>rom</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pflash</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='readonly'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>yes</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>no</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='secure'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>no</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </loader>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </os>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <cpu>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>on</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>off</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='maximumMigratable'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>on</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>off</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <vendor>AMD</vendor>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='succor'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='custom' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='KnightsMill'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='athlon'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='athlon-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='core2duo'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='core2duo-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='coreduo'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='coreduo-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='n270'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='n270-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='phenom'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='phenom-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </cpu>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <memoryBacking supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <enum name='sourceType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>file</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>anonymous</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>memfd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </memoryBacking>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <devices>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <disk supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='diskDevice'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>disk</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>cdrom</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>floppy</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>lun</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='bus'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>ide</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>fdc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>scsi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>sata</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </disk>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <graphics supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vnc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>egl-headless</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dbus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </graphics>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <video supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='modelType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vga</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>cirrus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>none</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>bochs</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>ramfb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </video>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <hostdev supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='mode'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>subsystem</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='startupPolicy'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>default</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>mandatory</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>requisite</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>optional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='subsysType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pci</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>scsi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='capsType'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='pciBackend'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </hostdev>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <rng supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>random</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>egd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>builtin</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </rng>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <filesystem supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='driverType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>path</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>handle</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtiofs</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </filesystem>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <tpm supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tpm-tis</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tpm-crb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>emulator</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>external</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendVersion'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>2.0</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </tpm>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <redirdev supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='bus'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </redirdev>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <channel supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pty</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>unix</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </channel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <crypto supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>qemu</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>builtin</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </crypto>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <interface supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>default</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>passt</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </interface>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <panic supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>isa</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>hyperv</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </panic>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <console supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>null</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pty</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dev</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>file</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pipe</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>stdio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>udp</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tcp</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>unix</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>qemu-vdagent</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dbus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </console>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </devices>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <features>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <gic supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <genid supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <backup supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <async-teardown supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <s390-pv supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <ps2 supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <tdx supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <sev supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <sgx supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <hyperv supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='features'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>relaxed</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vapic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>spinlocks</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vpindex</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>runtime</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>synic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>stimer</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>reset</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vendor_id</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>frequencies</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>reenlightenment</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tlbflush</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>ipi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>avic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>emsr_bitmap</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>xmm_input</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <defaults>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </defaults>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </hyperv>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <launchSecurity supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </features>
Jan 21 23:37:06 compute-1 nova_compute[181770]: </domainCapabilities>
Jan 21 23:37:06 compute-1 nova_compute[181770]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.269 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 21 23:37:06 compute-1 nova_compute[181770]: <domainCapabilities>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <domain>kvm</domain>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <arch>i686</arch>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <vcpu max='4096'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <iothreads supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <os supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <enum name='firmware'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <loader supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>rom</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pflash</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='readonly'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>yes</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>no</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='secure'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>no</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </loader>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </os>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <cpu>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>on</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>off</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='maximumMigratable'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>on</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>off</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <vendor>AMD</vendor>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='succor'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='custom' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='KnightsMill'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='athlon'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='athlon-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='core2duo'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='core2duo-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='coreduo'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='coreduo-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='n270'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='n270-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='phenom'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='phenom-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </cpu>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <memoryBacking supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <enum name='sourceType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>file</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>anonymous</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>memfd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </memoryBacking>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <devices>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <disk supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='diskDevice'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>disk</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>cdrom</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>floppy</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>lun</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='bus'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>fdc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>scsi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>sata</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </disk>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <graphics supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vnc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>egl-headless</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dbus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </graphics>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <video supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='modelType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vga</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>cirrus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>none</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>bochs</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>ramfb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </video>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <hostdev supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='mode'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>subsystem</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='startupPolicy'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>default</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>mandatory</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>requisite</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>optional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='subsysType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pci</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>scsi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='capsType'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='pciBackend'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </hostdev>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <rng supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>random</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>egd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>builtin</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </rng>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <filesystem supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='driverType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>path</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>handle</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtiofs</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </filesystem>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <tpm supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tpm-tis</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tpm-crb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>emulator</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>external</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendVersion'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>2.0</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </tpm>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <redirdev supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='bus'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </redirdev>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <channel supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pty</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>unix</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </channel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <crypto supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>qemu</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>builtin</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </crypto>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <interface supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>default</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>passt</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </interface>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <panic supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>isa</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>hyperv</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </panic>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <console supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>null</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pty</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dev</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>file</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pipe</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>stdio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>udp</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tcp</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>unix</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>qemu-vdagent</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dbus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </console>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </devices>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <features>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <gic supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <genid supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <backup supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <async-teardown supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <s390-pv supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <ps2 supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <tdx supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <sev supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <sgx supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <hyperv supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='features'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>relaxed</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vapic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>spinlocks</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vpindex</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>runtime</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>synic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>stimer</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>reset</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vendor_id</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>frequencies</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>reenlightenment</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tlbflush</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>ipi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>avic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>emsr_bitmap</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>xmm_input</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <defaults>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </defaults>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </hyperv>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <launchSecurity supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </features>
Jan 21 23:37:06 compute-1 nova_compute[181770]: </domainCapabilities>
Jan 21 23:37:06 compute-1 nova_compute[181770]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.338 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.345 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 21 23:37:06 compute-1 nova_compute[181770]: <domainCapabilities>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <domain>kvm</domain>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <arch>x86_64</arch>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <vcpu max='240'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <iothreads supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <os supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <enum name='firmware'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <loader supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>rom</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pflash</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='readonly'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>yes</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>no</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='secure'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>no</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </loader>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </os>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <cpu>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>on</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>off</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='maximumMigratable'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>on</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>off</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <vendor>AMD</vendor>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='succor'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='custom' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='KnightsMill'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='athlon'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='athlon-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='core2duo'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='core2duo-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='coreduo'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='coreduo-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='n270'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='n270-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='phenom'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='phenom-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </cpu>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <memoryBacking supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <enum name='sourceType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>file</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>anonymous</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>memfd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </memoryBacking>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <devices>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <disk supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='diskDevice'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>disk</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>cdrom</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>floppy</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>lun</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='bus'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>ide</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>fdc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>scsi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>sata</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </disk>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <graphics supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vnc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>egl-headless</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dbus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </graphics>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <video supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='modelType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vga</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>cirrus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>none</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>bochs</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>ramfb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </video>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <hostdev supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='mode'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>subsystem</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='startupPolicy'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>default</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>mandatory</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>requisite</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>optional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='subsysType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pci</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>scsi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='capsType'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='pciBackend'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </hostdev>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <rng supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>random</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>egd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>builtin</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </rng>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <filesystem supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='driverType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>path</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>handle</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtiofs</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </filesystem>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <tpm supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tpm-tis</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tpm-crb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>emulator</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>external</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendVersion'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>2.0</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </tpm>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <redirdev supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='bus'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </redirdev>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <channel supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pty</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>unix</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </channel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <crypto supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>qemu</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>builtin</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </crypto>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <interface supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>default</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>passt</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </interface>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <panic supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>isa</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>hyperv</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </panic>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <console supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>null</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pty</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dev</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>file</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pipe</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>stdio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>udp</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tcp</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>unix</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>qemu-vdagent</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dbus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </console>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </devices>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <features>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <gic supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <genid supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <backup supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <async-teardown supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <s390-pv supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <ps2 supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <tdx supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <sev supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <sgx supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <hyperv supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='features'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>relaxed</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vapic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>spinlocks</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vpindex</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>runtime</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>synic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>stimer</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>reset</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vendor_id</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>frequencies</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>reenlightenment</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tlbflush</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>ipi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>avic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>emsr_bitmap</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>xmm_input</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <defaults>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </defaults>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </hyperv>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <launchSecurity supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </features>
Jan 21 23:37:06 compute-1 nova_compute[181770]: </domainCapabilities>
Jan 21 23:37:06 compute-1 nova_compute[181770]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.414 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 21 23:37:06 compute-1 nova_compute[181770]: <domainCapabilities>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <domain>kvm</domain>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <arch>x86_64</arch>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <vcpu max='4096'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <iothreads supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <os supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <enum name='firmware'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>efi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <loader supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>rom</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pflash</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='readonly'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>yes</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>no</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='secure'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>yes</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>no</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </loader>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </os>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <cpu>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>on</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>off</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='maximumMigratable'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>on</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>off</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <vendor>AMD</vendor>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='succor'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <mode name='custom' supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Denverton-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-1 sudo[182450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilnnhkewdyfagzueqjlpvirevipejuhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038625.9786427-3552-58727426648654/AnsiballZ_podman_container.py'
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='EPYC-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 sudo[182450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Haswell-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='KnightsMill'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xop'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='la57'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='lam'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='hle'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='pku'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='erms'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='athlon'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='athlon-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='core2duo'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='core2duo-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='coreduo'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='coreduo-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='n270'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='n270-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='ss'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='phenom'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <blockers model='phenom-v1'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </blockers>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </mode>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </cpu>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <memoryBacking supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <enum name='sourceType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>file</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>anonymous</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <value>memfd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </memoryBacking>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <devices>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <disk supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='diskDevice'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>disk</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>cdrom</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>floppy</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>lun</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='bus'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>fdc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>scsi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>sata</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </disk>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <graphics supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vnc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>egl-headless</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dbus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </graphics>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <video supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='modelType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vga</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>cirrus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>none</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>bochs</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>ramfb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </video>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <hostdev supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='mode'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>subsystem</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='startupPolicy'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>default</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>mandatory</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>requisite</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>optional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='subsysType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pci</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>scsi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='capsType'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='pciBackend'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </hostdev>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <rng supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>random</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>egd</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>builtin</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </rng>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <filesystem supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='driverType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>path</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>handle</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>virtiofs</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </filesystem>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <tpm supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tpm-tis</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tpm-crb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>emulator</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>external</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendVersion'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>2.0</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </tpm>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <redirdev supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='bus'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>usb</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </redirdev>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <channel supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pty</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>unix</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </channel>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <crypto supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>qemu</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>builtin</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </crypto>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <interface supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='backendType'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>default</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>passt</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </interface>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <panic supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='model'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>isa</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>hyperv</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </panic>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <console supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='type'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>null</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vc</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pty</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dev</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>file</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>pipe</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>stdio</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>udp</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tcp</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>unix</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>qemu-vdagent</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>dbus</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </console>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </devices>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <features>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <gic supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <genid supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <backup supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <async-teardown supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <s390-pv supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <ps2 supported='yes'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <tdx supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <sev supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <sgx supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <hyperv supported='yes'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <enum name='features'>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>relaxed</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vapic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>spinlocks</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vpindex</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>runtime</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>synic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>stimer</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>reset</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>vendor_id</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>frequencies</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>reenlightenment</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>tlbflush</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>ipi</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>avic</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>emsr_bitmap</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <value>xmm_input</value>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </enum>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       <defaults>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:06 compute-1 nova_compute[181770]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:06 compute-1 nova_compute[181770]:       </defaults>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     </hyperv>
Jan 21 23:37:06 compute-1 nova_compute[181770]:     <launchSecurity supported='no'/>
Jan 21 23:37:06 compute-1 nova_compute[181770]:   </features>
Jan 21 23:37:06 compute-1 nova_compute[181770]: </domainCapabilities>
Jan 21 23:37:06 compute-1 nova_compute[181770]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.476 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.476 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.476 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.481 181774 INFO nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Secure Boot support detected
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.485 181774 INFO nova.virt.libvirt.driver [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.485 181774 INFO nova.virt.libvirt.driver [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.496 181774 DEBUG nova.virt.libvirt.driver [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] cpu compare xml: <cpu match="exact">
Jan 21 23:37:06 compute-1 nova_compute[181770]:   <model>Nehalem</model>
Jan 21 23:37:06 compute-1 nova_compute[181770]: </cpu>
Jan 21 23:37:06 compute-1 nova_compute[181770]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.499 181774 DEBUG nova.virt.libvirt.driver [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.539 181774 INFO nova.virt.node [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Determined node identity 39680711-70c9-4df1-ae59-25e54fac688d from /var/lib/nova/compute_id
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.559 181774 WARNING nova.compute.manager [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Compute nodes ['39680711-70c9-4df1-ae59-25e54fac688d'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.600 181774 INFO nova.compute.manager [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.718 181774 WARNING nova.compute.manager [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.718 181774 DEBUG oslo_concurrency.lockutils [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.719 181774 DEBUG oslo_concurrency.lockutils [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.719 181774 DEBUG oslo_concurrency.lockutils [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:37:06 compute-1 nova_compute[181770]: 2026-01-21 23:37:06.719 181774 DEBUG nova.compute.resource_tracker [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:37:06 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 21 23:37:06 compute-1 python3.9[182452]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 21 23:37:06 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 21 23:37:06 compute-1 sudo[182450]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.109 181774 WARNING nova.virt.libvirt.driver [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.110 181774 DEBUG nova.compute.resource_tracker [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6174MB free_disk=73.5830078125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.110 181774 DEBUG oslo_concurrency.lockutils [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.110 181774 DEBUG oslo_concurrency.lockutils [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.131 181774 WARNING nova.compute.resource_tracker [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] No compute node record for compute-1.ctlplane.example.com:39680711-70c9-4df1-ae59-25e54fac688d: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 39680711-70c9-4df1-ae59-25e54fac688d could not be found.
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.157 181774 INFO nova.compute.resource_tracker [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 39680711-70c9-4df1-ae59-25e54fac688d
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.219 181774 DEBUG nova.compute.resource_tracker [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.220 181774 DEBUG nova.compute.resource_tracker [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:37:07 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:37:07 compute-1 sudo[182647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvfhkqpgvavurkudttggxgvbpysyjuqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038627.289695-3576-247219008526263/AnsiballZ_systemd.py'
Jan 21 23:37:07 compute-1 sudo[182647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.828 181774 INFO nova.scheduler.client.report [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] [req-ba594ae0-e2fc-4829-995a-04eab4d47ae2] Created resource provider record via placement API for resource provider with UUID 39680711-70c9-4df1-ae59-25e54fac688d and name compute-1.ctlplane.example.com.
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.865 181774 DEBUG nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 21 23:37:07 compute-1 nova_compute[181770]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.865 181774 INFO nova.virt.libvirt.host [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] kernel doesn't support AMD SEV
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.866 181774 DEBUG nova.compute.provider_tree [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.866 181774 DEBUG nova.virt.libvirt.driver [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.868 181774 DEBUG nova.virt.libvirt.driver [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Libvirt baseline CPU <cpu>
Jan 21 23:37:07 compute-1 nova_compute[181770]:   <arch>x86_64</arch>
Jan 21 23:37:07 compute-1 nova_compute[181770]:   <model>Nehalem</model>
Jan 21 23:37:07 compute-1 nova_compute[181770]:   <vendor>AMD</vendor>
Jan 21 23:37:07 compute-1 nova_compute[181770]:   <topology sockets="8" cores="1" threads="1"/>
Jan 21 23:37:07 compute-1 nova_compute[181770]: </cpu>
Jan 21 23:37:07 compute-1 nova_compute[181770]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.945 181774 DEBUG nova.scheduler.client.report [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Updated inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.945 181774 DEBUG nova.compute.provider_tree [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Updating resource provider 39680711-70c9-4df1-ae59-25e54fac688d generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 23:37:07 compute-1 nova_compute[181770]: 2026-01-21 23:37:07.945 181774 DEBUG nova.compute.provider_tree [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:37:07 compute-1 python3.9[182649]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:37:08 compute-1 systemd[1]: Stopping nova_compute container...
Jan 21 23:37:08 compute-1 nova_compute[181770]: 2026-01-21 23:37:08.091 181774 DEBUG oslo_concurrency.lockutils [None req-d237fa9d-112a-4ac9-ba3e-9871449b534b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.980s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:37:08 compute-1 nova_compute[181770]: 2026-01-21 23:37:08.091 181774 DEBUG oslo_concurrency.lockutils [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:37:08 compute-1 nova_compute[181770]: 2026-01-21 23:37:08.092 181774 DEBUG oslo_concurrency.lockutils [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:37:08 compute-1 nova_compute[181770]: 2026-01-21 23:37:08.092 181774 DEBUG oslo_concurrency.lockutils [None req-7d42997b-d27b-4625-b262-5033de600657 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:37:08 compute-1 virtqemud[182235]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 21 23:37:08 compute-1 virtqemud[182235]: hostname: compute-1
Jan 21 23:37:08 compute-1 virtqemud[182235]: End of file while reading data: Input/output error
Jan 21 23:37:08 compute-1 systemd[1]: libpod-56133df44cab3ff30ffc71c1039d65c8d91acff011fc376dcdcff1ab86939515.scope: Deactivated successfully.
Jan 21 23:37:08 compute-1 systemd[1]: libpod-56133df44cab3ff30ffc71c1039d65c8d91acff011fc376dcdcff1ab86939515.scope: Consumed 3.396s CPU time.
Jan 21 23:37:08 compute-1 podman[182653]: 2026-01-21 23:37:08.45858374 +0000 UTC m=+0.411560725 container died 56133df44cab3ff30ffc71c1039d65c8d91acff011fc376dcdcff1ab86939515 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 23:37:08 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-56133df44cab3ff30ffc71c1039d65c8d91acff011fc376dcdcff1ab86939515-userdata-shm.mount: Deactivated successfully.
Jan 21 23:37:08 compute-1 systemd[1]: var-lib-containers-storage-overlay-5cbf94059a10e47153a96e5d5e2cc6bef0cf17dc738e59d7938dae576afd3d29-merged.mount: Deactivated successfully.
Jan 21 23:37:08 compute-1 podman[182653]: 2026-01-21 23:37:08.524675764 +0000 UTC m=+0.477652739 container cleanup 56133df44cab3ff30ffc71c1039d65c8d91acff011fc376dcdcff1ab86939515 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:37:08 compute-1 podman[182653]: nova_compute
Jan 21 23:37:08 compute-1 podman[182683]: nova_compute
Jan 21 23:37:08 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 21 23:37:08 compute-1 systemd[1]: Stopped nova_compute container.
Jan 21 23:37:08 compute-1 systemd[1]: Starting nova_compute container...
Jan 21 23:37:08 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:37:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cbf94059a10e47153a96e5d5e2cc6bef0cf17dc738e59d7938dae576afd3d29/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cbf94059a10e47153a96e5d5e2cc6bef0cf17dc738e59d7938dae576afd3d29/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cbf94059a10e47153a96e5d5e2cc6bef0cf17dc738e59d7938dae576afd3d29/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cbf94059a10e47153a96e5d5e2cc6bef0cf17dc738e59d7938dae576afd3d29/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cbf94059a10e47153a96e5d5e2cc6bef0cf17dc738e59d7938dae576afd3d29/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:08 compute-1 podman[182696]: 2026-01-21 23:37:08.74573046 +0000 UTC m=+0.117068341 container init 56133df44cab3ff30ffc71c1039d65c8d91acff011fc376dcdcff1ab86939515 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:37:08 compute-1 podman[182696]: 2026-01-21 23:37:08.754375604 +0000 UTC m=+0.125713455 container start 56133df44cab3ff30ffc71c1039d65c8d91acff011fc376dcdcff1ab86939515 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Jan 21 23:37:08 compute-1 podman[182696]: nova_compute
Jan 21 23:37:08 compute-1 nova_compute[182713]: + sudo -E kolla_set_configs
Jan 21 23:37:08 compute-1 systemd[1]: Started nova_compute container.
Jan 21 23:37:08 compute-1 sudo[182647]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Validating config file
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Copying service configuration files
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Deleting /etc/ceph
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Creating directory /etc/ceph
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /etc/ceph
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Writing out command to execute
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:08 compute-1 nova_compute[182713]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 23:37:08 compute-1 nova_compute[182713]: ++ cat /run_command
Jan 21 23:37:08 compute-1 nova_compute[182713]: + CMD=nova-compute
Jan 21 23:37:08 compute-1 nova_compute[182713]: + ARGS=
Jan 21 23:37:08 compute-1 nova_compute[182713]: + sudo kolla_copy_cacerts
Jan 21 23:37:08 compute-1 nova_compute[182713]: + [[ ! -n '' ]]
Jan 21 23:37:08 compute-1 nova_compute[182713]: + . kolla_extend_start
Jan 21 23:37:08 compute-1 nova_compute[182713]: + echo 'Running command: '\''nova-compute'\'''
Jan 21 23:37:08 compute-1 nova_compute[182713]: Running command: 'nova-compute'
Jan 21 23:37:08 compute-1 nova_compute[182713]: + umask 0022
Jan 21 23:37:08 compute-1 nova_compute[182713]: + exec nova-compute
Jan 21 23:37:10 compute-1 sudo[182875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnxoeibdunyetuxuptsbopxpeoirurcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038629.8202465-3603-78822358146337/AnsiballZ_podman_container.py'
Jan 21 23:37:10 compute-1 sudo[182875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:10 compute-1 python3.9[182877]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 21 23:37:10 compute-1 systemd[1]: Started libpod-conmon-18ba026f131b5cd8a9218bcabb3bd2b4a1a18f9ece3e9847168dc50aa2842ebb.scope.
Jan 21 23:37:10 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:37:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/447bae6972fb3dbf84a0b2629d8598814e3f627765960c435266807feca214e2/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/447bae6972fb3dbf84a0b2629d8598814e3f627765960c435266807feca214e2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/447bae6972fb3dbf84a0b2629d8598814e3f627765960c435266807feca214e2/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:10 compute-1 podman[182899]: 2026-01-21 23:37:10.6304042 +0000 UTC m=+0.141131994 container init 18ba026f131b5cd8a9218bcabb3bd2b4a1a18f9ece3e9847168dc50aa2842ebb (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 21 23:37:10 compute-1 podman[182899]: 2026-01-21 23:37:10.641663557 +0000 UTC m=+0.152391311 container start 18ba026f131b5cd8a9218bcabb3bd2b4a1a18f9ece3e9847168dc50aa2842ebb (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:37:10 compute-1 python3.9[182877]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 21 23:37:10 compute-1 nova_compute[182713]: 2026-01-21 23:37:10.695 182717 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:10 compute-1 nova_compute[182713]: 2026-01-21 23:37:10.696 182717 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:10 compute-1 nova_compute[182713]: 2026-01-21 23:37:10.696 182717 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:10 compute-1 nova_compute[182713]: 2026-01-21 23:37:10.696 182717 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Applying nova statedir ownership
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 21 23:37:10 compute-1 nova_compute_init[182923]: INFO:nova_statedir:Nova statedir ownership complete
Jan 21 23:37:10 compute-1 systemd[1]: libpod-18ba026f131b5cd8a9218bcabb3bd2b4a1a18f9ece3e9847168dc50aa2842ebb.scope: Deactivated successfully.
Jan 21 23:37:10 compute-1 podman[182935]: 2026-01-21 23:37:10.763306421 +0000 UTC m=+0.032173950 container died 18ba026f131b5cd8a9218bcabb3bd2b4a1a18f9ece3e9847168dc50aa2842ebb (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 23:37:10 compute-1 podman[182935]: 2026-01-21 23:37:10.803294189 +0000 UTC m=+0.072161678 container cleanup 18ba026f131b5cd8a9218bcabb3bd2b4a1a18f9ece3e9847168dc50aa2842ebb (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 23:37:10 compute-1 systemd[1]: var-lib-containers-storage-overlay-447bae6972fb3dbf84a0b2629d8598814e3f627765960c435266807feca214e2-merged.mount: Deactivated successfully.
Jan 21 23:37:10 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18ba026f131b5cd8a9218bcabb3bd2b4a1a18f9ece3e9847168dc50aa2842ebb-userdata-shm.mount: Deactivated successfully.
Jan 21 23:37:10 compute-1 systemd[1]: libpod-conmon-18ba026f131b5cd8a9218bcabb3bd2b4a1a18f9ece3e9847168dc50aa2842ebb.scope: Deactivated successfully.
Jan 21 23:37:10 compute-1 nova_compute[182713]: 2026-01-21 23:37:10.833 182717 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:37:10 compute-1 nova_compute[182713]: 2026-01-21 23:37:10.845 182717 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:37:10 compute-1 nova_compute[182713]: 2026-01-21 23:37:10.845 182717 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 23:37:10 compute-1 sudo[182875]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:10 compute-1 podman[182958]: 2026-01-21 23:37:10.898150245 +0000 UTC m=+0.076184616 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.344 182717 INFO nova.virt.driver [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.475 182717 INFO nova.compute.provider_config [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.487 182717 DEBUG oslo_concurrency.lockutils [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.488 182717 DEBUG oslo_concurrency.lockutils [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.488 182717 DEBUG oslo_concurrency.lockutils [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.488 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.489 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.489 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.489 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.489 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.489 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.489 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.490 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.490 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.490 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.490 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.490 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.490 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.490 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.491 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.491 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.491 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.491 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.491 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.491 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.491 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.492 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.492 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.492 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.492 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.492 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.492 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.492 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.493 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.493 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.493 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.493 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.493 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.493 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.494 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.494 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.494 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.494 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.494 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.494 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.495 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.495 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.495 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.495 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.495 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.495 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.495 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.496 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.496 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.496 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.496 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.496 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.496 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.497 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.497 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.497 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.497 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.497 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.497 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.497 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.498 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.498 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.498 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.498 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.498 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.498 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.498 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.499 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.499 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.499 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.499 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.499 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.499 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.500 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.500 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.500 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.500 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.500 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.500 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.501 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.501 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.501 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.501 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.501 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.501 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.501 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.502 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.502 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.502 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.502 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.502 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.502 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.502 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.503 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.503 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.503 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.503 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.503 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.503 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.503 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.504 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.504 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.504 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.504 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.504 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.504 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.504 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.505 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.505 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.505 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.505 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.505 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.505 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.505 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.505 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.506 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.506 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.506 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.506 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.506 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.506 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.507 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.507 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.507 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.507 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.507 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.507 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.508 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.508 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.508 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.508 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.508 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.508 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.508 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.509 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.509 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.509 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.509 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.509 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.509 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.509 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.510 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.510 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.510 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.510 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.510 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.510 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.510 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.511 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.511 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.511 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.511 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.511 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.511 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.511 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.512 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.512 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.512 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.512 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.512 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.512 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.512 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.513 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.513 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.513 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.513 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.513 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.513 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.514 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.515 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.515 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.516 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.516 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.516 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.517 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.517 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.517 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.518 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.518 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.518 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.518 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.518 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.519 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.519 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.519 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.519 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.519 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.519 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.519 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.520 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.520 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.520 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.520 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.520 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.520 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.520 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.520 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.521 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.521 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.521 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.521 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.521 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.521 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.521 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.522 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.522 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.522 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.522 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.522 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.522 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.522 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.523 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.523 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.523 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.523 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.523 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.523 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.523 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.524 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.524 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.524 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.524 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.524 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.524 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.524 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.525 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.525 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.525 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.525 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.525 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.525 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.525 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.526 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.527 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.527 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.527 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.527 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.528 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.528 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.528 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.528 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.528 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.528 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.529 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.529 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.529 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.529 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.529 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.529 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.529 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.530 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.530 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.530 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.530 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.530 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.530 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.530 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.531 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.531 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.531 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.531 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.531 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.531 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.531 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.532 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.532 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.532 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.532 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.532 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.533 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.533 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.533 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.533 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.533 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.533 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.533 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.534 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.534 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.534 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.534 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.534 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.534 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.534 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.535 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.535 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.535 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.535 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.535 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.535 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.535 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.536 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.536 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.536 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.536 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.536 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.536 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.537 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.537 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.537 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.537 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.537 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.537 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.537 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.538 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.538 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.538 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.538 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.538 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.538 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.539 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.539 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.539 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.539 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.539 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.539 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.540 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.540 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.540 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.540 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.540 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.541 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.541 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.541 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.541 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.541 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.541 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.542 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.542 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.542 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.542 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.542 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.542 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.542 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.543 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.543 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.543 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.543 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.543 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.543 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.543 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.544 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.544 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.544 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.544 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.544 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.544 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.544 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.544 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.545 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.545 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.545 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.545 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.545 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.545 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.546 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.546 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.546 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.546 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.546 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.546 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.547 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.547 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.547 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.547 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.547 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.547 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.547 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.548 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.548 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.548 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.548 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.548 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.548 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.548 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.549 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.549 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.549 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.549 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.549 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.549 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.549 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.550 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.550 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.550 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.550 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.550 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.550 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.551 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.551 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.551 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.551 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.551 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.551 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.551 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.552 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.552 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.552 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.552 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.552 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.552 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.552 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.553 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.553 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.553 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.553 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.553 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.554 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.554 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.554 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.554 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.554 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.555 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.555 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.555 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.555 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.555 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.555 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.555 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.556 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.556 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.556 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.556 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.556 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.556 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.556 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.557 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.557 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.557 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.557 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.557 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.557 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.557 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.558 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.558 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.558 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.558 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.558 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.558 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.558 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.558 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.559 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.559 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.559 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.559 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.559 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.559 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.559 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.560 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.560 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.560 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.560 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.560 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.560 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.560 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.561 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.561 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.561 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.561 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.561 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.561 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.562 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.562 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.562 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.562 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.562 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.562 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.562 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.563 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.563 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.563 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.563 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.563 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.563 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.564 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.564 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.564 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.564 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.564 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.564 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.564 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.565 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.565 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.565 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.565 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.565 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.565 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.565 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.566 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.566 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.566 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.566 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.566 182717 WARNING oslo_config.cfg [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 21 23:37:11 compute-1 nova_compute[182713]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 21 23:37:11 compute-1 nova_compute[182713]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 21 23:37:11 compute-1 nova_compute[182713]: and ``live_migration_inbound_addr`` respectively.
Jan 21 23:37:11 compute-1 nova_compute[182713]: ).  Its value may be silently ignored in the future.
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.567 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.567 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.567 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.567 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.567 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.567 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.568 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.568 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.568 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.568 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.568 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.568 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.568 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.569 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.569 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.569 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.569 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.569 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.569 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.570 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.570 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.570 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.570 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.570 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.570 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.570 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.571 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.571 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.571 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.571 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.571 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.571 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.572 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.572 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.572 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.572 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.572 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.572 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.573 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.573 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.573 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.573 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.573 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.573 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.573 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.573 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.574 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.574 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.574 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.574 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.574 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.574 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.574 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.575 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.575 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.575 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.575 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.575 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.575 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.575 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.576 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.576 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.576 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.576 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.576 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.576 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.576 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.576 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.577 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.577 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.577 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.577 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.577 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.577 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.577 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.578 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.578 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.578 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.578 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.578 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.578 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.578 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.579 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.579 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.579 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.579 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.579 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.579 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.579 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.580 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.580 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.580 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.580 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.580 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.580 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.580 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.581 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.581 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.581 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.581 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.581 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.581 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.581 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.582 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.582 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.582 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.582 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.582 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.582 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.583 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.583 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.583 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.583 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.583 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.583 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.583 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.584 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.584 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.584 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.584 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.584 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.584 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.584 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.584 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.585 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.585 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.585 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.585 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.585 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.585 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.586 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.586 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.586 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.586 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.586 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.586 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.586 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.586 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.587 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.587 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.587 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.587 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.587 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.587 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.588 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.588 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.588 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.588 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.588 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.588 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.588 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.589 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.589 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.589 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.589 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.589 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.589 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.590 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.590 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.590 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.590 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.590 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.590 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.590 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.591 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.591 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.591 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.591 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.591 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.591 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.591 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.592 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.592 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.592 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.592 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.592 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.592 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.593 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.593 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.593 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.593 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.594 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.594 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.594 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.594 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.594 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.594 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.595 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.595 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.595 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.595 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.595 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.595 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.595 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.596 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.596 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.596 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.596 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.596 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.596 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.597 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.597 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.597 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.597 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.597 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.597 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.597 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.598 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.598 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.598 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.598 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.598 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.598 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.598 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.599 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.599 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.599 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.599 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.599 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.599 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.599 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.600 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.600 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.600 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.600 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.600 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.600 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.601 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.601 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.601 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.601 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.601 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.601 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.602 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.602 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.602 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.602 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.602 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.602 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.602 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.603 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.603 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.603 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.603 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.603 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.603 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.603 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.604 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.604 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.604 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.604 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.604 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.605 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.605 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.605 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.605 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.606 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.606 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.606 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.606 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.606 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.606 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.607 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.607 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.607 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.607 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.607 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.608 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.608 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.608 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.608 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.608 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.608 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.608 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.609 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.609 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.609 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.609 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.609 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.609 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.609 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.610 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.610 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.610 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.610 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.610 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.610 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.610 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.611 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.611 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.611 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.611 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.611 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.611 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.611 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.612 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.612 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.612 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.612 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.612 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.612 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.613 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.613 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.613 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.613 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.613 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.613 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.613 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.614 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.614 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.614 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.614 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.614 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.614 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.614 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.615 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.615 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.615 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.615 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.615 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.615 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.615 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.615 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.616 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.616 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.616 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.616 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.616 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.616 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.616 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.617 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.617 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.617 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.617 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.617 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.617 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.617 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.618 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.618 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.618 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.618 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.618 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.618 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.618 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.619 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.619 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.619 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.619 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.619 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.619 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.619 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.619 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.620 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.620 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.620 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.620 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.620 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.620 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.620 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.621 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.621 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.621 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.621 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.621 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.621 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.621 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.621 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.622 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.622 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.622 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.622 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.622 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.622 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.622 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.623 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.623 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.623 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.623 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.623 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.623 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.623 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.623 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.624 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.624 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.624 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.624 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.624 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.624 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.624 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.625 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.625 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.625 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.625 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.625 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.625 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.625 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.626 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.626 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.626 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.626 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.626 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.626 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.626 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.627 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.627 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.627 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.627 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.627 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.627 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.627 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.628 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.628 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.628 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.628 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.628 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.628 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.628 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.629 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.629 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.629 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.629 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.629 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.629 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.629 182717 DEBUG oslo_service.service [None req-a4d8c412-30a3-4509-a2e1-d1fbe9962dde - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.630 182717 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.654 182717 INFO nova.virt.node [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Determined node identity 39680711-70c9-4df1-ae59-25e54fac688d from /var/lib/nova/compute_id
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.655 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.656 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.656 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.656 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.670 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fa9578cca90> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.672 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fa9578cca90> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.673 182717 INFO nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Connection event '1' reason 'None'
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.679 182717 INFO nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Libvirt host capabilities <capabilities>
Jan 21 23:37:11 compute-1 nova_compute[182713]: 
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <host>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <uuid>d7c2924b-8ca5-4f75-9376-1023950dbf90</uuid>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <cpu>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <arch>x86_64</arch>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model>EPYC-Rome-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <vendor>AMD</vendor>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <microcode version='16777317'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <signature family='23' model='49' stepping='0'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='x2apic'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='tsc-deadline'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='osxsave'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='hypervisor'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='tsc_adjust'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='spec-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='stibp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='arch-capabilities'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='cmp_legacy'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='topoext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='virt-ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='lbrv'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='tsc-scale'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='vmcb-clean'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='pause-filter'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='pfthreshold'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='svme-addr-chk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='rdctl-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='skip-l1dfl-vmentry'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='mds-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature name='pschange-mc-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <pages unit='KiB' size='4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <pages unit='KiB' size='2048'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <pages unit='KiB' size='1048576'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </cpu>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <power_management>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <suspend_mem/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <suspend_disk/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <suspend_hybrid/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </power_management>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <iommu support='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <migration_features>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <live/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <uri_transports>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <uri_transport>tcp</uri_transport>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <uri_transport>rdma</uri_transport>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </uri_transports>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </migration_features>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <topology>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <cells num='1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <cell id='0'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:           <memory unit='KiB'>7864316</memory>
Jan 21 23:37:11 compute-1 nova_compute[182713]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 21 23:37:11 compute-1 nova_compute[182713]:           <pages unit='KiB' size='2048'>0</pages>
Jan 21 23:37:11 compute-1 nova_compute[182713]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 21 23:37:11 compute-1 nova_compute[182713]:           <distances>
Jan 21 23:37:11 compute-1 nova_compute[182713]:             <sibling id='0' value='10'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:           </distances>
Jan 21 23:37:11 compute-1 nova_compute[182713]:           <cpus num='8'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:           </cpus>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         </cell>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </cells>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </topology>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <cache>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </cache>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <secmodel>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model>selinux</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <doi>0</doi>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </secmodel>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <secmodel>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model>dac</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <doi>0</doi>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </secmodel>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </host>
Jan 21 23:37:11 compute-1 nova_compute[182713]: 
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <guest>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <os_type>hvm</os_type>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <arch name='i686'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <wordsize>32</wordsize>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <domain type='qemu'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <domain type='kvm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </arch>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <features>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <pae/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <nonpae/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <acpi default='on' toggle='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <apic default='on' toggle='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <cpuselection/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <deviceboot/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <disksnapshot default='on' toggle='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <externalSnapshot/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </features>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </guest>
Jan 21 23:37:11 compute-1 nova_compute[182713]: 
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <guest>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <os_type>hvm</os_type>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <arch name='x86_64'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <wordsize>64</wordsize>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <domain type='qemu'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <domain type='kvm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </arch>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <features>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <acpi default='on' toggle='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <apic default='on' toggle='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <cpuselection/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <deviceboot/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <disksnapshot default='on' toggle='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <externalSnapshot/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </features>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </guest>
Jan 21 23:37:11 compute-1 nova_compute[182713]: 
Jan 21 23:37:11 compute-1 nova_compute[182713]: </capabilities>
Jan 21 23:37:11 compute-1 nova_compute[182713]: 
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.686 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.692 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 21 23:37:11 compute-1 nova_compute[182713]: <domainCapabilities>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <domain>kvm</domain>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <arch>i686</arch>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <vcpu max='4096'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <iothreads supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <os supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <enum name='firmware'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <loader supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>rom</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pflash</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='readonly'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>yes</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>no</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='secure'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>no</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </loader>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </os>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <cpu>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>on</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>off</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='maximumMigratable'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>on</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>off</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <vendor>AMD</vendor>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='succor'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='custom' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ddpd-u'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sha512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm3'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ddpd-u'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sha512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm3'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbpb'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbpb'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-128'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-256'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-128'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-256'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='KnightsMill'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512er'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512pf'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512er'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512pf'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tbm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tbm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 sshd-session[159582]: Connection closed by 192.168.122.30 port 50614
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 sshd-session[159579]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='athlon'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='athlon-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='core2duo'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='core2duo-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='coreduo'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='coreduo-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='n270'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='n270-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='phenom'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='phenom-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <memoryBacking supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <enum name='sourceType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>file</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>anonymous</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>memfd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </memoryBacking>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <disk supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='diskDevice'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>disk</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>cdrom</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>floppy</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>lun</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='bus'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>fdc</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>scsi</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>sata</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-non-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <graphics supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vnc</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>egl-headless</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>dbus</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </graphics>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <video supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='modelType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vga</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>cirrus</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>none</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>bochs</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>ramfb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </video>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <hostdev supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='mode'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>subsystem</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='startupPolicy'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>default</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>mandatory</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>requisite</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>optional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='subsysType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pci</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>scsi</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='capsType'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='pciBackend'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </hostdev>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <rng supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-non-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>random</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>egd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>builtin</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <filesystem supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='driverType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>path</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>handle</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtiofs</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </filesystem>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <tpm supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tpm-tis</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tpm-crb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>emulator</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>external</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendVersion'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>2.0</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </tpm>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <redirdev supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='bus'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </redirdev>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <channel supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pty</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>unix</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </channel>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <crypto supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>qemu</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>builtin</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </crypto>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <interface supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>default</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>passt</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <panic supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>isa</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>hyperv</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </panic>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <console supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>null</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vc</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pty</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>dev</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>file</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pipe</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>stdio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>udp</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tcp</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>unix</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>qemu-vdagent</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>dbus</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </console>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <features>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <gic supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <genid supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <backup supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <async-teardown supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <s390-pv supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <ps2 supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <tdx supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <sev supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <sgx supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <hyperv supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='features'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>relaxed</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vapic</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>spinlocks</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vpindex</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>runtime</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>synic</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>stimer</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>reset</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vendor_id</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>frequencies</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>reenlightenment</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tlbflush</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>ipi</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>avic</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>emsr_bitmap</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>xmm_input</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <defaults>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </defaults>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </hyperv>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <launchSecurity supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </features>
Jan 21 23:37:11 compute-1 nova_compute[182713]: </domainCapabilities>
Jan 21 23:37:11 compute-1 nova_compute[182713]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.697 182717 DEBUG nova.virt.libvirt.volume.mount [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.699 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 21 23:37:11 compute-1 nova_compute[182713]: <domainCapabilities>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <domain>kvm</domain>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <arch>i686</arch>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <vcpu max='240'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <iothreads supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <os supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <enum name='firmware'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <loader supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>rom</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pflash</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='readonly'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>yes</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>no</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='secure'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>no</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </loader>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </os>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <cpu>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>on</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>off</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='maximumMigratable'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>on</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>off</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <vendor>AMD</vendor>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='succor'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='custom' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ddpd-u'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sha512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm3'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 systemd[1]: session-24.scope: Consumed 1min 48.936s CPU time.
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ddpd-u'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sha512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm3'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 systemd-logind[796]: Session 24 logged out. Waiting for processes to exit.
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 systemd-logind[796]: Removed session 24.
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbpb'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbpb'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-128'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-256'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-128'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-256'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='KnightsMill'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512er'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512pf'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512er'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512pf'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tbm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tbm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='athlon'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='athlon-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='core2duo'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='core2duo-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='coreduo'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='coreduo-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='n270'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='n270-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='phenom'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='phenom-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <memoryBacking supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <enum name='sourceType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>file</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>anonymous</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>memfd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </memoryBacking>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <disk supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='diskDevice'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>disk</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>cdrom</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>floppy</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>lun</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='bus'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>ide</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>fdc</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>scsi</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>sata</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-non-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <graphics supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vnc</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>egl-headless</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>dbus</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </graphics>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <video supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='modelType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vga</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>cirrus</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>none</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>bochs</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>ramfb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </video>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <hostdev supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='mode'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>subsystem</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='startupPolicy'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>default</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>mandatory</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>requisite</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>optional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='subsysType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pci</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>scsi</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='capsType'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='pciBackend'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </hostdev>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <rng supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-non-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>random</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>egd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>builtin</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <filesystem supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='driverType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>path</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>handle</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtiofs</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </filesystem>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <tpm supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tpm-tis</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tpm-crb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>emulator</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>external</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendVersion'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>2.0</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </tpm>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <redirdev supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='bus'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </redirdev>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <channel supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pty</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>unix</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </channel>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <crypto supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>qemu</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>builtin</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </crypto>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <interface supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>default</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>passt</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <panic supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>isa</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>hyperv</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </panic>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <console supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>null</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vc</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pty</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>dev</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>file</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pipe</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>stdio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>udp</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tcp</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>unix</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>qemu-vdagent</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>dbus</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </console>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <features>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <gic supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <genid supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <backup supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <async-teardown supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <s390-pv supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <ps2 supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <tdx supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <sev supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <sgx supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <hyperv supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='features'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>relaxed</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vapic</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>spinlocks</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vpindex</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>runtime</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>synic</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>stimer</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>reset</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vendor_id</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>frequencies</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>reenlightenment</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tlbflush</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>ipi</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>avic</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>emsr_bitmap</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>xmm_input</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <defaults>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </defaults>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </hyperv>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <launchSecurity supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </features>
Jan 21 23:37:11 compute-1 nova_compute[182713]: </domainCapabilities>
Jan 21 23:37:11 compute-1 nova_compute[182713]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.748 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.784 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 21 23:37:11 compute-1 nova_compute[182713]: <domainCapabilities>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <domain>kvm</domain>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <arch>x86_64</arch>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <vcpu max='4096'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <iothreads supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <os supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <enum name='firmware'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>efi</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <loader supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>rom</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pflash</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='readonly'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>yes</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>no</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='secure'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>yes</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>no</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </loader>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </os>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <cpu>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>on</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>off</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='maximumMigratable'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>on</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>off</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <vendor>AMD</vendor>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='succor'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='custom' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ddpd-u'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sha512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm3'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ddpd-u'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sha512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm3'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbpb'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbpb'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-128'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-256'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-128'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-256'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='KnightsMill'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512er'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512pf'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512er'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512pf'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tbm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tbm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='athlon'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='athlon-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='core2duo'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='core2duo-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='coreduo'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='coreduo-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='n270'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='n270-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='phenom'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='phenom-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <memoryBacking supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <enum name='sourceType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>file</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>anonymous</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>memfd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </memoryBacking>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <disk supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='diskDevice'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>disk</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>cdrom</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>floppy</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>lun</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='bus'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>fdc</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>scsi</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>sata</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-non-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <graphics supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vnc</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>egl-headless</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>dbus</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </graphics>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <video supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='modelType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vga</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>cirrus</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>none</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>bochs</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>ramfb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </video>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <hostdev supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='mode'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>subsystem</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='startupPolicy'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>default</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>mandatory</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>requisite</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>optional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='subsysType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pci</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>scsi</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='capsType'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='pciBackend'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </hostdev>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <rng supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtio-non-transitional</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>random</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>egd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>builtin</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <filesystem supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='driverType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>path</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>handle</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>virtiofs</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </filesystem>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <tpm supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tpm-tis</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tpm-crb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>emulator</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>external</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendVersion'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>2.0</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </tpm>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <redirdev supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='bus'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </redirdev>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <channel supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pty</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>unix</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </channel>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <crypto supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>qemu</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>builtin</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </crypto>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <interface supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='backendType'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>default</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>passt</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <panic supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>isa</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>hyperv</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </panic>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <console supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>null</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vc</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pty</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>dev</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>file</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pipe</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>stdio</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>udp</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tcp</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>unix</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>qemu-vdagent</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>dbus</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </console>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <features>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <gic supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <genid supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <backup supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <async-teardown supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <s390-pv supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <ps2 supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <tdx supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <sev supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <sgx supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <hyperv supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='features'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>relaxed</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vapic</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>spinlocks</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vpindex</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>runtime</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>synic</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>stimer</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>reset</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>vendor_id</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>frequencies</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>reenlightenment</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>tlbflush</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>ipi</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>avic</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>emsr_bitmap</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>xmm_input</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <defaults>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </defaults>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </hyperv>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <launchSecurity supported='no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </features>
Jan 21 23:37:11 compute-1 nova_compute[182713]: </domainCapabilities>
Jan 21 23:37:11 compute-1 nova_compute[182713]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:11 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.860 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 21 23:37:11 compute-1 nova_compute[182713]: <domainCapabilities>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <domain>kvm</domain>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <arch>x86_64</arch>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <vcpu max='240'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <iothreads supported='yes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <os supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <enum name='firmware'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <loader supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>rom</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>pflash</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='readonly'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>yes</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>no</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='secure'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>no</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </loader>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   </os>
Jan 21 23:37:11 compute-1 nova_compute[182713]:   <cpu>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>on</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>off</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <enum name='maximumMigratable'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>on</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <value>off</value>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <vendor>AMD</vendor>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='succor'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:11 compute-1 nova_compute[182713]:     <mode name='custom' supported='yes'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ddpd-u'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sha512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm3'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ddpd-u'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sha512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm3'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sm4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Denverton-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbpb'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amd-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='auto-ibrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='perfmon-v2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbpb'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='stibp-always-on'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='EPYC-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-128'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-256'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-128'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-256'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx10-512'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='prefetchiti'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Haswell-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='KnightsMill'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512er'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512pf'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512er'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512pf'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tbm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fma4'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tbm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xop'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='amx-tile'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-bf16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-fp16'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bitalg'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrc'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fzrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='la57'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='taa-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ifma'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cmpccxadd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fbsdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='fsrs'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ibrs-all'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='intel-psfd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='lam'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mcdt-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pbrsb-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='psdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='serialize'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vaes'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='hle'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='rtm'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512bw'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512cd'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512dq'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512f'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='avx512vl'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='invpcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pcid'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='pku'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='mpx'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='core-capability'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='split-lock-detect'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='cldemote'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='erms'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='gfni'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdir64b'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='movdiri'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='xsaves'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='athlon'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='athlon-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='core2duo'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='core2duo-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='coreduo'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='coreduo-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='n270'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:11 compute-1 nova_compute[182713]:       <blockers model='n270-v1'>
Jan 21 23:37:11 compute-1 nova_compute[182713]:         <feature name='ss'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <blockers model='phenom'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <blockers model='phenom-v1'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </blockers>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </mode>
Jan 21 23:37:12 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:37:12 compute-1 nova_compute[182713]:   <memoryBacking supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <enum name='sourceType'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <value>file</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <value>anonymous</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <value>memfd</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:   </memoryBacking>
Jan 21 23:37:12 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <disk supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='diskDevice'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>disk</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>cdrom</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>floppy</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>lun</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='bus'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>ide</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>fdc</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>scsi</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>sata</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>virtio-transitional</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>virtio-non-transitional</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <graphics supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>vnc</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>egl-headless</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>dbus</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </graphics>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <video supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='modelType'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>vga</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>cirrus</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>none</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>bochs</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>ramfb</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </video>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <hostdev supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='mode'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>subsystem</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='startupPolicy'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>default</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>mandatory</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>requisite</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>optional</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='subsysType'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>pci</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>scsi</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='capsType'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='pciBackend'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </hostdev>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <rng supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>virtio</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>virtio-transitional</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>virtio-non-transitional</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>random</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>egd</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>builtin</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <filesystem supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='driverType'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>path</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>handle</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>virtiofs</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </filesystem>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <tpm supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>tpm-tis</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>tpm-crb</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>emulator</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>external</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='backendVersion'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>2.0</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </tpm>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <redirdev supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='bus'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>usb</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </redirdev>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <channel supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>pty</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>unix</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </channel>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <crypto supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='model'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>qemu</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>builtin</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </crypto>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <interface supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='backendType'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>default</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>passt</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <panic supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='model'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>isa</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>hyperv</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </panic>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <console supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='type'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>null</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>vc</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>pty</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>dev</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>file</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>pipe</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>stdio</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>udp</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>tcp</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>unix</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>qemu-vdagent</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>dbus</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </console>
Jan 21 23:37:12 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:37:12 compute-1 nova_compute[182713]:   <features>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <gic supported='no'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <genid supported='yes'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <backup supported='yes'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <async-teardown supported='yes'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <s390-pv supported='no'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <ps2 supported='yes'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <tdx supported='no'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <sev supported='no'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <sgx supported='no'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <hyperv supported='yes'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <enum name='features'>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>relaxed</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>vapic</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>spinlocks</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>vpindex</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>runtime</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>synic</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>stimer</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>reset</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>vendor_id</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>frequencies</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>reenlightenment</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>tlbflush</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>ipi</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>avic</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>emsr_bitmap</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <value>xmm_input</value>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </enum>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       <defaults>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:12 compute-1 nova_compute[182713]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:12 compute-1 nova_compute[182713]:       </defaults>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     </hyperv>
Jan 21 23:37:12 compute-1 nova_compute[182713]:     <launchSecurity supported='no'/>
Jan 21 23:37:12 compute-1 nova_compute[182713]:   </features>
Jan 21 23:37:12 compute-1 nova_compute[182713]: </domainCapabilities>
Jan 21 23:37:12 compute-1 nova_compute[182713]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.935 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.936 182717 INFO nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Secure Boot support detected
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.938 182717 INFO nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.939 182717 INFO nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.949 182717 DEBUG nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] cpu compare xml: <cpu match="exact">
Jan 21 23:37:12 compute-1 nova_compute[182713]:   <model>Nehalem</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]: </cpu>
Jan 21 23:37:12 compute-1 nova_compute[182713]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.952 182717 DEBUG nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:11.986 182717 INFO nova.virt.node [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Determined node identity 39680711-70c9-4df1-ae59-25e54fac688d from /var/lib/nova/compute_id
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.025 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Verified node 39680711-70c9-4df1-ae59-25e54fac688d matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.061 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.199 182717 DEBUG oslo_concurrency.lockutils [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.200 182717 DEBUG oslo_concurrency.lockutils [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.200 182717 DEBUG oslo_concurrency.lockutils [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.201 182717 DEBUG nova.compute.resource_tracker [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.412 182717 WARNING nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.413 182717 DEBUG nova.compute.resource_tracker [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6151MB free_disk=73.58209991455078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.413 182717 DEBUG oslo_concurrency.lockutils [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.414 182717 DEBUG oslo_concurrency.lockutils [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.565 182717 DEBUG nova.compute.resource_tracker [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.566 182717 DEBUG nova.compute.resource_tracker [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.652 182717 DEBUG nova.scheduler.client.report [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.679 182717 DEBUG nova.scheduler.client.report [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.680 182717 DEBUG nova.compute.provider_tree [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.781 182717 DEBUG nova.scheduler.client.report [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.836 182717 DEBUG nova.scheduler.client.report [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.882 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 21 23:37:12 compute-1 nova_compute[182713]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.882 182717 INFO nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] kernel doesn't support AMD SEV
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.884 182717 DEBUG nova.compute.provider_tree [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.884 182717 DEBUG nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.887 182717 DEBUG nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Libvirt baseline CPU <cpu>
Jan 21 23:37:12 compute-1 nova_compute[182713]:   <arch>x86_64</arch>
Jan 21 23:37:12 compute-1 nova_compute[182713]:   <model>Nehalem</model>
Jan 21 23:37:12 compute-1 nova_compute[182713]:   <vendor>AMD</vendor>
Jan 21 23:37:12 compute-1 nova_compute[182713]:   <topology sockets="8" cores="1" threads="1"/>
Jan 21 23:37:12 compute-1 nova_compute[182713]: </cpu>
Jan 21 23:37:12 compute-1 nova_compute[182713]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Jan 21 23:37:12 compute-1 nova_compute[182713]: 2026-01-21 23:37:12.917 182717 DEBUG nova.scheduler.client.report [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:37:13 compute-1 nova_compute[182713]: 2026-01-21 23:37:13.104 182717 DEBUG nova.compute.provider_tree [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Updating resource provider 39680711-70c9-4df1-ae59-25e54fac688d generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 23:37:13 compute-1 nova_compute[182713]: 2026-01-21 23:37:13.134 182717 DEBUG nova.compute.resource_tracker [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:37:13 compute-1 nova_compute[182713]: 2026-01-21 23:37:13.134 182717 DEBUG oslo_concurrency.lockutils [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:37:13 compute-1 nova_compute[182713]: 2026-01-21 23:37:13.135 182717 DEBUG nova.service [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 21 23:37:13 compute-1 nova_compute[182713]: 2026-01-21 23:37:13.228 182717 DEBUG nova.service [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 21 23:37:13 compute-1 nova_compute[182713]: 2026-01-21 23:37:13.229 182717 DEBUG nova.servicegroup.drivers.db [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 21 23:37:17 compute-1 sshd-session[183024]: Accepted publickey for zuul from 192.168.122.30 port 39646 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:37:17 compute-1 systemd-logind[796]: New session 26 of user zuul.
Jan 21 23:37:17 compute-1 systemd[1]: Started Session 26 of User zuul.
Jan 21 23:37:17 compute-1 sshd-session[183024]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:37:18 compute-1 python3.9[183177]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:37:20 compute-1 sudo[183331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rozdnxpwrxaravjbkhvmwzwrrmppucta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038639.290561-69-159324822710452/AnsiballZ_systemd_service.py'
Jan 21 23:37:20 compute-1 sudo[183331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:20 compute-1 python3.9[183333]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:37:20 compute-1 systemd[1]: Reloading.
Jan 21 23:37:20 compute-1 systemd-sysv-generator[183364]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:37:20 compute-1 systemd-rc-local-generator[183360]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:37:20 compute-1 sudo[183331]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:21 compute-1 python3.9[183518]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:37:21 compute-1 network[183535]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:37:21 compute-1 network[183536]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:37:21 compute-1 network[183537]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:37:27 compute-1 sudo[183807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clxufsxwixjiiexcfqosjjnaatvtkjpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038647.3472984-126-276833459923169/AnsiballZ_systemd_service.py'
Jan 21 23:37:27 compute-1 sudo[183807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:28 compute-1 python3.9[183809]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:37:28 compute-1 sudo[183807]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:29 compute-1 sudo[183960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flxmzhccgazkxwidlyneevtwczptkuzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038648.8115373-157-238799506516325/AnsiballZ_file.py'
Jan 21 23:37:29 compute-1 sudo[183960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:29 compute-1 python3.9[183962]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:29 compute-1 sudo[183960]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:29 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:37:29 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:37:30 compute-1 sudo[184113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfkwwyfxvklduggraiygzaadhwjpimqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038649.7762287-180-27853388848052/AnsiballZ_file.py'
Jan 21 23:37:30 compute-1 sudo[184113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:30 compute-1 python3.9[184115]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:30 compute-1 sudo[184113]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:30 compute-1 podman[184116]: 2026-01-21 23:37:30.638568302 +0000 UTC m=+0.114966555 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:37:31 compute-1 sudo[184291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwwtjbbswvznzulsnsqlmzuwwlakobsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038650.8506823-207-31015378357939/AnsiballZ_command.py'
Jan 21 23:37:31 compute-1 sudo[184291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:31 compute-1 python3.9[184293]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:37:31 compute-1 sudo[184291]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:32 compute-1 python3.9[184445]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:37:33 compute-1 sudo[184595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxjxcagneidgijtvuhvygtpoucjjdmmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038652.9850035-261-25060955923114/AnsiballZ_systemd_service.py'
Jan 21 23:37:33 compute-1 sudo[184595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:33 compute-1 python3.9[184597]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:37:33 compute-1 systemd[1]: Reloading.
Jan 21 23:37:33 compute-1 systemd-sysv-generator[184627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:37:33 compute-1 systemd-rc-local-generator[184624]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:37:34 compute-1 sudo[184595]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:34 compute-1 sudo[184782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcsgvrvlcywdkobivhytnupcniumromq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038654.3023875-285-21260751047707/AnsiballZ_command.py'
Jan 21 23:37:34 compute-1 sudo[184782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:34 compute-1 python3.9[184784]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:37:34 compute-1 sudo[184782]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:35 compute-1 sudo[184935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rciaekzbadbgeynaqlhjmivvpmmjlqae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038655.2605262-312-184061564799369/AnsiballZ_file.py'
Jan 21 23:37:35 compute-1 sudo[184935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:35 compute-1 python3.9[184937]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:37:35 compute-1 sudo[184935]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:36 compute-1 python3.9[185087]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:37 compute-1 sudo[185239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmbnznihjcfmwqdokepborosuryrwike ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038657.0514524-360-133554352973756/AnsiballZ_group.py'
Jan 21 23:37:37 compute-1 sudo[185239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:37 compute-1 python3.9[185241]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 21 23:37:37 compute-1 sudo[185239]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:38 compute-1 sudo[185391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wodbkowyuahcajoruxfqidgftxsqitvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038658.3494508-393-193128722894710/AnsiballZ_getent.py'
Jan 21 23:37:38 compute-1 sudo[185391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:39 compute-1 python3.9[185393]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 21 23:37:39 compute-1 sudo[185391]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:39 compute-1 sudo[185544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zppqmsbkzmzqikwdwwwquwgkeicgzhkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038659.3363059-417-23536035620059/AnsiballZ_group.py'
Jan 21 23:37:39 compute-1 sudo[185544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:39 compute-1 python3.9[185546]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:37:39 compute-1 groupadd[185547]: group added to /etc/group: name=ceilometer, GID=42405
Jan 21 23:37:39 compute-1 groupadd[185547]: group added to /etc/gshadow: name=ceilometer
Jan 21 23:37:39 compute-1 groupadd[185547]: new group: name=ceilometer, GID=42405
Jan 21 23:37:40 compute-1 sudo[185544]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:40 compute-1 sudo[185702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnpbhhdrlkfsdxntwleloxhnqzpoorab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038660.2410383-441-103264094313354/AnsiballZ_user.py'
Jan 21 23:37:40 compute-1 sudo[185702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:40 compute-1 python3.9[185704]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 23:37:40 compute-1 useradd[185707]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 23:37:41 compute-1 useradd[185707]: add 'ceilometer' to group 'libvirt'
Jan 21 23:37:41 compute-1 useradd[185707]: add 'ceilometer' to shadow group 'libvirt'
Jan 21 23:37:41 compute-1 podman[185706]: 2026-01-21 23:37:41.047350615 +0000 UTC m=+0.073163600 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:37:41 compute-1 sudo[185702]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:42 compute-1 python3.9[185879]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:43 compute-1 python3.9[186000]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769038662.1082428-519-227761612784776/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:43 compute-1 python3.9[186150]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:44 compute-1 python3.9[186271]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769038663.4008648-519-144067939067262/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:45 compute-1 python3.9[186421]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:45 compute-1 python3.9[186542]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769038664.6769035-519-11988186891411/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:46 compute-1 python3.9[186692]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:47 compute-1 python3.9[186844]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:48 compute-1 python3.9[186996]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:48 compute-1 python3.9[187117]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038667.6852834-696-63266859890851/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:37:49 compute-1 python3.9[187267]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:49 compute-1 python3.9[187388]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038668.863806-696-221044296492825/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:37:50 compute-1 python3.9[187538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:51 compute-1 python3.9[187659]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038670.3637443-783-23282265354477/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:37:52 compute-1 python3.9[187809]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:52 compute-1 python3.9[187930]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038671.8868399-831-155926221579463/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:53 compute-1 python3.9[188080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:54 compute-1 python3.9[188201]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038673.1597939-876-164673348784290/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:55 compute-1 python3.9[188351]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:55 compute-1 python3.9[188472]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038674.6697595-922-177965342946829/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:56 compute-1 sudo[188622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orhutdjznvbnznlfjtqbfuuxmcppofnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038676.062234-966-195547149789001/AnsiballZ_file.py'
Jan 21 23:37:56 compute-1 sudo[188622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:56 compute-1 python3.9[188624]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:56 compute-1 sudo[188622]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:57 compute-1 sudo[188774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtjiqgtsffltporynvbqyjoevhdilyte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038676.8732247-990-123706801300902/AnsiballZ_file.py'
Jan 21 23:37:57 compute-1 sudo[188774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:57 compute-1 python3.9[188776]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:57 compute-1 sudo[188774]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:58 compute-1 python3.9[188926]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:59 compute-1 python3.9[189078]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:59 compute-1 python3.9[189230]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:00 compute-1 sudo[189382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upyptwnyhvkynfctbtpyyovvnagrquyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038680.19728-1086-165817184062416/AnsiballZ_file.py'
Jan 21 23:38:00 compute-1 sudo[189382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:00 compute-1 python3.9[189384]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:00 compute-1 sudo[189382]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:00 compute-1 podman[189385]: 2026-01-21 23:38:00.846951197 +0000 UTC m=+0.088974241 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:38:01 compute-1 sudo[189558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuzkwktdwrgtxtvqlgmhsnnuwhxnxbvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038681.0491114-1111-164192035530026/AnsiballZ_systemd_service.py'
Jan 21 23:38:01 compute-1 sudo[189558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:01 compute-1 python3.9[189560]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:38:01 compute-1 systemd[1]: Reloading.
Jan 21 23:38:01 compute-1 systemd-rc-local-generator[189582]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:38:01 compute-1 systemd-sysv-generator[189588]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:38:02 compute-1 systemd[1]: Listening on Podman API Socket.
Jan 21 23:38:02 compute-1 sudo[189558]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:02 compute-1 sudo[189749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxfzroovvnwqbqpywmknwdmaxqcdhtxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038682.6758525-1137-164754139745362/AnsiballZ_stat.py'
Jan 21 23:38:02 compute-1 sudo[189749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:38:02.983 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:38:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:38:02.985 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:38:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:38:02.985 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:38:03 compute-1 python3.9[189751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:03 compute-1 sudo[189749]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:03 compute-1 sudo[189872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxvbprdiolfnlnjjgeserpnyjqdkqeas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038682.6758525-1137-164754139745362/AnsiballZ_copy.py'
Jan 21 23:38:03 compute-1 sudo[189872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:03 compute-1 python3.9[189874]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038682.6758525-1137-164754139745362/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:03 compute-1 sudo[189872]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:04 compute-1 nova_compute[182713]: 2026-01-21 23:38:04.231 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:04 compute-1 nova_compute[182713]: 2026-01-21 23:38:04.264 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:04 compute-1 sudo[189948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whmtgiavoimqdafpoevdvakwveeefzqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038682.6758525-1137-164754139745362/AnsiballZ_stat.py'
Jan 21 23:38:04 compute-1 sudo[189948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:04 compute-1 python3.9[189950]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:04 compute-1 sudo[189948]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:04 compute-1 sudo[190071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmiracnqboktdnprncbrsnijnqlqvjkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038682.6758525-1137-164754139745362/AnsiballZ_copy.py'
Jan 21 23:38:04 compute-1 sudo[190071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:05 compute-1 python3.9[190073]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038682.6758525-1137-164754139745362/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:05 compute-1 sudo[190071]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:06 compute-1 sudo[190223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdsvyrxjjsowjpebbbsmvglkufqaixlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038686.3960297-1233-255779786357880/AnsiballZ_file.py'
Jan 21 23:38:06 compute-1 sudo[190223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:06 compute-1 python3.9[190225]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:06 compute-1 sudo[190223]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:07 compute-1 sudo[190375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbobkgeckqciafznnhhutbrkcotfeyez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038687.2313058-1257-215042929230065/AnsiballZ_file.py'
Jan 21 23:38:07 compute-1 sudo[190375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:07 compute-1 python3.9[190377]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:07 compute-1 sudo[190375]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:08 compute-1 sudo[190527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnyztifntkdvmfqzirilbzremspqdfya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038688.0472245-1281-24166921601082/AnsiballZ_stat.py'
Jan 21 23:38:08 compute-1 sudo[190527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:08 compute-1 python3.9[190529]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:08 compute-1 sudo[190527]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:09 compute-1 sudo[190650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiilbjrtfowlkdstnkavqugwjnivnrwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038688.0472245-1281-24166921601082/AnsiballZ_copy.py'
Jan 21 23:38:09 compute-1 sudo[190650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:09 compute-1 python3.9[190652]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038688.0472245-1281-24166921601082/.source.json _original_basename=.j3vqr_wl follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:09 compute-1 sudo[190650]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:10 compute-1 python3.9[190802]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.859 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.860 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.860 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.861 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.880 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.880 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.881 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.881 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.882 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.882 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.883 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.883 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.883 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.917 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.918 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.918 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:38:10 compute-1 nova_compute[182713]: 2026-01-21 23:38:10.919 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:38:11 compute-1 nova_compute[182713]: 2026-01-21 23:38:11.103 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:38:11 compute-1 nova_compute[182713]: 2026-01-21 23:38:11.105 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6118MB free_disk=73.58723449707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:38:11 compute-1 nova_compute[182713]: 2026-01-21 23:38:11.105 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:38:11 compute-1 nova_compute[182713]: 2026-01-21 23:38:11.105 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:38:11 compute-1 nova_compute[182713]: 2026-01-21 23:38:11.171 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:38:11 compute-1 nova_compute[182713]: 2026-01-21 23:38:11.172 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:38:11 compute-1 nova_compute[182713]: 2026-01-21 23:38:11.193 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:38:11 compute-1 nova_compute[182713]: 2026-01-21 23:38:11.209 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:38:11 compute-1 nova_compute[182713]: 2026-01-21 23:38:11.211 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:38:11 compute-1 nova_compute[182713]: 2026-01-21 23:38:11.211 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:38:11 compute-1 podman[191047]: 2026-01-21 23:38:11.584704637 +0000 UTC m=+0.088919696 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 23:38:12 compute-1 sudo[191242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryxycweacojeihxkpbwlgrqttmsnqoib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038692.3145986-1401-126495299518220/AnsiballZ_container_config_data.py'
Jan 21 23:38:12 compute-1 sudo[191242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:13 compute-1 python3.9[191244]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 21 23:38:13 compute-1 sudo[191242]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:14 compute-1 sudo[191395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptnoskgjiwgwsrprrgtlnagrqhpzghpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038693.585433-1434-114126262738925/AnsiballZ_container_config_hash.py'
Jan 21 23:38:14 compute-1 sudo[191395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:14 compute-1 python3.9[191397]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:38:14 compute-1 sudo[191395]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:15 compute-1 sudo[191547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrilqqshdcityarqjaulxfjlnynzzdct ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038694.8042247-1464-110032936357973/AnsiballZ_edpm_container_manage.py'
Jan 21 23:38:15 compute-1 sudo[191547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:15 compute-1 python3[191549]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:38:15 compute-1 podman[191579]: 2026-01-21 23:38:15.830422591 +0000 UTC m=+0.062266267 container create cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 21 23:38:15 compute-1 podman[191579]: 2026-01-21 23:38:15.80172742 +0000 UTC m=+0.033571136 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 21 23:38:15 compute-1 python3[191549]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Jan 21 23:38:16 compute-1 sudo[191547]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:16 compute-1 sudo[191767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qusndegvggnhdxnygtkvhgjdwpelwyaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038696.3577497-1488-180281135885466/AnsiballZ_stat.py'
Jan 21 23:38:16 compute-1 sudo[191767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:16 compute-1 python3.9[191769]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:16 compute-1 sudo[191767]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:17 compute-1 sudo[191921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbaiaacuwcikyaeeaqeeahognqcrmino ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038697.3290937-1515-262934619447571/AnsiballZ_file.py'
Jan 21 23:38:17 compute-1 sudo[191921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:17 compute-1 python3.9[191923]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:17 compute-1 sudo[191921]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:18 compute-1 sudo[191997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uswufgcghmrjdokerafccqhayhqlbvhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038697.3290937-1515-262934619447571/AnsiballZ_stat.py'
Jan 21 23:38:18 compute-1 sudo[191997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:18 compute-1 python3.9[191999]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:18 compute-1 sudo[191997]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:18 compute-1 sudo[192148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpeyjljmxamdiigejhdmfqikewhdhdyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038698.4835415-1515-215505071825060/AnsiballZ_copy.py'
Jan 21 23:38:18 compute-1 sudo[192148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:19 compute-1 python3.9[192150]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038698.4835415-1515-215505071825060/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:19 compute-1 sudo[192148]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:19 compute-1 sudo[192224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ednlwovjsqovuytvxajwjfupbexfgpwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038698.4835415-1515-215505071825060/AnsiballZ_systemd.py'
Jan 21 23:38:19 compute-1 sudo[192224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:20 compute-1 python3.9[192226]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:38:20 compute-1 systemd[1]: Reloading.
Jan 21 23:38:20 compute-1 systemd-rc-local-generator[192252]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:38:20 compute-1 systemd-sysv-generator[192256]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:38:20 compute-1 sudo[192224]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:20 compute-1 sudo[192335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkjpkdawqgmkpmmrxhnjqubruqbdywhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038698.4835415-1515-215505071825060/AnsiballZ_systemd.py'
Jan 21 23:38:20 compute-1 sudo[192335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:21 compute-1 python3.9[192337]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:38:21 compute-1 systemd[1]: Reloading.
Jan 21 23:38:21 compute-1 systemd-rc-local-generator[192368]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:38:21 compute-1 systemd-sysv-generator[192371]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:38:21 compute-1 systemd[1]: Starting ceilometer_agent_compute container...
Jan 21 23:38:21 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:38:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e29057651dd678dccadfb30f37038a10bbada1aae5b05cc4e5ba068ac3600ac3/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e29057651dd678dccadfb30f37038a10bbada1aae5b05cc4e5ba068ac3600ac3/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e29057651dd678dccadfb30f37038a10bbada1aae5b05cc4e5ba068ac3600ac3/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e29057651dd678dccadfb30f37038a10bbada1aae5b05cc4e5ba068ac3600ac3/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:21 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44.
Jan 21 23:38:21 compute-1 podman[192378]: 2026-01-21 23:38:21.620526475 +0000 UTC m=+0.180341819 container init cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: + sudo -E kolla_set_configs
Jan 21 23:38:21 compute-1 podman[192378]: 2026-01-21 23:38:21.662633149 +0000 UTC m=+0.222448483 container start cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 23:38:21 compute-1 podman[192378]: ceilometer_agent_compute
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: sudo: unable to send audit message: Operation not permitted
Jan 21 23:38:21 compute-1 sudo[192400]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Jan 21 23:38:21 compute-1 sudo[192400]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 21 23:38:21 compute-1 sudo[192400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 21 23:38:21 compute-1 systemd[1]: Started ceilometer_agent_compute container.
Jan 21 23:38:21 compute-1 sudo[192335]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Validating config file
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Copying service configuration files
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 21 23:38:21 compute-1 podman[192401]: 2026-01-21 23:38:21.74980195 +0000 UTC m=+0.067662079 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: INFO:__main__:Writing out command to execute
Jan 21 23:38:21 compute-1 systemd[1]: cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44-17808f7e975261c0.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 23:38:21 compute-1 systemd[1]: cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44-17808f7e975261c0.service: Failed with result 'exit-code'.
Jan 21 23:38:21 compute-1 sudo[192400]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: ++ cat /run_command
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: + ARGS=
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: + sudo kolla_copy_cacerts
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: sudo: unable to send audit message: Operation not permitted
Jan 21 23:38:21 compute-1 sudo[192440]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Jan 21 23:38:21 compute-1 sudo[192440]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 21 23:38:21 compute-1 sudo[192440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 21 23:38:21 compute-1 sudo[192440]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: + [[ ! -n '' ]]
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: + . kolla_extend_start
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: + umask 0022
Jan 21 23:38:21 compute-1 ceilometer_agent_compute[192394]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.574 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.574 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.574 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.574 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.575 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.575 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.575 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.575 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.575 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.575 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.575 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.575 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.575 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.576 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.576 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.576 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.576 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.576 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.576 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.576 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.576 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.576 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.577 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.577 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.577 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.577 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.577 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.577 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.577 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.577 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.577 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.578 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.578 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.578 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.578 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.578 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.578 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.578 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.578 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.578 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.578 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.579 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.579 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.579 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.579 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.579 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.579 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.579 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.579 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.579 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.579 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.579 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.580 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.580 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.580 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.580 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.580 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.580 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.580 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.580 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.580 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.580 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.581 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.581 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.581 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.581 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.581 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.581 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.581 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.581 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.581 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.581 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.584 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.584 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.584 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.585 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.585 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.585 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.585 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.585 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.585 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.585 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.585 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.585 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.585 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.585 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.586 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.586 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.586 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.586 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.586 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.588 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.588 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.588 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.589 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.589 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.589 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.590 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.590 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.590 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.590 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.590 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.590 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.611 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.614 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.615 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.739 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.824 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.824 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.824 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.824 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.825 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.825 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.825 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.825 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.825 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.825 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.825 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.825 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.825 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.825 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.826 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.826 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.826 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.826 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.826 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.826 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.826 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.826 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.827 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.827 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.827 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.827 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.827 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.827 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.827 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.827 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.827 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.828 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.828 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.828 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.828 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.828 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.828 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.828 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.828 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.828 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.829 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.829 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.829 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.829 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.829 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.829 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.829 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.829 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.829 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.829 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.830 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.831 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.831 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.831 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.831 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.831 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.831 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.831 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.831 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.831 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.831 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.832 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.832 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.832 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.832 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.832 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.832 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.832 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.832 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.832 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.832 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.832 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.834 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.834 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.834 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.834 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.834 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.835 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.835 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.835 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.835 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.835 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.835 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.835 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.835 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.835 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.836 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.836 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.836 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.836 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.836 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.836 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.836 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.836 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.836 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.837 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.837 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.837 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.837 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.837 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.837 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.837 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.837 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.837 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.838 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.838 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.838 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.838 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.838 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.838 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.838 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.838 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.841 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.841 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.841 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.841 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.841 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.841 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.841 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.841 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.842 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.842 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.842 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.842 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.842 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.842 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.843 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.843 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.843 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.843 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.847 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.847 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.847 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.847 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.847 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.849 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.855 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:38:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:22 compute-1 python3.9[192578]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:38:23 compute-1 sudo[192731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucbonfxqrmrlukrmjvqdorlohswghayl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038703.7355232-1650-25674614783875/AnsiballZ_stat.py'
Jan 21 23:38:23 compute-1 sudo[192731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:24 compute-1 python3.9[192733]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:24 compute-1 sudo[192731]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:24 compute-1 sudo[192856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrisjysyhalicdwtuzcbcgrxvifzsykk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038703.7355232-1650-25674614783875/AnsiballZ_copy.py'
Jan 21 23:38:24 compute-1 sudo[192856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:24 compute-1 python3.9[192858]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038703.7355232-1650-25674614783875/.source.yaml _original_basename=.9bodn1zk follow=False checksum=9afa6966295a519b2180701325b87786c9fac371 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:24 compute-1 sudo[192856]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:25 compute-1 sudo[193008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wthnxiwcphqubruumbhlqwlzofegfplk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038705.358657-1695-168986506721369/AnsiballZ_stat.py'
Jan 21 23:38:25 compute-1 sudo[193008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:25 compute-1 python3.9[193010]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:25 compute-1 sudo[193008]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:26 compute-1 sudo[193131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bepwddnkdgxhrckdqzwzbkpunsafnqty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038705.358657-1695-168986506721369/AnsiballZ_copy.py'
Jan 21 23:38:26 compute-1 sudo[193131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:26 compute-1 python3.9[193133]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038705.358657-1695-168986506721369/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:26 compute-1 sudo[193131]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:27 compute-1 sudo[193283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdgoajlfqujnrouscktxlbaoxcsjughs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038707.4064765-1758-232571774478354/AnsiballZ_file.py'
Jan 21 23:38:27 compute-1 sudo[193283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:27 compute-1 python3.9[193285]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:27 compute-1 sudo[193283]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:28 compute-1 sudo[193435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aagtqcncolrwhtyvzdvrxmkumvphzzlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038708.2410946-1782-57331199783317/AnsiballZ_file.py'
Jan 21 23:38:28 compute-1 sudo[193435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:28 compute-1 python3.9[193437]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:28 compute-1 sudo[193435]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:29 compute-1 sudo[193587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvhlgamrqybkjrjcbjgevtdwhvprphvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038709.112052-1806-125288502993076/AnsiballZ_stat.py'
Jan 21 23:38:29 compute-1 sudo[193587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:29 compute-1 python3.9[193589]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:29 compute-1 sudo[193587]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:29 compute-1 sudo[193665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uufxezbpmmxnpvxbphitdolxafdiujkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038709.112052-1806-125288502993076/AnsiballZ_file.py'
Jan 21 23:38:29 compute-1 sudo[193665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:30 compute-1 python3.9[193667]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.sf6wrnyy recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:30 compute-1 sudo[193665]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:31 compute-1 podman[193818]: 2026-01-21 23:38:31.827756284 +0000 UTC m=+0.108101899 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 21 23:38:31 compute-1 python3.9[193817]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:34 compute-1 sudo[194264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awvasoheaibpimmhbofjqbsiojcrqhix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038713.785552-1917-48734451373130/AnsiballZ_container_config_data.py'
Jan 21 23:38:34 compute-1 sudo[194264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:34 compute-1 python3.9[194266]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 21 23:38:34 compute-1 sudo[194264]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:35 compute-1 sudo[194416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bikxxmevipgyrizykpxmfdfelifdfmmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038714.9371583-1950-104787657022995/AnsiballZ_container_config_hash.py'
Jan 21 23:38:35 compute-1 sudo[194416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:35 compute-1 python3.9[194418]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:38:35 compute-1 sudo[194416]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:36 compute-1 sudo[194568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zohvrlobwdppfexqywktplufvrzxqiyo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038715.9827185-1980-238780566843063/AnsiballZ_edpm_container_manage.py'
Jan 21 23:38:36 compute-1 sudo[194568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:36 compute-1 python3[194570]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:38:36 compute-1 podman[194606]: 2026-01-21 23:38:36.915608072 +0000 UTC m=+0.060781621 container create e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible)
Jan 21 23:38:36 compute-1 podman[194606]: 2026-01-21 23:38:36.879486177 +0000 UTC m=+0.024659806 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 21 23:38:36 compute-1 python3[194570]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 21 23:38:37 compute-1 sudo[194568]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:37 compute-1 sudo[194794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrlkjdnvzroltqafmfzplkvzepqeisny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038717.320501-2004-35592859621681/AnsiballZ_stat.py'
Jan 21 23:38:37 compute-1 sudo[194794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:37 compute-1 python3.9[194796]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:37 compute-1 sudo[194794]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:38 compute-1 sudo[194948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjrqtbghchmgfdvdephihlhdtfvaitju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038718.1648269-2031-130751523864087/AnsiballZ_file.py'
Jan 21 23:38:38 compute-1 sudo[194948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:38 compute-1 python3.9[194950]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:38 compute-1 sudo[194948]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:38 compute-1 sudo[195024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exnutcuxchkxoolzmlbpiahqrxsyavmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038718.1648269-2031-130751523864087/AnsiballZ_stat.py'
Jan 21 23:38:38 compute-1 sudo[195024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:39 compute-1 python3.9[195026]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:39 compute-1 sudo[195024]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:39 compute-1 sudo[195175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkneqcjetmfpfjcjkoamvmklfanxeqbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038719.2541203-2031-156740638234496/AnsiballZ_copy.py'
Jan 21 23:38:39 compute-1 sudo[195175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:39 compute-1 python3.9[195177]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038719.2541203-2031-156740638234496/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:39 compute-1 sudo[195175]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:40 compute-1 sudo[195251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlszslyjhchliklyazjhlscennmuylmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038719.2541203-2031-156740638234496/AnsiballZ_systemd.py'
Jan 21 23:38:40 compute-1 sudo[195251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:40 compute-1 python3.9[195253]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:38:40 compute-1 systemd[1]: Reloading.
Jan 21 23:38:40 compute-1 systemd-sysv-generator[195279]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:38:40 compute-1 systemd-rc-local-generator[195275]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:38:40 compute-1 sudo[195251]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:41 compute-1 sudo[195362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coqvpegvsxqzmeooxfbmzheepawetzch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038719.2541203-2031-156740638234496/AnsiballZ_systemd.py'
Jan 21 23:38:41 compute-1 sudo[195362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:41 compute-1 python3.9[195364]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:38:41 compute-1 systemd[1]: Reloading.
Jan 21 23:38:41 compute-1 systemd-rc-local-generator[195392]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:38:41 compute-1 systemd-sysv-generator[195400]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:38:41 compute-1 systemd[1]: Starting node_exporter container...
Jan 21 23:38:41 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:38:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d0fe1b63cf52f3a2899b0a9dbf437fcdceaf672d046eaa843bd04ba2e5d3ded/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d0fe1b63cf52f3a2899b0a9dbf437fcdceaf672d046eaa843bd04ba2e5d3ded/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:41 compute-1 podman[195403]: 2026-01-21 23:38:41.988521161 +0000 UTC m=+0.123118291 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true)
Jan 21 23:38:42 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d.
Jan 21 23:38:42 compute-1 podman[195405]: 2026-01-21 23:38:42.042719444 +0000 UTC m=+0.175608801 container init e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.066Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.066Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.066Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.068Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.068Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=node_exporter.go:117 level=info collector=arp
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=node_exporter.go:117 level=info collector=bcache
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=node_exporter.go:117 level=info collector=bonding
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=node_exporter.go:117 level=info collector=cpu
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.069Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=edac
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=filefd
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=netclass
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=netdev
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=netstat
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=nfs
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=nvme
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=softnet
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=systemd
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=xfs
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.070Z caller=node_exporter.go:117 level=info collector=zfs
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.072Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 21 23:38:42 compute-1 node_exporter[195435]: ts=2026-01-21T23:38:42.073Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 21 23:38:42 compute-1 podman[195405]: 2026-01-21 23:38:42.075751683 +0000 UTC m=+0.208641040 container start e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 23:38:42 compute-1 podman[195405]: node_exporter
Jan 21 23:38:42 compute-1 systemd[1]: Started node_exporter container.
Jan 21 23:38:42 compute-1 sudo[195362]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:42 compute-1 podman[195446]: 2026-01-21 23:38:42.18318095 +0000 UTC m=+0.087038337 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:38:43 compute-1 python3.9[195619]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:38:44 compute-1 sudo[195769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mphlokodndnqrkcjrtrlcqajqzucztnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038724.0196524-2166-105271529687611/AnsiballZ_stat.py'
Jan 21 23:38:44 compute-1 sudo[195769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:44 compute-1 python3.9[195771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:44 compute-1 sudo[195769]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:45 compute-1 sudo[195894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnceysupkqpllyzqetjwsdibcyxmtmyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038724.0196524-2166-105271529687611/AnsiballZ_copy.py'
Jan 21 23:38:45 compute-1 sudo[195894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:45 compute-1 python3.9[195896]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038724.0196524-2166-105271529687611/.source.yaml _original_basename=.7cl9d9l9 follow=False checksum=f7f2a63c4b6d9ab32b6599b5ceeeebe015d9558b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:45 compute-1 sudo[195894]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:45 compute-1 sudo[196046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvkxysghithwbgyosejtiqlwooxootkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038725.4792125-2211-77584479341311/AnsiballZ_stat.py'
Jan 21 23:38:45 compute-1 sudo[196046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:46 compute-1 python3.9[196048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:46 compute-1 sudo[196046]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:46 compute-1 sudo[196169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzrgpjlmjpioxnuvddtoegfqlvrwiypq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038725.4792125-2211-77584479341311/AnsiballZ_copy.py'
Jan 21 23:38:46 compute-1 sudo[196169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:46 compute-1 python3.9[196171]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038725.4792125-2211-77584479341311/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:46 compute-1 sudo[196169]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:47 compute-1 sudo[196321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syfdfapsxnxqmauutvcbegxkudrkyozx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038727.5842407-2274-270967637549194/AnsiballZ_file.py'
Jan 21 23:38:47 compute-1 sudo[196321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:48 compute-1 python3.9[196323]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:48 compute-1 sudo[196321]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:48 compute-1 sudo[196473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzweupgbgcqowfrbbqdlfyhxudgrsgzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038728.3628967-2298-128769260030585/AnsiballZ_file.py'
Jan 21 23:38:48 compute-1 sudo[196473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:48 compute-1 python3.9[196475]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:48 compute-1 sudo[196473]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:49 compute-1 sudo[196625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kszwrquwjgtzfvvwtnrnkocpgebvceco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038729.115379-2322-162131946805191/AnsiballZ_stat.py'
Jan 21 23:38:49 compute-1 sudo[196625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:49 compute-1 python3.9[196627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:49 compute-1 sudo[196625]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:49 compute-1 sudo[196703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoqtvbrakocffqhvrsezcayyojbrmdfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038729.115379-2322-162131946805191/AnsiballZ_file.py'
Jan 21 23:38:49 compute-1 sudo[196703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:50 compute-1 python3.9[196705]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.9j2i9kgz recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:50 compute-1 sudo[196703]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:50 compute-1 python3.9[196855]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:52 compute-1 podman[197100]: 2026-01-21 23:38:52.259800232 +0000 UTC m=+0.064730236 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:38:52 compute-1 systemd[1]: cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44-17808f7e975261c0.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 23:38:52 compute-1 systemd[1]: cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44-17808f7e975261c0.service: Failed with result 'exit-code'.
Jan 21 23:38:53 compute-1 sudo[197295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efaowyjdgoaymgrraxhyangxflgaxzls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038732.9263055-2433-169972213752022/AnsiballZ_container_config_data.py'
Jan 21 23:38:53 compute-1 sudo[197295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:53 compute-1 python3.9[197297]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 21 23:38:53 compute-1 sudo[197295]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:54 compute-1 sudo[197447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clbcrnagvplwgbofvbjzhlsiatntwtzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038734.1208844-2466-47068804285195/AnsiballZ_container_config_hash.py'
Jan 21 23:38:54 compute-1 sudo[197447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:54 compute-1 python3.9[197449]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:38:54 compute-1 sudo[197447]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:55 compute-1 sudo[197599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkvvbhcdyfqsaydkzupokmgyhbmnpfiu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038735.2292128-2496-170403491979540/AnsiballZ_edpm_container_manage.py'
Jan 21 23:38:55 compute-1 sudo[197599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:55 compute-1 python3[197601]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:38:57 compute-1 podman[197614]: 2026-01-21 23:38:57.201094594 +0000 UTC m=+1.309035361 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 21 23:38:57 compute-1 podman[197707]: 2026-01-21 23:38:57.354490746 +0000 UTC m=+0.065092968 container create 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:38:57 compute-1 podman[197707]: 2026-01-21 23:38:57.319551238 +0000 UTC m=+0.030153520 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 21 23:38:57 compute-1 python3[197601]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 21 23:38:57 compute-1 sudo[197599]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:58 compute-1 sudo[197895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifyxvzuvvmcixudjincihfgqdzbpsqzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038737.9232469-2520-224116515630860/AnsiballZ_stat.py'
Jan 21 23:38:58 compute-1 sudo[197895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:58 compute-1 python3.9[197897]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:58 compute-1 sudo[197895]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:59 compute-1 sudo[198049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqfdqmqqcnzgttljgatxzuaifrhmpilu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038738.833691-2547-59537978327200/AnsiballZ_file.py'
Jan 21 23:38:59 compute-1 sudo[198049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:59 compute-1 python3.9[198051]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:59 compute-1 sudo[198049]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:59 compute-1 sudo[198125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvvsffcyutzqrykmlogvhxlkgrzjolef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038738.833691-2547-59537978327200/AnsiballZ_stat.py'
Jan 21 23:38:59 compute-1 sudo[198125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:59 compute-1 python3.9[198127]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:59 compute-1 sudo[198125]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:00 compute-1 sudo[198276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-behgsebfwfyszsaythzjvuxrlyiatzhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038739.8809047-2547-61882017660776/AnsiballZ_copy.py'
Jan 21 23:39:00 compute-1 sudo[198276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:00 compute-1 python3.9[198278]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038739.8809047-2547-61882017660776/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:00 compute-1 sudo[198276]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:00 compute-1 sudo[198352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnbaozpralaneykkmtkaoibxzfazbzxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038739.8809047-2547-61882017660776/AnsiballZ_systemd.py'
Jan 21 23:39:00 compute-1 sudo[198352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:01 compute-1 python3.9[198354]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:39:01 compute-1 systemd[1]: Reloading.
Jan 21 23:39:01 compute-1 systemd-rc-local-generator[198376]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:39:01 compute-1 systemd-sysv-generator[198383]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:39:01 compute-1 sudo[198352]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:01 compute-1 sudo[198463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftcrepnryzmhfedlcbauhbhhqbpghkaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038739.8809047-2547-61882017660776/AnsiballZ_systemd.py'
Jan 21 23:39:01 compute-1 sudo[198463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:02 compute-1 podman[198465]: 2026-01-21 23:39:02.03340255 +0000 UTC m=+0.139569729 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 21 23:39:02 compute-1 python3.9[198466]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:39:02 compute-1 systemd[1]: Reloading.
Jan 21 23:39:02 compute-1 systemd-rc-local-generator[198524]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:39:02 compute-1 systemd-sysv-generator[198527]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:39:02 compute-1 systemd[1]: Starting podman_exporter container...
Jan 21 23:39:02 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:39:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/727e7ed729a18e058fb5231506cb2618dfacb2ae55081fedf333b89a2ad67238/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 23:39:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/727e7ed729a18e058fb5231506cb2618dfacb2ae55081fedf333b89a2ad67238/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 23:39:02 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468.
Jan 21 23:39:02 compute-1 podman[198533]: 2026-01-21 23:39:02.857805724 +0000 UTC m=+0.206512802 container init 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:39:02 compute-1 podman[198533]: 2026-01-21 23:39:02.892716682 +0000 UTC m=+0.241423540 container start 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:39:02 compute-1 podman_exporter[198549]: ts=2026-01-21T23:39:02.895Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 21 23:39:02 compute-1 podman_exporter[198549]: ts=2026-01-21T23:39:02.895Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 21 23:39:02 compute-1 podman[198533]: podman_exporter
Jan 21 23:39:02 compute-1 podman_exporter[198549]: ts=2026-01-21T23:39:02.895Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 21 23:39:02 compute-1 podman_exporter[198549]: ts=2026-01-21T23:39:02.895Z caller=handler.go:105 level=info collector=container
Jan 21 23:39:02 compute-1 systemd[1]: Starting Podman API Service...
Jan 21 23:39:02 compute-1 systemd[1]: Started podman_exporter container.
Jan 21 23:39:02 compute-1 systemd[1]: Started Podman API Service.
Jan 21 23:39:02 compute-1 podman[198560]: time="2026-01-21T23:39:02Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 21 23:39:02 compute-1 podman[198560]: time="2026-01-21T23:39:02Z" level=info msg="Setting parallel job count to 25"
Jan 21 23:39:02 compute-1 podman[198560]: time="2026-01-21T23:39:02Z" level=info msg="Using sqlite as database backend"
Jan 21 23:39:02 compute-1 podman[198560]: time="2026-01-21T23:39:02Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 21 23:39:02 compute-1 podman[198560]: time="2026-01-21T23:39:02Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 21 23:39:02 compute-1 podman[198560]: time="2026-01-21T23:39:02Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 21 23:39:02 compute-1 sudo[198463]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:02 compute-1 podman[198560]: @ - - [21/Jan/2026:23:39:02 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 21 23:39:02 compute-1 podman[198560]: time="2026-01-21T23:39:02Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 23:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:39:02.984 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:39:02.985 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:39:02.985 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:39:02 compute-1 podman[198555]: 2026-01-21 23:39:02.994061988 +0000 UTC m=+0.079403897 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:39:03 compute-1 systemd[1]: 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468-5dd826a8aebf6253.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 23:39:03 compute-1 systemd[1]: 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468-5dd826a8aebf6253.service: Failed with result 'exit-code'.
Jan 21 23:39:03 compute-1 podman[198560]: @ - - [21/Jan/2026:23:39:02 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18075 "" "Go-http-client/1.1"
Jan 21 23:39:03 compute-1 podman_exporter[198549]: ts=2026-01-21T23:39:03.030Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 21 23:39:03 compute-1 podman_exporter[198549]: ts=2026-01-21T23:39:03.030Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 21 23:39:03 compute-1 podman_exporter[198549]: ts=2026-01-21T23:39:03.031Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 21 23:39:04 compute-1 python3.9[198747]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:39:05 compute-1 sudo[198897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpzqyuqswperhixbxyyiuibxdrcwntmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038745.1335716-2683-63874513205508/AnsiballZ_stat.py'
Jan 21 23:39:05 compute-1 sudo[198897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:05 compute-1 python3.9[198899]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:39:05 compute-1 sudo[198897]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:06 compute-1 sudo[199022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgilcygipcrfdtgsipumagyyymonavpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038745.1335716-2683-63874513205508/AnsiballZ_copy.py'
Jan 21 23:39:06 compute-1 sudo[199022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:06 compute-1 python3.9[199024]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038745.1335716-2683-63874513205508/.source.yaml _original_basename=.9glnglzi follow=False checksum=f3d2380b7b2b83f386aaaefa3dff58bab4ad332f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:06 compute-1 sudo[199022]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:06 compute-1 sudo[199174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecdwokeurchhdgoyouqxbhakpadwuhkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038746.607694-2727-66755570593823/AnsiballZ_stat.py'
Jan 21 23:39:06 compute-1 sudo[199174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:07 compute-1 python3.9[199176]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:39:07 compute-1 sudo[199174]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:07 compute-1 sudo[199297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkylqezgxyvrdsmewrzotnsrytlpvlia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038746.607694-2727-66755570593823/AnsiballZ_copy.py'
Jan 21 23:39:07 compute-1 sudo[199297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:07 compute-1 python3.9[199299]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038746.607694-2727-66755570593823/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:39:07 compute-1 sudo[199297]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:09 compute-1 sudo[199449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfopfopazfjuiamfqbxlmlluswhawlux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038748.8702939-2790-120739617432161/AnsiballZ_file.py'
Jan 21 23:39:09 compute-1 sudo[199449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:09 compute-1 python3.9[199451]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:09 compute-1 sudo[199449]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:09 compute-1 sudo[199601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmxilccrqrddvxtjxznejdwzfjvvijbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038749.646195-2814-10387464209070/AnsiballZ_file.py'
Jan 21 23:39:09 compute-1 sudo[199601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:10 compute-1 python3.9[199603]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:39:10 compute-1 sudo[199601]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:10 compute-1 sudo[199753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jacyniadrhnanwdaiqjoiypmgskpudqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038750.4756684-2838-149401036225357/AnsiballZ_stat.py'
Jan 21 23:39:10 compute-1 sudo[199753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:11 compute-1 python3.9[199755]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:39:11 compute-1 sudo[199753]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:11 compute-1 nova_compute[182713]: 2026-01-21 23:39:11.202 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:11 compute-1 nova_compute[182713]: 2026-01-21 23:39:11.204 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:11 compute-1 sudo[199831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udhhpmpsaifyjkyfrfuxqvcifgxzhurl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038750.4756684-2838-149401036225357/AnsiballZ_file.py'
Jan 21 23:39:11 compute-1 sudo[199831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:11 compute-1 python3.9[199833]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.1c0iupgp recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:11 compute-1 sudo[199831]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:11 compute-1 nova_compute[182713]: 2026-01-21 23:39:11.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:11 compute-1 nova_compute[182713]: 2026-01-21 23:39:11.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:11 compute-1 nova_compute[182713]: 2026-01-21 23:39:11.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:11 compute-1 nova_compute[182713]: 2026-01-21 23:39:11.905 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:39:11 compute-1 nova_compute[182713]: 2026-01-21 23:39:11.906 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:39:11 compute-1 nova_compute[182713]: 2026-01-21 23:39:11.906 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:39:11 compute-1 nova_compute[182713]: 2026-01-21 23:39:11.907 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:39:12 compute-1 podman[199957]: 2026-01-21 23:39:12.120494221 +0000 UTC m=+0.075081771 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 23:39:12 compute-1 nova_compute[182713]: 2026-01-21 23:39:12.164 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:39:12 compute-1 nova_compute[182713]: 2026-01-21 23:39:12.166 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5911MB free_disk=73.53401565551758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:39:12 compute-1 nova_compute[182713]: 2026-01-21 23:39:12.166 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:39:12 compute-1 nova_compute[182713]: 2026-01-21 23:39:12.166 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:39:12 compute-1 nova_compute[182713]: 2026-01-21 23:39:12.246 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:39:12 compute-1 nova_compute[182713]: 2026-01-21 23:39:12.247 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:39:12 compute-1 nova_compute[182713]: 2026-01-21 23:39:12.281 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:39:12 compute-1 nova_compute[182713]: 2026-01-21 23:39:12.296 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:39:12 compute-1 nova_compute[182713]: 2026-01-21 23:39:12.298 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:39:12 compute-1 nova_compute[182713]: 2026-01-21 23:39:12.298 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:39:12 compute-1 python3.9[200000]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:12 compute-1 podman[200005]: 2026-01-21 23:39:12.443704261 +0000 UTC m=+0.079564612 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:39:13 compute-1 nova_compute[182713]: 2026-01-21 23:39:13.297 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:13 compute-1 nova_compute[182713]: 2026-01-21 23:39:13.297 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:39:13 compute-1 nova_compute[182713]: 2026-01-21 23:39:13.298 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:39:13 compute-1 nova_compute[182713]: 2026-01-21 23:39:13.316 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:39:13 compute-1 nova_compute[182713]: 2026-01-21 23:39:13.317 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:13 compute-1 nova_compute[182713]: 2026-01-21 23:39:13.318 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:13 compute-1 nova_compute[182713]: 2026-01-21 23:39:13.318 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:13 compute-1 nova_compute[182713]: 2026-01-21 23:39:13.318 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:13 compute-1 nova_compute[182713]: 2026-01-21 23:39:13.319 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:39:14 compute-1 sudo[200449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eznwhnejiqetzomfowintziiaoyqjtzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038754.2164242-2949-157301411786979/AnsiballZ_container_config_data.py'
Jan 21 23:39:14 compute-1 sudo[200449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:14 compute-1 python3.9[200451]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 21 23:39:14 compute-1 sudo[200449]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:15 compute-1 sudo[200601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnxqsrzanxilupijghufhnoizjnolckg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038755.3150501-2982-55427314070340/AnsiballZ_container_config_hash.py'
Jan 21 23:39:15 compute-1 sudo[200601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:15 compute-1 python3.9[200603]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:39:15 compute-1 sudo[200601]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:16 compute-1 sudo[200753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whncvtyhajnlckrnzqjagczoccjdxxcn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038756.3675385-3012-142019511361056/AnsiballZ_edpm_container_manage.py'
Jan 21 23:39:16 compute-1 sudo[200753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:16 compute-1 python3[200755]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:39:19 compute-1 podman[200768]: 2026-01-21 23:39:19.80305457 +0000 UTC m=+2.801395864 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 23:39:20 compute-1 podman[200864]: 2026-01-21 23:39:20.032216905 +0000 UTC m=+0.117560061 container create dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, io.buildah.version=1.33.7, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 21 23:39:20 compute-1 podman[200864]: 2026-01-21 23:39:19.942407557 +0000 UTC m=+0.027750763 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 23:39:20 compute-1 python3[200755]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 23:39:20 compute-1 sudo[200753]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:21 compute-1 sudo[201052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cehgbztwcxrxdphhlmmgptrhxcvamgdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038760.7558696-3036-106404582347136/AnsiballZ_stat.py'
Jan 21 23:39:21 compute-1 sudo[201052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:21 compute-1 python3.9[201054]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:39:21 compute-1 sudo[201052]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:21 compute-1 sudo[201206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyfsxnusllbmindbzfgoicpvwgspsokm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038761.6291711-3063-468904881668/AnsiballZ_file.py'
Jan 21 23:39:21 compute-1 sudo[201206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:22 compute-1 python3.9[201208]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:22 compute-1 sudo[201206]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:22 compute-1 sudo[201294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvbhrdnbzuxsirlihexuxjnrdhbqvaci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038761.6291711-3063-468904881668/AnsiballZ_stat.py'
Jan 21 23:39:22 compute-1 sudo[201294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:22 compute-1 podman[201256]: 2026-01-21 23:39:22.419743638 +0000 UTC m=+0.076603840 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:39:22 compute-1 systemd[1]: cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44-17808f7e975261c0.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 23:39:22 compute-1 systemd[1]: cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44-17808f7e975261c0.service: Failed with result 'exit-code'.
Jan 21 23:39:22 compute-1 python3.9[201303]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:39:22 compute-1 sudo[201294]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:23 compute-1 sudo[201452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egvscrfmqjfgfqgwsrdgmeufjggrxwkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038762.6693146-3063-77697584314641/AnsiballZ_copy.py'
Jan 21 23:39:23 compute-1 sudo[201452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:23 compute-1 python3.9[201454]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038762.6693146-3063-77697584314641/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:23 compute-1 sudo[201452]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:23 compute-1 sudo[201528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oodgjpibvvgrkzwwvjhiqvmnekspdbla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038762.6693146-3063-77697584314641/AnsiballZ_systemd.py'
Jan 21 23:39:23 compute-1 sudo[201528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:23 compute-1 python3.9[201530]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:39:23 compute-1 systemd[1]: Reloading.
Jan 21 23:39:23 compute-1 systemd-rc-local-generator[201551]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:39:23 compute-1 systemd-sysv-generator[201555]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:39:24 compute-1 sudo[201528]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:24 compute-1 sudo[201638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfdezuvlvkzwmqtyjhiuydbaloihpezb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038762.6693146-3063-77697584314641/AnsiballZ_systemd.py'
Jan 21 23:39:24 compute-1 sudo[201638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:24 compute-1 python3.9[201640]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:39:24 compute-1 systemd[1]: Reloading.
Jan 21 23:39:24 compute-1 systemd-rc-local-generator[201670]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:39:24 compute-1 systemd-sysv-generator[201673]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:39:25 compute-1 systemd[1]: Starting openstack_network_exporter container...
Jan 21 23:39:25 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:39:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4cfb526c82e1c8e3d8d080c2c3d3a07ddd9dd0b7624682b148694044cc92501/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 21 23:39:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4cfb526c82e1c8e3d8d080c2c3d3a07ddd9dd0b7624682b148694044cc92501/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 23:39:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4cfb526c82e1c8e3d8d080c2c3d3a07ddd9dd0b7624682b148694044cc92501/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 23:39:25 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9.
Jan 21 23:39:25 compute-1 podman[201680]: 2026-01-21 23:39:25.58827866 +0000 UTC m=+0.412815479 container init dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Jan 21 23:39:25 compute-1 podman[201680]: 2026-01-21 23:39:25.630010845 +0000 UTC m=+0.454547574 container start dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: INFO    23:39:25 main.go:48: registering *bridge.Collector
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: INFO    23:39:25 main.go:48: registering *coverage.Collector
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: INFO    23:39:25 main.go:48: registering *datapath.Collector
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: INFO    23:39:25 main.go:48: registering *iface.Collector
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: INFO    23:39:25 main.go:48: registering *memory.Collector
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: INFO    23:39:25 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: INFO    23:39:25 main.go:48: registering *ovn.Collector
Jan 21 23:39:25 compute-1 podman[201680]: openstack_network_exporter
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: INFO    23:39:25 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: INFO    23:39:25 main.go:48: registering *pmd_perf.Collector
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: INFO    23:39:25 main.go:48: registering *pmd_rxq.Collector
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: INFO    23:39:25 main.go:48: registering *vswitch.Collector
Jan 21 23:39:25 compute-1 openstack_network_exporter[201695]: NOTICE  23:39:25 main.go:76: listening on https://:9105/metrics
Jan 21 23:39:25 compute-1 systemd[1]: Started openstack_network_exporter container.
Jan 21 23:39:25 compute-1 sudo[201638]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:25 compute-1 podman[201700]: 2026-01-21 23:39:25.732716885 +0000 UTC m=+0.088205211 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 23:39:26 compute-1 python3.9[201877]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:39:27 compute-1 sudo[202027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsghkabsrexvgfknorulugpigsoqqgvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038767.6265447-3198-204979748151113/AnsiballZ_stat.py'
Jan 21 23:39:27 compute-1 sudo[202027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:28 compute-1 python3.9[202029]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:39:28 compute-1 sudo[202027]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:28 compute-1 sudo[202152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oomnpangilhrrwtufthnerxcjwrydzms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038767.6265447-3198-204979748151113/AnsiballZ_copy.py'
Jan 21 23:39:28 compute-1 sudo[202152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:28 compute-1 python3.9[202154]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038767.6265447-3198-204979748151113/.source.yaml _original_basename=.8wbfmguh follow=False checksum=0317da2c639ada97636b9543228efff3cda9d578 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:28 compute-1 sudo[202152]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:29 compute-1 sudo[202304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsnwzpjrnztjtmpoecuayelpshjkbypb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038769.197463-3243-181138641280533/AnsiballZ_find.py'
Jan 21 23:39:29 compute-1 sudo[202304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:29 compute-1 python3.9[202306]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:39:29 compute-1 sudo[202304]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:29 compute-1 auditd[699]: Audit daemon rotating log files
Jan 21 23:39:32 compute-1 podman[202331]: 2026-01-21 23:39:32.624976957 +0000 UTC m=+0.115074684 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller)
Jan 21 23:39:33 compute-1 podman[202358]: 2026-01-21 23:39:33.602167608 +0000 UTC m=+0.087950372 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:39:42 compute-1 podman[202383]: 2026-01-21 23:39:42.593457483 +0000 UTC m=+0.074100232 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:39:42 compute-1 podman[202384]: 2026-01-21 23:39:42.593428472 +0000 UTC m=+0.069609712 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:39:50 compute-1 sudo[202549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pndugxenyfpnyvkbgyabzqmwyknttssi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038790.1478999-3468-122896023922927/AnsiballZ_podman_container_info.py'
Jan 21 23:39:50 compute-1 sudo[202549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:50 compute-1 python3.9[202551]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 21 23:39:50 compute-1 sudo[202549]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:51 compute-1 sudo[202714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzeexfnyekkdmxhtizpfsmefnnpnrkre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038791.0013893-3476-221079942042276/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:51 compute-1 sudo[202714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:51 compute-1 python3.9[202716]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:51 compute-1 systemd[1]: Started libpod-conmon-1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2.scope.
Jan 21 23:39:51 compute-1 podman[202717]: 2026-01-21 23:39:51.648185709 +0000 UTC m=+0.099288144 container exec 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:39:51 compute-1 podman[202717]: 2026-01-21 23:39:51.681129892 +0000 UTC m=+0.132232277 container exec_died 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 21 23:39:51 compute-1 systemd[1]: libpod-conmon-1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2.scope: Deactivated successfully.
Jan 21 23:39:51 compute-1 sudo[202714]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:52 compute-1 sudo[202899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrgyweqweksajsvqybxhhsspeudyvrcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038791.8863719-3484-126299192862481/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:52 compute-1 sudo[202899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:52 compute-1 python3.9[202901]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:52 compute-1 systemd[1]: Started libpod-conmon-1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2.scope.
Jan 21 23:39:52 compute-1 podman[202902]: 2026-01-21 23:39:52.513151526 +0000 UTC m=+0.066381153 container exec 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 21 23:39:52 compute-1 podman[202902]: 2026-01-21 23:39:52.543545269 +0000 UTC m=+0.096774866 container exec_died 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 23:39:52 compute-1 systemd[1]: libpod-conmon-1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2.scope: Deactivated successfully.
Jan 21 23:39:52 compute-1 sudo[202899]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:52 compute-1 podman[202915]: 2026-01-21 23:39:52.615967178 +0000 UTC m=+0.109036277 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 21 23:39:52 compute-1 systemd[1]: cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44-17808f7e975261c0.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 23:39:52 compute-1 systemd[1]: cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44-17808f7e975261c0.service: Failed with result 'exit-code'.
Jan 21 23:39:53 compute-1 sudo[203099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkkpboqbjbsthqrvtxbrfvxtlnclwbrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038792.7799861-3492-55057375854785/AnsiballZ_file.py'
Jan 21 23:39:53 compute-1 sudo[203099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:53 compute-1 python3.9[203101]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:53 compute-1 sudo[203099]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:53 compute-1 sudo[203251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-limwaxmrpcprynqoajsjmonqfzrykfia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038793.547529-3501-202226152937361/AnsiballZ_podman_container_info.py'
Jan 21 23:39:53 compute-1 sudo[203251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:54 compute-1 python3.9[203253]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 21 23:39:54 compute-1 sudo[203251]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:54 compute-1 sudo[203416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htguymoffwsaqxxryibhegchhcqxqxal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038794.2886002-3509-115551949138896/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:54 compute-1 sudo[203416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:54 compute-1 python3.9[203418]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:54 compute-1 systemd[1]: Started libpod-conmon-af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284.scope.
Jan 21 23:39:54 compute-1 podman[203419]: 2026-01-21 23:39:54.937050097 +0000 UTC m=+0.095451515 container exec af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:39:54 compute-1 podman[203419]: 2026-01-21 23:39:54.977320187 +0000 UTC m=+0.135721605 container exec_died af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:39:55 compute-1 systemd[1]: libpod-conmon-af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284.scope: Deactivated successfully.
Jan 21 23:39:55 compute-1 sudo[203416]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:55 compute-1 sudo[203602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxssbhjybtmjxrzfvqcvlxzayfckxfkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038795.2223926-3517-53661481205794/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:55 compute-1 sudo[203602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:55 compute-1 python3.9[203604]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:55 compute-1 systemd[1]: Started libpod-conmon-af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284.scope.
Jan 21 23:39:55 compute-1 podman[203605]: 2026-01-21 23:39:55.835552925 +0000 UTC m=+0.095603730 container exec af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:39:55 compute-1 podman[203605]: 2026-01-21 23:39:55.86919541 +0000 UTC m=+0.129246135 container exec_died af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 21 23:39:55 compute-1 systemd[1]: libpod-conmon-af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284.scope: Deactivated successfully.
Jan 21 23:39:55 compute-1 sudo[203602]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:55 compute-1 podman[203622]: 2026-01-21 23:39:55.934309562 +0000 UTC m=+0.089624734 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 23:39:56 compute-1 sudo[203804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnjslmjbrbhgeokxsebbqqcdzctetqlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038796.1387372-3525-11181393397535/AnsiballZ_file.py'
Jan 21 23:39:56 compute-1 sudo[203804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:56 compute-1 python3.9[203806]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:56 compute-1 sudo[203804]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:56 compute-1 rsyslogd[1003]: imjournal from <np0005591284:python3.9>: begin to drop messages due to rate-limiting
Jan 21 23:39:57 compute-1 sudo[203956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajkktziqarbfkjjmmziytzayjtucfksu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038796.9072149-3534-220041199115547/AnsiballZ_podman_container_info.py'
Jan 21 23:39:57 compute-1 sudo[203956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:57 compute-1 python3.9[203958]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Jan 21 23:39:57 compute-1 sudo[203956]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:57 compute-1 sudo[204121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiogoezhccgfyddrmfhxheppfsuofwai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038797.6276636-3542-67649650746349/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:57 compute-1 sudo[204121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:58 compute-1 python3.9[204123]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:58 compute-1 systemd[1]: Started libpod-conmon-cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44.scope.
Jan 21 23:39:58 compute-1 podman[204124]: 2026-01-21 23:39:58.287413514 +0000 UTC m=+0.105003611 container exec cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:39:58 compute-1 podman[204124]: 2026-01-21 23:39:58.323729652 +0000 UTC m=+0.141319699 container exec_died cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 21 23:39:58 compute-1 systemd[1]: libpod-conmon-cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44.scope: Deactivated successfully.
Jan 21 23:39:58 compute-1 sudo[204121]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:58 compute-1 sudo[204305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wijolndzjznneljzwfvifnlkveoerjtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038798.5688016-3550-276653311881793/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:58 compute-1 sudo[204305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:59 compute-1 python3.9[204307]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:59 compute-1 systemd[1]: Started libpod-conmon-cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44.scope.
Jan 21 23:39:59 compute-1 podman[204308]: 2026-01-21 23:39:59.180896626 +0000 UTC m=+0.082166702 container exec cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 23:39:59 compute-1 podman[204308]: 2026-01-21 23:39:59.186307364 +0000 UTC m=+0.087577460 container exec_died cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 23:39:59 compute-1 systemd[1]: libpod-conmon-cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44.scope: Deactivated successfully.
Jan 21 23:39:59 compute-1 sudo[204305]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:59 compute-1 sudo[204489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fotdweisymphhbgjomectdqtwtbbwxqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038799.4448996-3558-86440303376080/AnsiballZ_file.py'
Jan 21 23:39:59 compute-1 sudo[204489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:00 compute-1 python3.9[204491]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:00 compute-1 sudo[204489]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:00 compute-1 sudo[204641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuaumycphexyppkbsjdzdmdsmfngylwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038800.3198836-3567-165446493419241/AnsiballZ_podman_container_info.py'
Jan 21 23:40:00 compute-1 sudo[204641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:00 compute-1 python3.9[204643]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Jan 21 23:40:01 compute-1 sudo[204641]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:01 compute-1 sudo[204806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-volxcmpkgglxxrryctewxntnkbqldclg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038801.2154975-3575-252297584728626/AnsiballZ_podman_container_exec.py'
Jan 21 23:40:01 compute-1 sudo[204806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:01 compute-1 python3.9[204808]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:40:01 compute-1 systemd[1]: Started libpod-conmon-e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d.scope.
Jan 21 23:40:01 compute-1 podman[204809]: 2026-01-21 23:40:01.887080372 +0000 UTC m=+0.086994212 container exec e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:40:01 compute-1 podman[204809]: 2026-01-21 23:40:01.918090025 +0000 UTC m=+0.118003805 container exec_died e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:40:01 compute-1 systemd[1]: libpod-conmon-e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d.scope: Deactivated successfully.
Jan 21 23:40:01 compute-1 sudo[204806]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:02 compute-1 sudo[204990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbfjorvldpxzechozngltgegpkjozhjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038802.1554544-3583-249680633368400/AnsiballZ_podman_container_exec.py'
Jan 21 23:40:02 compute-1 sudo[204990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:02 compute-1 python3.9[204992]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:40:02 compute-1 systemd[1]: Started libpod-conmon-e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d.scope.
Jan 21 23:40:02 compute-1 podman[204993]: 2026-01-21 23:40:02.815796979 +0000 UTC m=+0.098379035 container exec e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:40:02 compute-1 podman[204993]: 2026-01-21 23:40:02.852011274 +0000 UTC m=+0.134593340 container exec_died e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:40:02 compute-1 systemd[1]: libpod-conmon-e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d.scope: Deactivated successfully.
Jan 21 23:40:02 compute-1 sudo[204990]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:02 compute-1 podman[205011]: 2026-01-21 23:40:02.984657452 +0000 UTC m=+0.159098351 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 23:40:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:40:02.985 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:40:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:40:02.986 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:40:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:40:02.986 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:40:03 compute-1 sudo[205196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vllamywltxsqtnmrpyzkanlkszgavsdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038803.1266294-3591-66024493804371/AnsiballZ_file.py'
Jan 21 23:40:03 compute-1 sudo[205196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:03 compute-1 python3.9[205198]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:03 compute-1 sudo[205196]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:04 compute-1 sudo[205361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpjhiducwpcdalcgfhgqqaqccmjsddqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038804.005125-3600-9689869556387/AnsiballZ_podman_container_info.py'
Jan 21 23:40:04 compute-1 sudo[205361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:04 compute-1 podman[205322]: 2026-01-21 23:40:04.36655219 +0000 UTC m=+0.086920170 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:40:04 compute-1 python3.9[205367]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 21 23:40:04 compute-1 sudo[205361]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:05 compute-1 sudo[205538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uermzcjeixuniedrqzormfaammlijfbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038804.8634126-3608-161826761638613/AnsiballZ_podman_container_exec.py'
Jan 21 23:40:05 compute-1 sudo[205538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:05 compute-1 python3.9[205540]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:40:05 compute-1 systemd[1]: Started libpod-conmon-9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468.scope.
Jan 21 23:40:05 compute-1 podman[205541]: 2026-01-21 23:40:05.521078087 +0000 UTC m=+0.092386420 container exec 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:40:05 compute-1 podman[205541]: 2026-01-21 23:40:05.551052968 +0000 UTC m=+0.122361321 container exec_died 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:40:05 compute-1 systemd[1]: libpod-conmon-9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468.scope: Deactivated successfully.
Jan 21 23:40:05 compute-1 sudo[205538]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:06 compute-1 sudo[205723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsmiwbspmykmfpdsavjlfgsuwcyudoxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038805.7967243-3616-250742053727082/AnsiballZ_podman_container_exec.py'
Jan 21 23:40:06 compute-1 sudo[205723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:06 compute-1 python3.9[205725]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:40:06 compute-1 systemd[1]: Started libpod-conmon-9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468.scope.
Jan 21 23:40:06 compute-1 podman[205726]: 2026-01-21 23:40:06.419553754 +0000 UTC m=+0.104914708 container exec 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:40:06 compute-1 podman[205726]: 2026-01-21 23:40:06.455382056 +0000 UTC m=+0.140742940 container exec_died 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:40:06 compute-1 systemd[1]: libpod-conmon-9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468.scope: Deactivated successfully.
Jan 21 23:40:06 compute-1 sudo[205723]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:06 compute-1 sudo[205908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyyoeiftgxqekmttzohzmmmhbgadwpyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038806.6860723-3624-168192179057807/AnsiballZ_file.py'
Jan 21 23:40:06 compute-1 sudo[205908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:07 compute-1 python3.9[205910]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:07 compute-1 sudo[205908]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:07 compute-1 sudo[206060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgyizhcxxushittassotjmexfcvwyyfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038807.4049163-3633-17763086462350/AnsiballZ_podman_container_info.py'
Jan 21 23:40:07 compute-1 sudo[206060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:07 compute-1 python3.9[206062]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 21 23:40:08 compute-1 sudo[206060]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:08 compute-1 sudo[206225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtdrdbmyjiopphbazohsdavugyymwcfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038808.2586863-3641-36665049636389/AnsiballZ_podman_container_exec.py'
Jan 21 23:40:08 compute-1 sudo[206225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:08 compute-1 python3.9[206227]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:40:08 compute-1 systemd[1]: Started libpod-conmon-dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9.scope.
Jan 21 23:40:08 compute-1 podman[206228]: 2026-01-21 23:40:08.979758268 +0000 UTC m=+0.117985365 container exec dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Jan 21 23:40:09 compute-1 podman[206228]: 2026-01-21 23:40:09.013301179 +0000 UTC m=+0.151528246 container exec_died dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, version=9.6, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 21 23:40:09 compute-1 systemd[1]: libpod-conmon-dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9.scope: Deactivated successfully.
Jan 21 23:40:09 compute-1 sudo[206225]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:09 compute-1 sudo[206410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvaulqlfixyuqmeoqaatzdhcijuuojrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038809.286366-3649-239409278702067/AnsiballZ_podman_container_exec.py'
Jan 21 23:40:09 compute-1 sudo[206410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:09 compute-1 python3.9[206412]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:40:09 compute-1 systemd[1]: Started libpod-conmon-dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9.scope.
Jan 21 23:40:09 compute-1 podman[206413]: 2026-01-21 23:40:09.990499361 +0000 UTC m=+0.103220796 container exec dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 23:40:10 compute-1 podman[206413]: 2026-01-21 23:40:10.02557895 +0000 UTC m=+0.138300445 container exec_died dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Jan 21 23:40:10 compute-1 systemd[1]: libpod-conmon-dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9.scope: Deactivated successfully.
Jan 21 23:40:10 compute-1 sudo[206410]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:10 compute-1 sudo[206594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyrcgwmcaatishzmvixsqhjjinkdnwem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038810.2911608-3657-18719170061817/AnsiballZ_file.py'
Jan 21 23:40:10 compute-1 sudo[206594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:10 compute-1 python3.9[206596]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:10 compute-1 sudo[206594]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:11 compute-1 nova_compute[182713]: 2026-01-21 23:40:11.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:11 compute-1 nova_compute[182713]: 2026-01-21 23:40:11.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:11 compute-1 nova_compute[182713]: 2026-01-21 23:40:11.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:11 compute-1 nova_compute[182713]: 2026-01-21 23:40:11.902 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:40:11 compute-1 nova_compute[182713]: 2026-01-21 23:40:11.903 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:40:11 compute-1 nova_compute[182713]: 2026-01-21 23:40:11.904 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:40:11 compute-1 nova_compute[182713]: 2026-01-21 23:40:11.904 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:40:12 compute-1 nova_compute[182713]: 2026-01-21 23:40:12.158 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:40:12 compute-1 nova_compute[182713]: 2026-01-21 23:40:12.161 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5901MB free_disk=73.41672134399414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:40:12 compute-1 nova_compute[182713]: 2026-01-21 23:40:12.162 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:40:12 compute-1 nova_compute[182713]: 2026-01-21 23:40:12.162 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:40:12 compute-1 nova_compute[182713]: 2026-01-21 23:40:12.244 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:40:12 compute-1 nova_compute[182713]: 2026-01-21 23:40:12.245 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:40:12 compute-1 nova_compute[182713]: 2026-01-21 23:40:12.273 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:40:12 compute-1 nova_compute[182713]: 2026-01-21 23:40:12.291 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:40:12 compute-1 nova_compute[182713]: 2026-01-21 23:40:12.294 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:40:12 compute-1 nova_compute[182713]: 2026-01-21 23:40:12.295 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:40:13 compute-1 nova_compute[182713]: 2026-01-21 23:40:13.294 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:13 compute-1 nova_compute[182713]: 2026-01-21 23:40:13.294 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:40:13 compute-1 nova_compute[182713]: 2026-01-21 23:40:13.295 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:40:13 compute-1 nova_compute[182713]: 2026-01-21 23:40:13.312 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:40:13 compute-1 nova_compute[182713]: 2026-01-21 23:40:13.313 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:13 compute-1 podman[206622]: 2026-01-21 23:40:13.606369262 +0000 UTC m=+0.088601002 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:40:13 compute-1 podman[206621]: 2026-01-21 23:40:13.612490492 +0000 UTC m=+0.081821652 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:40:13 compute-1 nova_compute[182713]: 2026-01-21 23:40:13.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:13 compute-1 nova_compute[182713]: 2026-01-21 23:40:13.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:13 compute-1 nova_compute[182713]: 2026-01-21 23:40:13.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:40:14 compute-1 nova_compute[182713]: 2026-01-21 23:40:14.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:14 compute-1 nova_compute[182713]: 2026-01-21 23:40:14.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:40:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-1 podman[206666]: 2026-01-21 23:40:23.591948212 +0000 UTC m=+0.079103684 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 23:40:26 compute-1 podman[206687]: 2026-01-21 23:40:26.605325913 +0000 UTC m=+0.093310238 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container)
Jan 21 23:40:33 compute-1 podman[206707]: 2026-01-21 23:40:33.69091706 +0000 UTC m=+0.155862814 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 23:40:33 compute-1 sudo[206859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-divtrrnwugmgzjcuvipjkrixrxdzsdic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038833.5837643-3867-278864241115865/AnsiballZ_file.py'
Jan 21 23:40:33 compute-1 sudo[206859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:34 compute-1 python3.9[206861]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:34 compute-1 sudo[206859]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:34 compute-1 podman[206909]: 2026-01-21 23:40:34.604830153 +0000 UTC m=+0.080225760 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:40:34 compute-1 sudo[207035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-marzlgzetukiuaqbkmfeopmnjffaflce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038834.483656-3891-239240233123687/AnsiballZ_stat.py'
Jan 21 23:40:34 compute-1 sudo[207035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:35 compute-1 python3.9[207037]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:35 compute-1 sudo[207035]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:35 compute-1 sudo[207158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvmbrzqikdxgrlmcqdnlknwzxkfbbnge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038834.483656-3891-239240233123687/AnsiballZ_copy.py'
Jan 21 23:40:35 compute-1 sudo[207158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:35 compute-1 python3.9[207160]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038834.483656-3891-239240233123687/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:35 compute-1 sudo[207158]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:36 compute-1 sudo[207310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqkkqvabzdrlrfdxlxzvspquhnyrxbzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038836.2337112-3939-259764630631975/AnsiballZ_file.py'
Jan 21 23:40:36 compute-1 sudo[207310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:36 compute-1 python3.9[207312]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:36 compute-1 sudo[207310]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:38 compute-1 sudo[207462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfjcxxsisccgmsmqctxtrvkeekjnmfeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038837.5831144-3964-163958639864312/AnsiballZ_stat.py'
Jan 21 23:40:38 compute-1 sudo[207462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:38 compute-1 python3.9[207464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:38 compute-1 sudo[207462]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:38 compute-1 sudo[207540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfceudztryycqtglbznboyfwjtnhwjah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038837.5831144-3964-163958639864312/AnsiballZ_file.py'
Jan 21 23:40:38 compute-1 sudo[207540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:38 compute-1 python3.9[207542]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:38 compute-1 sudo[207540]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:39 compute-1 sudo[207692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oufeqiosrfzruwcbwgxkyhmjaavzfibm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038839.0734904-4000-253241698761886/AnsiballZ_stat.py'
Jan 21 23:40:39 compute-1 sudo[207692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:39 compute-1 python3.9[207694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:39 compute-1 sudo[207692]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:40 compute-1 sudo[207770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjjpeictutnliydsdvocmhohskuidobz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038839.0734904-4000-253241698761886/AnsiballZ_file.py'
Jan 21 23:40:40 compute-1 sudo[207770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:40 compute-1 python3.9[207772]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.weleq_30 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:40 compute-1 sudo[207770]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:40 compute-1 sudo[207922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nibbofzyzzmdmqctigjumvbnplgxhvyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038840.4713433-4035-273390333333441/AnsiballZ_stat.py'
Jan 21 23:40:40 compute-1 sudo[207922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:40 compute-1 python3.9[207924]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:41 compute-1 sudo[207922]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:41 compute-1 sudo[208000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evjqbzwfluwrmllhxsflzbbhvjjsftnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038840.4713433-4035-273390333333441/AnsiballZ_file.py'
Jan 21 23:40:41 compute-1 sudo[208000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:41 compute-1 python3.9[208002]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:41 compute-1 sudo[208000]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:42 compute-1 sudo[208152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hefwuciqolxlosqfbnhiocyuqguuqqjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038841.995056-4074-3963976253607/AnsiballZ_command.py'
Jan 21 23:40:42 compute-1 sudo[208152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:42 compute-1 python3.9[208154]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:40:42 compute-1 sudo[208152]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:43 compute-1 sudo[208305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyfecttdtobzgeopwddbrvgmrvnptpyd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038842.766891-4098-259331364004967/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 23:40:43 compute-1 sudo[208305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:43 compute-1 python3[208307]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 23:40:43 compute-1 sudo[208305]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:44 compute-1 sudo[208477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-airfpraqdsnykdollqhosmdmnhdhygce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038843.7338693-4122-39122947396099/AnsiballZ_stat.py'
Jan 21 23:40:44 compute-1 sudo[208477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:44 compute-1 podman[208431]: 2026-01-21 23:40:44.150015227 +0000 UTC m=+0.081835410 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:40:44 compute-1 podman[208432]: 2026-01-21 23:40:44.154508027 +0000 UTC m=+0.074217921 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:40:44 compute-1 python3.9[208491]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:44 compute-1 sudo[208477]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:44 compute-1 sudo[208577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shewbpolqlpufzjnudrwjhcvgtivmqbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038843.7338693-4122-39122947396099/AnsiballZ_file.py'
Jan 21 23:40:44 compute-1 sudo[208577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:44 compute-1 python3.9[208579]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:44 compute-1 sudo[208577]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:45 compute-1 sudo[208729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwrpntvdjphxmnlopsddfnbnwlezkaua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038845.4677541-4158-70389051732895/AnsiballZ_stat.py'
Jan 21 23:40:45 compute-1 sudo[208729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:46 compute-1 python3.9[208731]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:46 compute-1 sudo[208729]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:46 compute-1 sudo[208807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bolnjwqsrsowlpqdofngkciugghzqcds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038845.4677541-4158-70389051732895/AnsiballZ_file.py'
Jan 21 23:40:46 compute-1 sudo[208807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:46 compute-1 python3.9[208809]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:46 compute-1 sudo[208807]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:47 compute-1 sudo[208959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqmwbbagpjwnrkqdxjxlloqlnwsoftjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038846.8665261-4194-120527605198440/AnsiballZ_stat.py'
Jan 21 23:40:47 compute-1 sudo[208959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:47 compute-1 python3.9[208961]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:47 compute-1 sudo[208959]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:47 compute-1 sudo[209037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cupeajstzxnmmanaokxhwfnmlrpuvchp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038846.8665261-4194-120527605198440/AnsiballZ_file.py'
Jan 21 23:40:47 compute-1 sudo[209037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:47 compute-1 python3.9[209039]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:48 compute-1 sudo[209037]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:48 compute-1 sudo[209189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvsbxnpktsbablrbibqdfhogllluwbkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038848.2508214-4230-159309126085570/AnsiballZ_stat.py'
Jan 21 23:40:48 compute-1 sudo[209189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:48 compute-1 python3.9[209191]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:48 compute-1 sudo[209189]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:49 compute-1 sudo[209267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brqqpbthstheoiclrjacbkvypbuhutvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038848.2508214-4230-159309126085570/AnsiballZ_file.py'
Jan 21 23:40:49 compute-1 sudo[209267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:49 compute-1 python3.9[209269]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:49 compute-1 sudo[209267]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:50 compute-1 sudo[209419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqriukbrspqttxdmcfuqwfjtlglgdfzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038849.631528-4266-47993936129837/AnsiballZ_stat.py'
Jan 21 23:40:50 compute-1 sudo[209419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:50 compute-1 python3.9[209421]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:50 compute-1 sudo[209419]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:50 compute-1 sudo[209544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onvejlivkypfffmqcfrgxntlhecqgavg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038849.631528-4266-47993936129837/AnsiballZ_copy.py'
Jan 21 23:40:50 compute-1 sudo[209544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:50 compute-1 python3.9[209546]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038849.631528-4266-47993936129837/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:50 compute-1 sudo[209544]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:51 compute-1 sudo[209696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzwzlvldojbutruwibzcevcuvmqtjpoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038851.288074-4311-212055440253210/AnsiballZ_file.py'
Jan 21 23:40:51 compute-1 sudo[209696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:51 compute-1 python3.9[209698]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:51 compute-1 sudo[209696]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:52 compute-1 sudo[209848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsmoyqtqziokxxdkynnvowpehkngmkdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038852.1489723-4335-90154858866010/AnsiballZ_command.py'
Jan 21 23:40:52 compute-1 sudo[209848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:52 compute-1 python3.9[209850]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:40:52 compute-1 sudo[209848]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:53 compute-1 sudo[210003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnlhudcivazcsptbvkmccfeaxfkxlich ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038853.0813968-4359-147205372138379/AnsiballZ_blockinfile.py'
Jan 21 23:40:53 compute-1 sudo[210003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:53 compute-1 python3.9[210005]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:53 compute-1 sudo[210003]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:54 compute-1 sudo[210164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykcqaswteurcnmwicgwfedlyoxwpqusc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038854.1349075-4386-39958088376613/AnsiballZ_command.py'
Jan 21 23:40:54 compute-1 sudo[210164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:54 compute-1 podman[210129]: 2026-01-21 23:40:54.523561132 +0000 UTC m=+0.093073271 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 21 23:40:54 compute-1 python3.9[210175]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:40:54 compute-1 sudo[210164]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:55 compute-1 sudo[210329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsvacssnukfdqkedorwunuxqsapcmhyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038855.0244029-4410-723912422967/AnsiballZ_stat.py'
Jan 21 23:40:55 compute-1 sudo[210329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:55 compute-1 python3.9[210331]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:40:55 compute-1 sudo[210329]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:56 compute-1 sudo[210483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywdgdvkcczfttiahjgkovsjwkvkhlyib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038855.9253557-4434-40182069949133/AnsiballZ_command.py'
Jan 21 23:40:56 compute-1 sudo[210483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:56 compute-1 python3.9[210485]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:40:56 compute-1 sudo[210483]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:57 compute-1 sudo[210649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tywpmtsalpevhqjkwaejvbcfrnfvujrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038856.7369518-4458-32900753926284/AnsiballZ_file.py'
Jan 21 23:40:57 compute-1 sudo[210649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:57 compute-1 podman[210612]: 2026-01-21 23:40:57.31612288 +0000 UTC m=+0.069791603 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 23:40:57 compute-1 python3.9[210659]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:57 compute-1 sudo[210649]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:58 compute-1 sshd-session[183027]: Connection closed by 192.168.122.30 port 39646
Jan 21 23:40:58 compute-1 sshd-session[183024]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:40:58 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Jan 21 23:40:58 compute-1 systemd[1]: session-26.scope: Consumed 2min 5.309s CPU time.
Jan 21 23:40:58 compute-1 systemd-logind[796]: Session 26 logged out. Waiting for processes to exit.
Jan 21 23:40:58 compute-1 systemd-logind[796]: Removed session 26.
Jan 21 23:41:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:41:02.987 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:41:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:41:02.987 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:41:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:41:02.988 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:41:04 compute-1 podman[210685]: 2026-01-21 23:41:04.640839393 +0000 UTC m=+0.118891349 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:41:04 compute-1 podman[210711]: 2026-01-21 23:41:04.783346068 +0000 UTC m=+0.108652668 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:41:10 compute-1 nova_compute[182713]: 2026-01-21 23:41:10.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:12 compute-1 nova_compute[182713]: 2026-01-21 23:41:12.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:12 compute-1 nova_compute[182713]: 2026-01-21 23:41:12.910 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:41:12 compute-1 nova_compute[182713]: 2026-01-21 23:41:12.911 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:41:12 compute-1 nova_compute[182713]: 2026-01-21 23:41:12.911 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:41:12 compute-1 nova_compute[182713]: 2026-01-21 23:41:12.911 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:41:13 compute-1 nova_compute[182713]: 2026-01-21 23:41:13.085 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:41:13 compute-1 nova_compute[182713]: 2026-01-21 23:41:13.088 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5958MB free_disk=73.4164924621582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:41:13 compute-1 nova_compute[182713]: 2026-01-21 23:41:13.088 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:41:13 compute-1 nova_compute[182713]: 2026-01-21 23:41:13.088 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:41:13 compute-1 nova_compute[182713]: 2026-01-21 23:41:13.172 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:41:13 compute-1 nova_compute[182713]: 2026-01-21 23:41:13.173 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:41:13 compute-1 nova_compute[182713]: 2026-01-21 23:41:13.202 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:41:13 compute-1 nova_compute[182713]: 2026-01-21 23:41:13.218 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:41:13 compute-1 nova_compute[182713]: 2026-01-21 23:41:13.219 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:41:13 compute-1 nova_compute[182713]: 2026-01-21 23:41:13.219 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:41:14 compute-1 nova_compute[182713]: 2026-01-21 23:41:14.214 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:14 compute-1 nova_compute[182713]: 2026-01-21 23:41:14.215 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:14 compute-1 nova_compute[182713]: 2026-01-21 23:41:14.216 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:41:14 compute-1 nova_compute[182713]: 2026-01-21 23:41:14.216 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:41:14 compute-1 nova_compute[182713]: 2026-01-21 23:41:14.232 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:41:14 compute-1 nova_compute[182713]: 2026-01-21 23:41:14.233 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:14 compute-1 nova_compute[182713]: 2026-01-21 23:41:14.233 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:14 compute-1 nova_compute[182713]: 2026-01-21 23:41:14.234 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:14 compute-1 nova_compute[182713]: 2026-01-21 23:41:14.234 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:41:14 compute-1 podman[210736]: 2026-01-21 23:41:14.61631437 +0000 UTC m=+0.060426081 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 21 23:41:14 compute-1 podman[210737]: 2026-01-21 23:41:14.643817739 +0000 UTC m=+0.122439509 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:41:14 compute-1 nova_compute[182713]: 2026-01-21 23:41:14.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:14 compute-1 nova_compute[182713]: 2026-01-21 23:41:14.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:15 compute-1 nova_compute[182713]: 2026-01-21 23:41:15.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:41:25.566 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:41:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:41:25.567 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:41:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:41:25.568 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:41:25 compute-1 podman[210781]: 2026-01-21 23:41:25.608361722 +0000 UTC m=+0.101307184 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:41:27 compute-1 podman[210800]: 2026-01-21 23:41:27.623584077 +0000 UTC m=+0.109826125 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 23:41:35 compute-1 podman[210822]: 2026-01-21 23:41:35.560399694 +0000 UTC m=+0.054906398 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:41:35 compute-1 podman[210821]: 2026-01-21 23:41:35.607697308 +0000 UTC m=+0.099371725 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 21 23:41:45 compute-1 podman[210871]: 2026-01-21 23:41:45.595259483 +0000 UTC m=+0.082898318 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:41:45 compute-1 podman[210870]: 2026-01-21 23:41:45.595330305 +0000 UTC m=+0.083410744 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:41:56 compute-1 podman[210911]: 2026-01-21 23:41:56.682620082 +0000 UTC m=+0.166086945 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 23:41:58 compute-1 podman[210932]: 2026-01-21 23:41:58.611085441 +0000 UTC m=+0.097785476 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 21 23:42:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:42:02.989 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:42:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:42:02.989 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:42:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:42:02.989 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:42:06 compute-1 podman[210952]: 2026-01-21 23:42:06.61467465 +0000 UTC m=+0.083932850 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:42:06 compute-1 podman[210951]: 2026-01-21 23:42:06.654821674 +0000 UTC m=+0.126760587 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:42:10 compute-1 nova_compute[182713]: 2026-01-21 23:42:10.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:10 compute-1 nova_compute[182713]: 2026-01-21 23:42:10.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 23:42:10 compute-1 nova_compute[182713]: 2026-01-21 23:42:10.895 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 23:42:10 compute-1 nova_compute[182713]: 2026-01-21 23:42:10.897 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:10 compute-1 nova_compute[182713]: 2026-01-21 23:42:10.897 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 23:42:10 compute-1 nova_compute[182713]: 2026-01-21 23:42:10.934 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:12 compute-1 nova_compute[182713]: 2026-01-21 23:42:12.961 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:12 compute-1 nova_compute[182713]: 2026-01-21 23:42:12.997 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:42:12 compute-1 nova_compute[182713]: 2026-01-21 23:42:12.997 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:42:12 compute-1 nova_compute[182713]: 2026-01-21 23:42:12.998 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:42:12 compute-1 nova_compute[182713]: 2026-01-21 23:42:12.998 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.205 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.207 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6035MB free_disk=73.41647338867188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.208 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.208 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.449 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.450 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.580 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.720 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.720 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.761 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.823 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:42:13 compute-1 nova_compute[182713]: 2026-01-21 23:42:13.860 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:42:14 compute-1 nova_compute[182713]: 2026-01-21 23:42:14.015 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:42:14 compute-1 nova_compute[182713]: 2026-01-21 23:42:14.018 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:42:14 compute-1 nova_compute[182713]: 2026-01-21 23:42:14.019 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:42:14 compute-1 nova_compute[182713]: 2026-01-21 23:42:14.909 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:14 compute-1 nova_compute[182713]: 2026-01-21 23:42:14.909 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:14 compute-1 nova_compute[182713]: 2026-01-21 23:42:14.910 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:15 compute-1 nova_compute[182713]: 2026-01-21 23:42:15.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:15 compute-1 nova_compute[182713]: 2026-01-21 23:42:15.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:42:15 compute-1 nova_compute[182713]: 2026-01-21 23:42:15.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:42:16 compute-1 nova_compute[182713]: 2026-01-21 23:42:16.173 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:42:16 compute-1 nova_compute[182713]: 2026-01-21 23:42:16.174 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:16 compute-1 nova_compute[182713]: 2026-01-21 23:42:16.175 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:16 compute-1 nova_compute[182713]: 2026-01-21 23:42:16.175 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:16 compute-1 nova_compute[182713]: 2026-01-21 23:42:16.176 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:42:16 compute-1 podman[210998]: 2026-01-21 23:42:16.588080288 +0000 UTC m=+0.076393228 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 21 23:42:16 compute-1 podman[210999]: 2026-01-21 23:42:16.594140035 +0000 UTC m=+0.075365057 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:42:16 compute-1 nova_compute[182713]: 2026-01-21 23:42:16.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:42:22.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:27 compute-1 podman[211041]: 2026-01-21 23:42:27.622725618 +0000 UTC m=+0.106104471 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 21 23:42:29 compute-1 podman[211062]: 2026-01-21 23:42:29.60161773 +0000 UTC m=+0.087116685 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter)
Jan 21 23:42:37 compute-1 podman[211084]: 2026-01-21 23:42:37.556768435 +0000 UTC m=+0.050180462 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:42:37 compute-1 podman[211083]: 2026-01-21 23:42:37.60995472 +0000 UTC m=+0.108762624 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:42:47 compute-1 podman[211135]: 2026-01-21 23:42:47.561664182 +0000 UTC m=+0.052081601 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 23:42:47 compute-1 podman[211136]: 2026-01-21 23:42:47.589540344 +0000 UTC m=+0.081047667 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:42:56 compute-1 sshd-session[211133]: Connection closed by 162.142.125.120 port 23560 [preauth]
Jan 21 23:42:58 compute-1 podman[211178]: 2026-01-21 23:42:58.585189303 +0000 UTC m=+0.077228568 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 23:43:00 compute-1 podman[211198]: 2026-01-21 23:43:00.609751889 +0000 UTC m=+0.095235156 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 23:43:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:43:02.990 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:43:02.991 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:43:02.991 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:08 compute-1 podman[211220]: 2026-01-21 23:43:08.593368253 +0000 UTC m=+0.073316098 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:43:08 compute-1 podman[211219]: 2026-01-21 23:43:08.644987019 +0000 UTC m=+0.133980763 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 23:43:12 compute-1 nova_compute[182713]: 2026-01-21 23:43:12.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:14 compute-1 nova_compute[182713]: 2026-01-21 23:43:14.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:14 compute-1 nova_compute[182713]: 2026-01-21 23:43:14.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:14 compute-1 nova_compute[182713]: 2026-01-21 23:43:14.892 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:14 compute-1 nova_compute[182713]: 2026-01-21 23:43:14.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:14 compute-1 nova_compute[182713]: 2026-01-21 23:43:14.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:14 compute-1 nova_compute[182713]: 2026-01-21 23:43:14.894 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:43:15 compute-1 nova_compute[182713]: 2026-01-21 23:43:15.126 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:43:15 compute-1 nova_compute[182713]: 2026-01-21 23:43:15.128 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6047MB free_disk=73.41787338256836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:43:15 compute-1 nova_compute[182713]: 2026-01-21 23:43:15.129 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:15 compute-1 nova_compute[182713]: 2026-01-21 23:43:15.129 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:15 compute-1 nova_compute[182713]: 2026-01-21 23:43:15.221 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:43:15 compute-1 nova_compute[182713]: 2026-01-21 23:43:15.222 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:43:15 compute-1 nova_compute[182713]: 2026-01-21 23:43:15.257 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:43:15 compute-1 nova_compute[182713]: 2026-01-21 23:43:15.282 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:43:15 compute-1 nova_compute[182713]: 2026-01-21 23:43:15.285 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:43:15 compute-1 nova_compute[182713]: 2026-01-21 23:43:15.285 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:16 compute-1 nova_compute[182713]: 2026-01-21 23:43:16.284 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:16 compute-1 nova_compute[182713]: 2026-01-21 23:43:16.285 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:43:16 compute-1 nova_compute[182713]: 2026-01-21 23:43:16.285 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:43:18 compute-1 podman[211269]: 2026-01-21 23:43:18.569842842 +0000 UTC m=+0.062882106 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:43:18 compute-1 podman[211270]: 2026-01-21 23:43:18.58500973 +0000 UTC m=+0.069887522 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:43:19 compute-1 nova_compute[182713]: 2026-01-21 23:43:19.106 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:43:19 compute-1 nova_compute[182713]: 2026-01-21 23:43:19.106 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:19 compute-1 nova_compute[182713]: 2026-01-21 23:43:19.107 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:19 compute-1 nova_compute[182713]: 2026-01-21 23:43:19.107 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:19 compute-1 nova_compute[182713]: 2026-01-21 23:43:19.108 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:19 compute-1 nova_compute[182713]: 2026-01-21 23:43:19.108 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:19 compute-1 nova_compute[182713]: 2026-01-21 23:43:19.108 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:19 compute-1 nova_compute[182713]: 2026-01-21 23:43:19.109 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:43:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:43:26.876 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:43:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:43:26.877 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:43:29 compute-1 podman[211308]: 2026-01-21 23:43:29.568685489 +0000 UTC m=+0.064725702 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:43:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:43:29.879 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:43:31 compute-1 podman[211325]: 2026-01-21 23:43:31.601737316 +0000 UTC m=+0.086100313 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, version=9.6)
Jan 21 23:43:39 compute-1 podman[211347]: 2026-01-21 23:43:39.658058015 +0000 UTC m=+0.128774214 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:43:39 compute-1 podman[211348]: 2026-01-21 23:43:39.65853057 +0000 UTC m=+0.123729810 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:43:39 compute-1 nova_compute[182713]: 2026-01-21 23:43:39.790 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "6f64b039-da3a-47ef-9a52-b259b890a077" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:39 compute-1 nova_compute[182713]: 2026-01-21 23:43:39.791 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:39 compute-1 nova_compute[182713]: 2026-01-21 23:43:39.831 182717 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.096 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.097 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.105 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.105 182717 INFO nova.compute.claims [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.331 182717 DEBUG nova.compute.provider_tree [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.348 182717 DEBUG nova.scheduler.client.report [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.376 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.377 182717 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.708 182717 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.710 182717 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.786 182717 INFO nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:43:40 compute-1 nova_compute[182713]: 2026-01-21 23:43:40.870 182717 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:43:41 compute-1 nova_compute[182713]: 2026-01-21 23:43:41.124 182717 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:43:41 compute-1 nova_compute[182713]: 2026-01-21 23:43:41.128 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:43:41 compute-1 nova_compute[182713]: 2026-01-21 23:43:41.129 182717 INFO nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Creating image(s)
Jan 21 23:43:41 compute-1 nova_compute[182713]: 2026-01-21 23:43:41.130 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "/var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:41 compute-1 nova_compute[182713]: 2026-01-21 23:43:41.131 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "/var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:41 compute-1 nova_compute[182713]: 2026-01-21 23:43:41.133 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "/var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:41 compute-1 nova_compute[182713]: 2026-01-21 23:43:41.133 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:41 compute-1 nova_compute[182713]: 2026-01-21 23:43:41.135 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:42 compute-1 nova_compute[182713]: 2026-01-21 23:43:42.138 182717 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Automatically allocating a network for project 8981554bfb65485a9218dab7f347822d. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Jan 21 23:43:43 compute-1 nova_compute[182713]: 2026-01-21 23:43:43.151 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:43 compute-1 nova_compute[182713]: 2026-01-21 23:43:43.239 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.part --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:43 compute-1 nova_compute[182713]: 2026-01-21 23:43:43.241 182717 DEBUG nova.virt.images [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] 9cd98f02-a505-4543-a7ad-04e9a377b456 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 21 23:43:43 compute-1 nova_compute[182713]: 2026-01-21 23:43:43.242 182717 DEBUG nova.privsep.utils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:43:43 compute-1 nova_compute[182713]: 2026-01-21 23:43:43.243 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.part /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:43 compute-1 nova_compute[182713]: 2026-01-21 23:43:43.433 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.part /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.converted" returned: 0 in 0.191s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:43 compute-1 nova_compute[182713]: 2026-01-21 23:43:43.440 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:43 compute-1 nova_compute[182713]: 2026-01-21 23:43:43.505 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.converted --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:43 compute-1 nova_compute[182713]: 2026-01-21 23:43:43.507 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:43 compute-1 nova_compute[182713]: 2026-01-21 23:43:43.528 182717 INFO oslo.privsep.daemon [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpheeeyzpj/privsep.sock']
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.029 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "3f2b9306-446d-4b0b-9db6-7a6ef24e18e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.030 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "3f2b9306-446d-4b0b-9db6-7a6ef24e18e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.055 182717 DEBUG nova.compute.manager [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.184 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.184 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.193 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.194 182717 INFO nova.compute.claims [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.231 182717 INFO oslo.privsep.daemon [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Spawned new privsep daemon via rootwrap
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.094 211417 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.101 211417 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.105 211417 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.105 211417 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211417
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.307 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.376 182717 DEBUG nova.compute.provider_tree [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.401 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.402 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.403 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.425 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.441 182717 ERROR nova.scheduler.client.report [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [req-c23972ba-4b76-4a95-b704-1b3d10feed6c] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 39680711-70c9-4df1-ae59-25e54fac688d.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-c23972ba-4b76-4a95-b704-1b3d10feed6c"}]}
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.462 182717 DEBUG nova.scheduler.client.report [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.478 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.478 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.496 182717 DEBUG nova.scheduler.client.report [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.497 182717 DEBUG nova.compute.provider_tree [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.512 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.513 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.513 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.535 182717 DEBUG nova.scheduler.client.report [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.567 182717 DEBUG nova.scheduler.client.report [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.618 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.619 182717 DEBUG nova.virt.disk.api [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Checking if we can resize image /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.620 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.669 182717 DEBUG nova.compute.provider_tree [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.712 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.713 182717 DEBUG nova.virt.disk.api [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Cannot resize image /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.714 182717 DEBUG nova.objects.instance [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lazy-loading 'migration_context' on Instance uuid 6f64b039-da3a-47ef-9a52-b259b890a077 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.732 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.733 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Ensure instance console log exists: /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.733 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.734 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.734 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.775 182717 DEBUG nova.scheduler.client.report [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Updated inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.776 182717 DEBUG nova.compute.provider_tree [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Updating resource provider 39680711-70c9-4df1-ae59-25e54fac688d generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.776 182717 DEBUG nova.compute.provider_tree [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.806 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.808 182717 DEBUG nova.compute.manager [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.882 182717 DEBUG nova.compute.manager [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.883 182717 DEBUG nova.network.neutron [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.909 182717 INFO nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:43:44 compute-1 nova_compute[182713]: 2026-01-21 23:43:44.928 182717 DEBUG nova.compute.manager [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.078 182717 DEBUG nova.compute.manager [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.080 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.081 182717 INFO nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Creating image(s)
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.082 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "/var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.082 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "/var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.083 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "/var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.107 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.199 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.200 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.200 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.215 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.305 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.307 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.354 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.355 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.356 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.416 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.417 182717 DEBUG nova.virt.disk.api [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Checking if we can resize image /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.417 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.515 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.516 182717 DEBUG nova.virt.disk.api [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Cannot resize image /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.516 182717 DEBUG nova.objects.instance [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lazy-loading 'migration_context' on Instance uuid 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.535 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.535 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Ensure instance console log exists: /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.536 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.536 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.536 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.659 182717 DEBUG nova.network.neutron [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.660 182717 DEBUG nova.compute.manager [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.663 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.668 182717 WARNING nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.674 182717 DEBUG nova.virt.libvirt.host [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.674 182717 DEBUG nova.virt.libvirt.host [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.678 182717 DEBUG nova.virt.libvirt.host [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.679 182717 DEBUG nova.virt.libvirt.host [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.680 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.681 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.681 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.681 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.681 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.682 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.682 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.682 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.682 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.683 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.683 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.683 182717 DEBUG nova.virt.hardware [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.687 182717 DEBUG nova.privsep.utils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.688 182717 DEBUG nova.objects.instance [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.718 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:43:45 compute-1 nova_compute[182713]:   <uuid>3f2b9306-446d-4b0b-9db6-7a6ef24e18e9</uuid>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   <name>instance-00000005</name>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <nova:name>tempest-DeleteServersAdminTestJSON-server-775700041</nova:name>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:43:45</nova:creationTime>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:43:45 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:43:45 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:43:45 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:43:45 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:43:45 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:43:45 compute-1 nova_compute[182713]:         <nova:user uuid="c11b574ed1e849b1b0f45e29432ef4d6">tempest-DeleteServersAdminTestJSON-138472938-project-member</nova:user>
Jan 21 23:43:45 compute-1 nova_compute[182713]:         <nova:project uuid="a182ea531d6a4192aff53c844249974e">tempest-DeleteServersAdminTestJSON-138472938</nova:project>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <system>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <entry name="serial">3f2b9306-446d-4b0b-9db6-7a6ef24e18e9</entry>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <entry name="uuid">3f2b9306-446d-4b0b-9db6-7a6ef24e18e9</entry>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     </system>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   <os>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   </os>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   <features>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   </features>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk.config"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/console.log" append="off"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <video>
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     </video>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:43:45 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:43:45 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:43:45 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:43:45 compute-1 nova_compute[182713]: </domain>
Jan 21 23:43:45 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.777 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.778 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:43:45 compute-1 nova_compute[182713]: 2026-01-21 23:43:45.778 182717 INFO nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Using config drive
Jan 21 23:43:46 compute-1 nova_compute[182713]: 2026-01-21 23:43:46.888 182717 INFO nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Creating config drive at /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk.config
Jan 21 23:43:46 compute-1 nova_compute[182713]: 2026-01-21 23:43:46.895 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpld_c4xfb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:47 compute-1 nova_compute[182713]: 2026-01-21 23:43:47.023 182717 DEBUG oslo_concurrency.processutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpld_c4xfb" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:47 compute-1 systemd-machined[153970]: New machine qemu-1-instance-00000005.
Jan 21 23:43:47 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000005.
Jan 21 23:43:47 compute-1 nova_compute[182713]: 2026-01-21 23:43:47.950 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039027.9491608, 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:43:47 compute-1 nova_compute[182713]: 2026-01-21 23:43:47.951 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] VM Resumed (Lifecycle Event)
Jan 21 23:43:47 compute-1 nova_compute[182713]: 2026-01-21 23:43:47.958 182717 DEBUG nova.compute.manager [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:43:47 compute-1 nova_compute[182713]: 2026-01-21 23:43:47.959 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:43:47 compute-1 nova_compute[182713]: 2026-01-21 23:43:47.963 182717 INFO nova.virt.libvirt.driver [-] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Instance spawned successfully.
Jan 21 23:43:47 compute-1 nova_compute[182713]: 2026-01-21 23:43:47.963 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.006 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.007 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.007 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.007 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.008 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.008 182717 DEBUG nova.virt.libvirt.driver [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.018 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.022 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.048 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.049 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039027.956599, 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.049 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] VM Started (Lifecycle Event)
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.076 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.080 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.102 182717 INFO nova.compute.manager [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Took 3.02 seconds to spawn the instance on the hypervisor.
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.103 182717 DEBUG nova.compute.manager [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.107 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.220 182717 INFO nova.compute.manager [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Took 4.08 seconds to build instance.
Jan 21 23:43:48 compute-1 nova_compute[182713]: 2026-01-21 23:43:48.257 182717 DEBUG oslo_concurrency.lockutils [None req-4288d71b-0032-48eb-b7ee-0311bbf3ae1b c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "3f2b9306-446d-4b0b-9db6-7a6ef24e18e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:49 compute-1 nova_compute[182713]: 2026-01-21 23:43:49.568 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Acquiring lock "3f2b9306-446d-4b0b-9db6-7a6ef24e18e9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:49 compute-1 nova_compute[182713]: 2026-01-21 23:43:49.569 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Lock "3f2b9306-446d-4b0b-9db6-7a6ef24e18e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:49 compute-1 nova_compute[182713]: 2026-01-21 23:43:49.569 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Acquiring lock "3f2b9306-446d-4b0b-9db6-7a6ef24e18e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:49 compute-1 nova_compute[182713]: 2026-01-21 23:43:49.570 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Lock "3f2b9306-446d-4b0b-9db6-7a6ef24e18e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:49 compute-1 nova_compute[182713]: 2026-01-21 23:43:49.570 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Lock "3f2b9306-446d-4b0b-9db6-7a6ef24e18e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:49 compute-1 nova_compute[182713]: 2026-01-21 23:43:49.584 182717 INFO nova.compute.manager [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Terminating instance
Jan 21 23:43:49 compute-1 nova_compute[182713]: 2026-01-21 23:43:49.596 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Acquiring lock "refresh_cache-3f2b9306-446d-4b0b-9db6-7a6ef24e18e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:43:49 compute-1 nova_compute[182713]: 2026-01-21 23:43:49.596 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Acquired lock "refresh_cache-3f2b9306-446d-4b0b-9db6-7a6ef24e18e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:43:49 compute-1 nova_compute[182713]: 2026-01-21 23:43:49.596 182717 DEBUG nova.network.neutron [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:43:49 compute-1 podman[211477]: 2026-01-21 23:43:49.599065327 +0000 UTC m=+0.067988660 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 23:43:49 compute-1 podman[211476]: 2026-01-21 23:43:49.601428183 +0000 UTC m=+0.076927739 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:43:50 compute-1 nova_compute[182713]: 2026-01-21 23:43:50.091 182717 DEBUG nova.network.neutron [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:43:50 compute-1 nova_compute[182713]: 2026-01-21 23:43:50.784 182717 DEBUG nova.network.neutron [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:43:50 compute-1 nova_compute[182713]: 2026-01-21 23:43:50.801 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Releasing lock "refresh_cache-3f2b9306-446d-4b0b-9db6-7a6ef24e18e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:43:50 compute-1 nova_compute[182713]: 2026-01-21 23:43:50.802 182717 DEBUG nova.compute.manager [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:43:50 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Deactivated successfully.
Jan 21 23:43:50 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Consumed 3.642s CPU time.
Jan 21 23:43:50 compute-1 systemd-machined[153970]: Machine qemu-1-instance-00000005 terminated.
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.049 182717 INFO nova.virt.libvirt.driver [-] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Instance destroyed successfully.
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.050 182717 DEBUG nova.objects.instance [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Lazy-loading 'resources' on Instance uuid 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.079 182717 INFO nova.virt.libvirt.driver [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Deleting instance files /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9_del
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.081 182717 INFO nova.virt.libvirt.driver [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Deletion of /var/lib/nova/instances/3f2b9306-446d-4b0b-9db6-7a6ef24e18e9_del complete
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.214 182717 DEBUG nova.virt.libvirt.host [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.215 182717 INFO nova.virt.libvirt.host [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] UEFI support detected
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.217 182717 INFO nova.compute.manager [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.218 182717 DEBUG oslo.service.loopingcall [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.218 182717 DEBUG nova.compute.manager [-] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.219 182717 DEBUG nova.network.neutron [-] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.723 182717 DEBUG nova.network.neutron [-] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.743 182717 DEBUG nova.network.neutron [-] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.768 182717 INFO nova.compute.manager [-] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Took 0.55 seconds to deallocate network for instance.
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.847 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.848 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.969 182717 DEBUG nova.compute.provider_tree [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:43:51 compute-1 nova_compute[182713]: 2026-01-21 23:43:51.984 182717 DEBUG nova.scheduler.client.report [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:43:52 compute-1 nova_compute[182713]: 2026-01-21 23:43:52.011 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:52 compute-1 nova_compute[182713]: 2026-01-21 23:43:52.036 182717 INFO nova.scheduler.client.report [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Deleted allocations for instance 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9
Jan 21 23:43:52 compute-1 nova_compute[182713]: 2026-01-21 23:43:52.120 182717 DEBUG oslo_concurrency.lockutils [None req-3357863f-5e04-434a-bf46-1bb2731e38af 39f708670a9d4fddacb6f05e3f163ec5 0f1b0c700e6249ddb68d0a766e8b3e38 - - default default] Lock "3f2b9306-446d-4b0b-9db6-7a6ef24e18e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:55 compute-1 nova_compute[182713]: 2026-01-21 23:43:55.916 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "1b4a8e44-319b-431f-b842-ebb9dd2413fe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:55 compute-1 nova_compute[182713]: 2026-01-21 23:43:55.917 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "1b4a8e44-319b-431f-b842-ebb9dd2413fe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:55 compute-1 nova_compute[182713]: 2026-01-21 23:43:55.948 182717 DEBUG nova.compute.manager [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.130 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.131 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.141 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.141 182717 INFO nova.compute.claims [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.350 182717 DEBUG nova.compute.provider_tree [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.369 182717 DEBUG nova.scheduler.client.report [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.395 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.396 182717 DEBUG nova.compute.manager [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.513 182717 DEBUG nova.compute.manager [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.514 182717 DEBUG nova.network.neutron [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.538 182717 INFO nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.561 182717 DEBUG nova.compute.manager [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.763 182717 DEBUG nova.compute.manager [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.766 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.767 182717 INFO nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Creating image(s)
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.768 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "/var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.768 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "/var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.770 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "/var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.800 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.868 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.869 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.870 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.893 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.956 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.957 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.996 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:56 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.998 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:56.999 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:57.060 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:57.062 182717 DEBUG nova.virt.disk.api [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Checking if we can resize image /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:57.062 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:57.123 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:57.125 182717 DEBUG nova.virt.disk.api [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Cannot resize image /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:57.126 182717 DEBUG nova.objects.instance [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lazy-loading 'migration_context' on Instance uuid 1b4a8e44-319b-431f-b842-ebb9dd2413fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:57.149 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:57.150 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Ensure instance console log exists: /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:57.150 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:57.151 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:57 compute-1 nova_compute[182713]: 2026-01-21 23:43:57.151 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.428 182717 DEBUG nova.network.neutron [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.428 182717 DEBUG nova.compute.manager [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.430 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.435 182717 WARNING nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.440 182717 DEBUG nova.virt.libvirt.host [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.440 182717 DEBUG nova.virt.libvirt.host [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.446 182717 DEBUG nova.virt.libvirt.host [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.446 182717 DEBUG nova.virt.libvirt.host [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.447 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.447 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.448 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.448 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.448 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.448 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.449 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.449 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.449 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.449 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.449 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.450 182717 DEBUG nova.virt.hardware [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.452 182717 DEBUG nova.objects.instance [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b4a8e44-319b-431f-b842-ebb9dd2413fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.489 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:43:58 compute-1 nova_compute[182713]:   <uuid>1b4a8e44-319b-431f-b842-ebb9dd2413fe</uuid>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   <name>instance-00000006</name>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <nova:name>tempest-DeleteServersAdminTestJSON-server-625835571</nova:name>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:43:58</nova:creationTime>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:43:58 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:43:58 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:43:58 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:43:58 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:43:58 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:43:58 compute-1 nova_compute[182713]:         <nova:user uuid="c11b574ed1e849b1b0f45e29432ef4d6">tempest-DeleteServersAdminTestJSON-138472938-project-member</nova:user>
Jan 21 23:43:58 compute-1 nova_compute[182713]:         <nova:project uuid="a182ea531d6a4192aff53c844249974e">tempest-DeleteServersAdminTestJSON-138472938</nova:project>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <system>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <entry name="serial">1b4a8e44-319b-431f-b842-ebb9dd2413fe</entry>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <entry name="uuid">1b4a8e44-319b-431f-b842-ebb9dd2413fe</entry>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     </system>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   <os>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   </os>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   <features>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   </features>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk.config"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/console.log" append="off"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <video>
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     </video>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:43:58 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:43:58 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:43:58 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:43:58 compute-1 nova_compute[182713]: </domain>
Jan 21 23:43:58 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.563 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.563 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:43:58 compute-1 nova_compute[182713]: 2026-01-21 23:43:58.564 182717 INFO nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Using config drive
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.005 182717 INFO nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Creating config drive at /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk.config
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.016 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd0x6tsss execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.144 182717 DEBUG oslo_concurrency.processutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd0x6tsss" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:59 compute-1 systemd-machined[153970]: New machine qemu-2-instance-00000006.
Jan 21 23:43:59 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000006.
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.686 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039039.6851215, 1b4a8e44-319b-431f-b842-ebb9dd2413fe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.687 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] VM Resumed (Lifecycle Event)
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.691 182717 DEBUG nova.compute.manager [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.692 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.698 182717 INFO nova.virt.libvirt.driver [-] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Instance spawned successfully.
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.699 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:43:59 compute-1 podman[211566]: 2026-01-21 23:43:59.703755937 +0000 UTC m=+0.101038533 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.717 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.724 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.730 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.731 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.731 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.732 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.732 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.733 182717 DEBUG nova.virt.libvirt.driver [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.767 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.768 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039039.6865904, 1b4a8e44-319b-431f-b842-ebb9dd2413fe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.768 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] VM Started (Lifecycle Event)
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.804 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.809 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.851 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.885 182717 INFO nova.compute.manager [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Took 3.12 seconds to spawn the instance on the hypervisor.
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.885 182717 DEBUG nova.compute.manager [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:43:59 compute-1 nova_compute[182713]: 2026-01-21 23:43:59.986 182717 INFO nova.compute.manager [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Took 3.90 seconds to build instance.
Jan 21 23:44:00 compute-1 nova_compute[182713]: 2026-01-21 23:44:00.013 182717 DEBUG oslo_concurrency.lockutils [None req-72b77d2f-d8bc-4b8d-a922-fbf132ebb242 c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "1b4a8e44-319b-431f-b842-ebb9dd2413fe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:01 compute-1 nova_compute[182713]: 2026-01-21 23:44:01.711 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "1b4a8e44-319b-431f-b842-ebb9dd2413fe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:01 compute-1 nova_compute[182713]: 2026-01-21 23:44:01.712 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "1b4a8e44-319b-431f-b842-ebb9dd2413fe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:01 compute-1 nova_compute[182713]: 2026-01-21 23:44:01.712 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "1b4a8e44-319b-431f-b842-ebb9dd2413fe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:01 compute-1 nova_compute[182713]: 2026-01-21 23:44:01.712 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "1b4a8e44-319b-431f-b842-ebb9dd2413fe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:01 compute-1 nova_compute[182713]: 2026-01-21 23:44:01.713 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "1b4a8e44-319b-431f-b842-ebb9dd2413fe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:01 compute-1 nova_compute[182713]: 2026-01-21 23:44:01.726 182717 INFO nova.compute.manager [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Terminating instance
Jan 21 23:44:01 compute-1 nova_compute[182713]: 2026-01-21 23:44:01.736 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "refresh_cache-1b4a8e44-319b-431f-b842-ebb9dd2413fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:44:01 compute-1 nova_compute[182713]: 2026-01-21 23:44:01.737 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquired lock "refresh_cache-1b4a8e44-319b-431f-b842-ebb9dd2413fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:44:01 compute-1 nova_compute[182713]: 2026-01-21 23:44:01.737 182717 DEBUG nova.network.neutron [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.121 182717 DEBUG nova.network.neutron [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.461 182717 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Automatically allocated network: {'id': '48de92c9-2a56-4dfe-a16e-fe0d52617564', 'name': 'auto_allocated_network', 'tenant_id': '8981554bfb65485a9218dab7f347822d', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['91f54b37-b5bc-463a-931b-a34707078f9d', 'd15c7507-6da8-4e7a-bb36-7b411b2b575a'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2026-01-21T23:43:42Z', 'updated_at': '2026-01-21T23:43:54Z', 'revision_number': 4, 'project_id': '8981554bfb65485a9218dab7f347822d'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.472 182717 WARNING oslo_policy.policy [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.472 182717 WARNING oslo_policy.policy [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.475 182717 DEBUG nova.policy [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:44:02 compute-1 podman[211589]: 2026-01-21 23:44:02.594148085 +0000 UTC m=+0.080926340 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.631 182717 DEBUG nova.network.neutron [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.653 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Releasing lock "refresh_cache-1b4a8e44-319b-431f-b842-ebb9dd2413fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.654 182717 DEBUG nova.compute.manager [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:44:02 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 21 23:44:02 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Consumed 3.384s CPU time.
Jan 21 23:44:02 compute-1 systemd-machined[153970]: Machine qemu-2-instance-00000006 terminated.
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.909 182717 INFO nova.virt.libvirt.driver [-] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Instance destroyed successfully.
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.910 182717 DEBUG nova.objects.instance [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lazy-loading 'resources' on Instance uuid 1b4a8e44-319b-431f-b842-ebb9dd2413fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.934 182717 INFO nova.virt.libvirt.driver [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Deleting instance files /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe_del
Jan 21 23:44:02 compute-1 nova_compute[182713]: 2026-01-21 23:44:02.936 182717 INFO nova.virt.libvirt.driver [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Deletion of /var/lib/nova/instances/1b4a8e44-319b-431f-b842-ebb9dd2413fe_del complete
Jan 21 23:44:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:02.992 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:02.992 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:02.993 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.028 182717 INFO nova.compute.manager [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.029 182717 DEBUG oslo.service.loopingcall [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.029 182717 DEBUG nova.compute.manager [-] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.030 182717 DEBUG nova.network.neutron [-] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.266 182717 DEBUG nova.network.neutron [-] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.298 182717 DEBUG nova.network.neutron [-] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.318 182717 INFO nova.compute.manager [-] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Took 0.29 seconds to deallocate network for instance.
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.418 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.418 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.540 182717 DEBUG nova.compute.provider_tree [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.559 182717 DEBUG nova.scheduler.client.report [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.583 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.614 182717 INFO nova.scheduler.client.report [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Deleted allocations for instance 1b4a8e44-319b-431f-b842-ebb9dd2413fe
Jan 21 23:44:03 compute-1 nova_compute[182713]: 2026-01-21 23:44:03.711 182717 DEBUG oslo_concurrency.lockutils [None req-4af86b8c-72f8-4d2a-80a8-c3623faf028a c11b574ed1e849b1b0f45e29432ef4d6 a182ea531d6a4192aff53c844249974e - - default default] Lock "1b4a8e44-319b-431f-b842-ebb9dd2413fe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.999s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:04 compute-1 nova_compute[182713]: 2026-01-21 23:44:04.828 182717 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Successfully created port: 3299b15c-00f0-4c59-9f02-44cb8d762ae2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:44:06 compute-1 nova_compute[182713]: 2026-01-21 23:44:06.047 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039031.0458155, 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:06 compute-1 nova_compute[182713]: 2026-01-21 23:44:06.048 182717 INFO nova.compute.manager [-] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] VM Stopped (Lifecycle Event)
Jan 21 23:44:06 compute-1 nova_compute[182713]: 2026-01-21 23:44:06.078 182717 DEBUG nova.compute.manager [None req-d16e8a62-3835-491d-84b8-603f51822798 - - - - - -] [instance: 3f2b9306-446d-4b0b-9db6-7a6ef24e18e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:06 compute-1 nova_compute[182713]: 2026-01-21 23:44:06.600 182717 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Successfully updated port: 3299b15c-00f0-4c59-9f02-44cb8d762ae2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:44:06 compute-1 nova_compute[182713]: 2026-01-21 23:44:06.624 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "refresh_cache-6f64b039-da3a-47ef-9a52-b259b890a077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:44:06 compute-1 nova_compute[182713]: 2026-01-21 23:44:06.626 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquired lock "refresh_cache-6f64b039-da3a-47ef-9a52-b259b890a077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:44:06 compute-1 nova_compute[182713]: 2026-01-21 23:44:06.626 182717 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:44:06 compute-1 nova_compute[182713]: 2026-01-21 23:44:06.854 182717 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:44:07 compute-1 nova_compute[182713]: 2026-01-21 23:44:07.279 182717 DEBUG nova.compute.manager [req-a58d16a5-9d2e-4030-91ef-38c5cbf834a1 req-0e6e9aa6-38cc-4c32-8dbc-47dd691ac13f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Received event network-changed-3299b15c-00f0-4c59-9f02-44cb8d762ae2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:07 compute-1 nova_compute[182713]: 2026-01-21 23:44:07.280 182717 DEBUG nova.compute.manager [req-a58d16a5-9d2e-4030-91ef-38c5cbf834a1 req-0e6e9aa6-38cc-4c32-8dbc-47dd691ac13f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Refreshing instance network info cache due to event network-changed-3299b15c-00f0-4c59-9f02-44cb8d762ae2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:44:07 compute-1 nova_compute[182713]: 2026-01-21 23:44:07.280 182717 DEBUG oslo_concurrency.lockutils [req-a58d16a5-9d2e-4030-91ef-38c5cbf834a1 req-0e6e9aa6-38cc-4c32-8dbc-47dd691ac13f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6f64b039-da3a-47ef-9a52-b259b890a077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.639 182717 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Updating instance_info_cache with network_info: [{"id": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "address": "fa:16:3e:61:33:0b", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::25e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3299b15c-00", "ovs_interfaceid": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.688 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Releasing lock "refresh_cache-6f64b039-da3a-47ef-9a52-b259b890a077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.688 182717 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Instance network_info: |[{"id": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "address": "fa:16:3e:61:33:0b", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::25e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3299b15c-00", "ovs_interfaceid": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.689 182717 DEBUG oslo_concurrency.lockutils [req-a58d16a5-9d2e-4030-91ef-38c5cbf834a1 req-0e6e9aa6-38cc-4c32-8dbc-47dd691ac13f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6f64b039-da3a-47ef-9a52-b259b890a077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.689 182717 DEBUG nova.network.neutron [req-a58d16a5-9d2e-4030-91ef-38c5cbf834a1 req-0e6e9aa6-38cc-4c32-8dbc-47dd691ac13f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Refreshing network info cache for port 3299b15c-00f0-4c59-9f02-44cb8d762ae2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.692 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Start _get_guest_xml network_info=[{"id": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "address": "fa:16:3e:61:33:0b", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::25e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3299b15c-00", "ovs_interfaceid": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.698 182717 WARNING nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.703 182717 DEBUG nova.virt.libvirt.host [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.704 182717 DEBUG nova.virt.libvirt.host [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.707 182717 DEBUG nova.virt.libvirt.host [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.708 182717 DEBUG nova.virt.libvirt.host [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.709 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.709 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.710 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.710 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.710 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.710 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.711 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.711 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.711 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.711 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.712 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.712 182717 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.715 182717 DEBUG nova.virt.libvirt.vif [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:43:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1377062952-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1377062952-1',id=2,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8981554bfb65485a9218dab7f347822d',ramdisk_id='',reservation_id='r-ourrdd3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1853609216',owner_user_name='tempest-AutoAllocateNetworkTest-1853609216-project-member'},tags
=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:43:40Z,user_data=None,user_id='f92dd0c2072346c6b7e7588673443ff2',uuid=6f64b039-da3a-47ef-9a52-b259b890a077,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "address": "fa:16:3e:61:33:0b", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::25e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3299b15c-00", "ovs_interfaceid": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.716 182717 DEBUG nova.network.os_vif_util [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converting VIF {"id": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "address": "fa:16:3e:61:33:0b", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::25e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3299b15c-00", "ovs_interfaceid": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.717 182717 DEBUG nova.network.os_vif_util [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:33:0b,bridge_name='br-int',has_traffic_filtering=True,id=3299b15c-00f0-4c59-9f02-44cb8d762ae2,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3299b15c-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.720 182717 DEBUG nova.objects.instance [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lazy-loading 'pci_devices' on Instance uuid 6f64b039-da3a-47ef-9a52-b259b890a077 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.744 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:44:09 compute-1 nova_compute[182713]:   <uuid>6f64b039-da3a-47ef-9a52-b259b890a077</uuid>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   <name>instance-00000002</name>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <nova:name>tempest-tempest.common.compute-instance-1377062952-1</nova:name>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:44:09</nova:creationTime>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:44:09 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:44:09 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:44:09 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:44:09 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:44:09 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:44:09 compute-1 nova_compute[182713]:         <nova:user uuid="f92dd0c2072346c6b7e7588673443ff2">tempest-AutoAllocateNetworkTest-1853609216-project-member</nova:user>
Jan 21 23:44:09 compute-1 nova_compute[182713]:         <nova:project uuid="8981554bfb65485a9218dab7f347822d">tempest-AutoAllocateNetworkTest-1853609216</nova:project>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:44:09 compute-1 nova_compute[182713]:         <nova:port uuid="3299b15c-00f0-4c59-9f02-44cb8d762ae2">
Jan 21 23:44:09 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.1.0.40" ipVersion="4"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="fdfe:381f:8400::25e" ipVersion="6"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <system>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <entry name="serial">6f64b039-da3a-47ef-9a52-b259b890a077</entry>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <entry name="uuid">6f64b039-da3a-47ef-9a52-b259b890a077</entry>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     </system>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   <os>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   </os>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   <features>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   </features>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk.config"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:61:33:0b"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <target dev="tap3299b15c-00"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/console.log" append="off"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <video>
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     </video>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:44:09 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:44:09 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:44:09 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:44:09 compute-1 nova_compute[182713]: </domain>
Jan 21 23:44:09 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.745 182717 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Preparing to wait for external event network-vif-plugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.746 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.747 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.747 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.748 182717 DEBUG nova.virt.libvirt.vif [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:43:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1377062952-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1377062952-1',id=2,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8981554bfb65485a9218dab7f347822d',ramdisk_id='',reservation_id='r-ourrdd3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1853609216',owner_user_name='tempest-AutoAllocateNetworkTest-1853609216-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:43:40Z,user_data=None,user_id='f92dd0c2072346c6b7e7588673443ff2',uuid=6f64b039-da3a-47ef-9a52-b259b890a077,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "address": "fa:16:3e:61:33:0b", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::25e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3299b15c-00", "ovs_interfaceid": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.749 182717 DEBUG nova.network.os_vif_util [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converting VIF {"id": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "address": "fa:16:3e:61:33:0b", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::25e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3299b15c-00", "ovs_interfaceid": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.751 182717 DEBUG nova.network.os_vif_util [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:33:0b,bridge_name='br-int',has_traffic_filtering=True,id=3299b15c-00f0-4c59-9f02-44cb8d762ae2,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3299b15c-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.751 182717 DEBUG os_vif [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:33:0b,bridge_name='br-int',has_traffic_filtering=True,id=3299b15c-00f0-4c59-9f02-44cb8d762ae2,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3299b15c-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.816 182717 DEBUG ovsdbapp.backend.ovs_idl [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.816 182717 DEBUG ovsdbapp.backend.ovs_idl [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.816 182717 DEBUG ovsdbapp.backend.ovs_idl [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.817 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.817 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [POLLOUT] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.818 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.818 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.820 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.823 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.835 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.835 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.835 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:44:09 compute-1 nova_compute[182713]: 2026-01-21 23:44:09.837 182717 INFO oslo.privsep.daemon [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpsh_1yuz_/privsep.sock']
Jan 21 23:44:10 compute-1 podman[211626]: 2026-01-21 23:44:10.56344613 +0000 UTC m=+0.053527230 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:44:10 compute-1 podman[211625]: 2026-01-21 23:44:10.646369363 +0000 UTC m=+0.130999776 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible)
Jan 21 23:44:10 compute-1 nova_compute[182713]: 2026-01-21 23:44:10.685 182717 INFO oslo.privsep.daemon [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Spawned new privsep daemon via rootwrap
Jan 21 23:44:10 compute-1 nova_compute[182713]: 2026-01-21 23:44:10.480 211624 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 23:44:10 compute-1 nova_compute[182713]: 2026-01-21 23:44:10.485 211624 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 23:44:10 compute-1 nova_compute[182713]: 2026-01-21 23:44:10.486 211624 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 21 23:44:10 compute-1 nova_compute[182713]: 2026-01-21 23:44:10.487 211624 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211624
Jan 21 23:44:10 compute-1 nova_compute[182713]: 2026-01-21 23:44:10.995 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:10 compute-1 nova_compute[182713]: 2026-01-21 23:44:10.996 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3299b15c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:10 compute-1 nova_compute[182713]: 2026-01-21 23:44:10.997 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3299b15c-00, col_values=(('external_ids', {'iface-id': '3299b15c-00f0-4c59-9f02-44cb8d762ae2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:33:0b', 'vm-uuid': '6f64b039-da3a-47ef-9a52-b259b890a077'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:10 compute-1 nova_compute[182713]: 2026-01-21 23:44:10.999 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:11 compute-1 NetworkManager[54952]: <info>  [1769039051.0004] manager: (tap3299b15c-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.001 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.007 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.008 182717 INFO os_vif [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:33:0b,bridge_name='br-int',has_traffic_filtering=True,id=3299b15c-00f0-4c59-9f02-44cb8d762ae2,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3299b15c-00')
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.086 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Acquiring lock "5a9ebed0-dcef-427e-a805-574905569389" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.087 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.090 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.090 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.090 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] No VIF found with MAC fa:16:3e:61:33:0b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.091 182717 INFO nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Using config drive
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.117 182717 DEBUG nova.compute.manager [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.227 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.228 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.233 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.234 182717 INFO nova.compute.claims [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.523 182717 DEBUG nova.compute.provider_tree [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.564 182717 DEBUG nova.scheduler.client.report [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.877 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.878 182717 DEBUG nova.compute.manager [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.980 182717 DEBUG nova.compute.manager [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:44:11 compute-1 nova_compute[182713]: 2026-01-21 23:44:11.981 182717 DEBUG nova.network.neutron [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.006 182717 INFO nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.027 182717 DEBUG nova.compute.manager [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.101 182717 INFO nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Creating config drive at /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk.config
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.106 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppentkb_5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.175 182717 DEBUG nova.compute.manager [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.181 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.182 182717 INFO nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Creating image(s)
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.183 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Acquiring lock "/var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.184 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "/var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.185 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "/var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.220 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.252 182717 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppentkb_5" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.316 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.317 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.318 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.329 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:12 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 21 23:44:12 compute-1 kernel: tap3299b15c-00: entered promiscuous mode
Jan 21 23:44:12 compute-1 NetworkManager[54952]: <info>  [1769039052.3460] manager: (tap3299b15c-00): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Jan 21 23:44:12 compute-1 ovn_controller[94841]: 2026-01-21T23:44:12Z|00027|binding|INFO|Claiming lport 3299b15c-00f0-4c59-9f02-44cb8d762ae2 for this chassis.
Jan 21 23:44:12 compute-1 ovn_controller[94841]: 2026-01-21T23:44:12Z|00028|binding|INFO|3299b15c-00f0-4c59-9f02-44cb8d762ae2: Claiming fa:16:3e:61:33:0b 10.1.0.40 fdfe:381f:8400::25e
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.349 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:12 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:12.368 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:33:0b 10.1.0.40 fdfe:381f:8400::25e'], port_security=['fa:16:3e:61:33:0b 10.1.0.40 fdfe:381f:8400::25e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.40/26 fdfe:381f:8400::25e/64', 'neutron:device_id': '6f64b039-da3a-47ef-9a52-b259b890a077', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8981554bfb65485a9218dab7f347822d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e4bb4842-7cc7-47df-ad92-e426d20758f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=343b670f-2d8d-4f56-9cb9-7d9682347428, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=3299b15c-00f0-4c59-9f02-44cb8d762ae2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:44:12 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:12.372 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 3299b15c-00f0-4c59-9f02-44cb8d762ae2 in datapath 48de92c9-2a56-4dfe-a16e-fe0d52617564 bound to our chassis
Jan 21 23:44:12 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:12.381 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48de92c9-2a56-4dfe-a16e-fe0d52617564
Jan 21 23:44:12 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:12.383 104184 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpftu6lxts/privsep.sock']
Jan 21 23:44:12 compute-1 systemd-udevd[211704]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.394 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.395 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:12 compute-1 NetworkManager[54952]: <info>  [1769039052.4069] device (tap3299b15c-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:44:12 compute-1 NetworkManager[54952]: <info>  [1769039052.4075] device (tap3299b15c-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:44:12 compute-1 systemd-machined[153970]: New machine qemu-3-instance-00000002.
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.433 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:12 compute-1 ovn_controller[94841]: 2026-01-21T23:44:12Z|00029|binding|INFO|Setting lport 3299b15c-00f0-4c59-9f02-44cb8d762ae2 ovn-installed in OVS
Jan 21 23:44:12 compute-1 ovn_controller[94841]: 2026-01-21T23:44:12Z|00030|binding|INFO|Setting lport 3299b15c-00f0-4c59-9f02-44cb8d762ae2 up in Southbound
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.440 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.443 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.444 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.444 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:12 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-00000002.
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.505 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.507 182717 DEBUG nova.virt.disk.api [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Checking if we can resize image /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.507 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.568 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.570 182717 DEBUG nova.virt.disk.api [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Cannot resize image /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.570 182717 DEBUG nova.objects.instance [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lazy-loading 'migration_context' on Instance uuid 5a9ebed0-dcef-427e-a805-574905569389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.597 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.598 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Ensure instance console log exists: /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.599 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.599 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.599 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:12 compute-1 nova_compute[182713]: 2026-01-21 23:44:12.681 182717 DEBUG nova.policy [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:13.157 104184 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 21 23:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:13.157 104184 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpftu6lxts/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 21 23:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:13.018 211733 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 23:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:13.026 211733 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 23:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:13.029 211733 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 21 23:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:13.029 211733 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211733
Jan 21 23:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:13.161 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5a885274-789a-4bdd-b3ff-41148b397478]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.224 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039053.2238882, 6f64b039-da3a-47ef-9a52-b259b890a077 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.225 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] VM Started (Lifecycle Event)
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.249 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.255 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039053.2244897, 6f64b039-da3a-47ef-9a52-b259b890a077 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.255 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] VM Paused (Lifecycle Event)
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.300 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.305 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.335 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.375 182717 DEBUG nova.compute.manager [req-18d97e7f-9ab1-46d7-a50e-ceb85c578c9b req-6998a606-ce13-4fe5-b6e9-2874ff5cc7ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Received event network-vif-plugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.376 182717 DEBUG oslo_concurrency.lockutils [req-18d97e7f-9ab1-46d7-a50e-ceb85c578c9b req-6998a606-ce13-4fe5-b6e9-2874ff5cc7ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.376 182717 DEBUG oslo_concurrency.lockutils [req-18d97e7f-9ab1-46d7-a50e-ceb85c578c9b req-6998a606-ce13-4fe5-b6e9-2874ff5cc7ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.377 182717 DEBUG oslo_concurrency.lockutils [req-18d97e7f-9ab1-46d7-a50e-ceb85c578c9b req-6998a606-ce13-4fe5-b6e9-2874ff5cc7ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.377 182717 DEBUG nova.compute.manager [req-18d97e7f-9ab1-46d7-a50e-ceb85c578c9b req-6998a606-ce13-4fe5-b6e9-2874ff5cc7ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Processing event network-vif-plugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.378 182717 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.382 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039053.382554, 6f64b039-da3a-47ef-9a52-b259b890a077 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.383 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] VM Resumed (Lifecycle Event)
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.388 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.393 182717 INFO nova.virt.libvirt.driver [-] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Instance spawned successfully.
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.394 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.429 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.436 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.437 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.438 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.438 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.439 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.440 182717 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.450 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.500 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.555 182717 INFO nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Took 32.43 seconds to spawn the instance on the hypervisor.
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.556 182717 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:13.660 211733 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:13.661 211733 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:13.661 211733 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.665 182717 INFO nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Took 33.69 seconds to build instance.
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.690 182717 DEBUG nova.network.neutron [req-a58d16a5-9d2e-4030-91ef-38c5cbf834a1 req-0e6e9aa6-38cc-4c32-8dbc-47dd691ac13f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Updated VIF entry in instance network info cache for port 3299b15c-00f0-4c59-9f02-44cb8d762ae2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.690 182717 DEBUG nova.network.neutron [req-a58d16a5-9d2e-4030-91ef-38c5cbf834a1 req-0e6e9aa6-38cc-4c32-8dbc-47dd691ac13f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Updating instance_info_cache with network_info: [{"id": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "address": "fa:16:3e:61:33:0b", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::25e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3299b15c-00", "ovs_interfaceid": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.693 182717 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 33.902s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.713 182717 DEBUG oslo_concurrency.lockutils [req-a58d16a5-9d2e-4030-91ef-38c5cbf834a1 req-0e6e9aa6-38cc-4c32-8dbc-47dd691ac13f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6f64b039-da3a-47ef-9a52-b259b890a077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:44:13 compute-1 nova_compute[182713]: 2026-01-21 23:44:13.876 182717 DEBUG nova.network.neutron [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Successfully created port: 37f1f8d1-f4ea-416d-ba73-6fbc611802be _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.212 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3bad4159-1a9a-4d37-af49-4d139791f276]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.227 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48de92c9-21 in ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:44:14 compute-1 nova_compute[182713]: 2026-01-21 23:44:14.229 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.229 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48de92c9-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.229 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9ab65b-9752-497f-bd15-93c77409eb5d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.234 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[aae56ced-3979-4741-a9b4-729e6bd4ac15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.264 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[fdec2252-457d-433e-abcd-ff535ffe535e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.287 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0925bf-e6b2-49c8-b32f-1b1ca0e4b740]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.289 104184 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp_fwkfqlv/privsep.sock']
Jan 21 23:44:15 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.998 104184 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 21 23:44:15 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:15.000 104184 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp_fwkfqlv/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 21 23:44:15 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.878 211754 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 23:44:15 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.886 211754 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 23:44:15 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.890 211754 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 21 23:44:15 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:14.890 211754 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211754
Jan 21 23:44:15 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:15.003 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a4594865-3db6-492b-bff3-38c99c9b1f76]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:15 compute-1 nova_compute[182713]: 2026-01-21 23:44:15.120 182717 DEBUG nova.network.neutron [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Successfully updated port: 37f1f8d1-f4ea-416d-ba73-6fbc611802be _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:44:15 compute-1 nova_compute[182713]: 2026-01-21 23:44:15.143 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Acquiring lock "refresh_cache-5a9ebed0-dcef-427e-a805-574905569389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:44:15 compute-1 nova_compute[182713]: 2026-01-21 23:44:15.143 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Acquired lock "refresh_cache-5a9ebed0-dcef-427e-a805-574905569389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:44:15 compute-1 nova_compute[182713]: 2026-01-21 23:44:15.144 182717 DEBUG nova.network.neutron [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:44:15 compute-1 nova_compute[182713]: 2026-01-21 23:44:15.475 182717 DEBUG nova.compute.manager [req-f2cc75ab-88bc-4c63-a39f-f3c0c14aab5a req-e0a2cba5-aa45-4ea8-803e-09a4119dea78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Received event network-vif-plugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:15 compute-1 nova_compute[182713]: 2026-01-21 23:44:15.476 182717 DEBUG oslo_concurrency.lockutils [req-f2cc75ab-88bc-4c63-a39f-f3c0c14aab5a req-e0a2cba5-aa45-4ea8-803e-09a4119dea78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:15 compute-1 nova_compute[182713]: 2026-01-21 23:44:15.476 182717 DEBUG oslo_concurrency.lockutils [req-f2cc75ab-88bc-4c63-a39f-f3c0c14aab5a req-e0a2cba5-aa45-4ea8-803e-09a4119dea78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:15 compute-1 nova_compute[182713]: 2026-01-21 23:44:15.476 182717 DEBUG oslo_concurrency.lockutils [req-f2cc75ab-88bc-4c63-a39f-f3c0c14aab5a req-e0a2cba5-aa45-4ea8-803e-09a4119dea78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:15 compute-1 nova_compute[182713]: 2026-01-21 23:44:15.476 182717 DEBUG nova.compute.manager [req-f2cc75ab-88bc-4c63-a39f-f3c0c14aab5a req-e0a2cba5-aa45-4ea8-803e-09a4119dea78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] No waiting events found dispatching network-vif-plugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:44:15 compute-1 nova_compute[182713]: 2026-01-21 23:44:15.477 182717 WARNING nova.compute.manager [req-f2cc75ab-88bc-4c63-a39f-f3c0c14aab5a req-e0a2cba5-aa45-4ea8-803e-09a4119dea78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Received unexpected event network-vif-plugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 for instance with vm_state active and task_state None.
Jan 21 23:44:15 compute-1 nova_compute[182713]: 2026-01-21 23:44:15.525 182717 DEBUG nova.network.neutron [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:44:15 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:15.578 211754 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:15 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:15.578 211754 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:15 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:15.578 211754 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.000 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.196 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[d5795825-f5e0-417c-bea5-c1d2edcf1b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.230 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[948cb816-e653-4735-a05a-231ac55caa7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:16 compute-1 NetworkManager[54952]: <info>  [1769039056.2314] manager: (tap48de92c9-20): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Jan 21 23:44:16 compute-1 systemd-udevd[211766]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.267 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b91c86f3-e1aa-4a3e-9d11-6d23c119b99e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.272 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[171fa86a-6567-4797-91e6-a6f712931fb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:16 compute-1 NetworkManager[54952]: <info>  [1769039056.3141] device (tap48de92c9-20): carrier: link connected
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.324 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ef5c3362-47bf-4797-98d7-c44068e1aebd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.354 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5c4c5d0e-d3f3-403c-838e-2ddcc292b17f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48de92c9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:15:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362896, 'reachable_time': 39880, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211784, 'error': None, 'target': 'ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.378 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b8c428-86ff-4f2f-9a76-fd6a0b062953]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:155d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 362896, 'tstamp': 362896}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211785, 'error': None, 'target': 'ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.404 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e1227e23-0e1c-476a-a7ef-3f9744118c0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48de92c9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:15:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362896, 'reachable_time': 39880, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211786, 'error': None, 'target': 'ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.450 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[74f409e1-77df-4e23-93ca-46f1db41666f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.542 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[de0af770-6f11-4845-8e7f-40ad9ebc2ce9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.544 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48de92c9-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.545 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.546 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48de92c9-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.548 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:16 compute-1 kernel: tap48de92c9-20: entered promiscuous mode
Jan 21 23:44:16 compute-1 NetworkManager[54952]: <info>  [1769039056.5494] manager: (tap48de92c9-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.552 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48de92c9-20, col_values=(('external_ids', {'iface-id': '07b3db2a-1439-4d24-a2d8-d2c29586e870'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.553 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:16 compute-1 ovn_controller[94841]: 2026-01-21T23:44:16Z|00031|binding|INFO|Releasing lport 07b3db2a-1439-4d24-a2d8-d2c29586e870 from this chassis (sb_readonly=0)
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.565 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.566 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48de92c9-2a56-4dfe-a16e-fe0d52617564.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48de92c9-2a56-4dfe-a16e-fe0d52617564.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.567 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[691259ab-8c1e-4a19-b55a-6ca4f8d24fce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.569 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-48de92c9-2a56-4dfe-a16e-fe0d52617564
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/48de92c9-2a56-4dfe-a16e-fe0d52617564.pid.haproxy
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 48de92c9-2a56-4dfe-a16e-fe0d52617564
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:44:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:16.569 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'env', 'PROCESS_TAG=haproxy-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48de92c9-2a56-4dfe-a16e-fe0d52617564.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.831 182717 DEBUG nova.compute.manager [req-a71bd7e4-6b6d-427a-bbad-58b7d1852e2d req-48da06c6-0310-4056-8459-45b36dbad8ac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Received event network-changed-37f1f8d1-f4ea-416d-ba73-6fbc611802be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.832 182717 DEBUG nova.compute.manager [req-a71bd7e4-6b6d-427a-bbad-58b7d1852e2d req-48da06c6-0310-4056-8459-45b36dbad8ac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Refreshing instance network info cache due to event network-changed-37f1f8d1-f4ea-416d-ba73-6fbc611802be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.833 182717 DEBUG oslo_concurrency.lockutils [req-a71bd7e4-6b6d-427a-bbad-58b7d1852e2d req-48da06c6-0310-4056-8459-45b36dbad8ac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-5a9ebed0-dcef-427e-a805-574905569389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:44:16 compute-1 nova_compute[182713]: 2026-01-21 23:44:16.882 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.001 182717 DEBUG nova.network.neutron [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Updating instance_info_cache with network_info: [{"id": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "address": "fa:16:3e:42:f2:4a", "network": {"id": "b48e4868-bf7a-4c2f-b6e3-2d98c35b44de", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-746822329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2046237dff224b498e14ad59b5822ac1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f1f8d1-f4", "ovs_interfaceid": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:17 compute-1 podman[211819]: 2026-01-21 23:44:17.034917741 +0000 UTC m=+0.075713700 container create f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:44:17 compute-1 systemd[1]: Started libpod-conmon-f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895.scope.
Jan 21 23:44:17 compute-1 podman[211819]: 2026-01-21 23:44:17.004276396 +0000 UTC m=+0.045072415 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:44:17 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:44:17 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a4c145370d51ad3a8b50c02d89ba624b251bc35c11fac7e86c83cd353233bcb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:44:17 compute-1 podman[211819]: 2026-01-21 23:44:17.16911317 +0000 UTC m=+0.209909179 container init f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:44:17 compute-1 podman[211819]: 2026-01-21 23:44:17.178702942 +0000 UTC m=+0.219498941 container start f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:44:17 compute-1 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[211834]: [NOTICE]   (211838) : New worker (211840) forked
Jan 21 23:44:17 compute-1 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[211834]: [NOTICE]   (211838) : Loading success.
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.322 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Releasing lock "refresh_cache-5a9ebed0-dcef-427e-a805-574905569389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.323 182717 DEBUG nova.compute.manager [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Instance network_info: |[{"id": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "address": "fa:16:3e:42:f2:4a", "network": {"id": "b48e4868-bf7a-4c2f-b6e3-2d98c35b44de", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-746822329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2046237dff224b498e14ad59b5822ac1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f1f8d1-f4", "ovs_interfaceid": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.324 182717 DEBUG oslo_concurrency.lockutils [req-a71bd7e4-6b6d-427a-bbad-58b7d1852e2d req-48da06c6-0310-4056-8459-45b36dbad8ac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-5a9ebed0-dcef-427e-a805-574905569389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.324 182717 DEBUG nova.network.neutron [req-a71bd7e4-6b6d-427a-bbad-58b7d1852e2d req-48da06c6-0310-4056-8459-45b36dbad8ac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Refreshing network info cache for port 37f1f8d1-f4ea-416d-ba73-6fbc611802be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.327 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Start _get_guest_xml network_info=[{"id": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "address": "fa:16:3e:42:f2:4a", "network": {"id": "b48e4868-bf7a-4c2f-b6e3-2d98c35b44de", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-746822329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2046237dff224b498e14ad59b5822ac1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f1f8d1-f4", "ovs_interfaceid": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.333 182717 WARNING nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.351 182717 DEBUG nova.virt.libvirt.host [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.353 182717 DEBUG nova.virt.libvirt.host [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.357 182717 DEBUG nova.virt.libvirt.host [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.359 182717 DEBUG nova.virt.libvirt.host [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.361 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.362 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:44:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='17063997',id=10,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-176066592',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.363 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.364 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.364 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.365 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.365 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.366 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.367 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.367 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.368 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.368 182717 DEBUG nova.virt.hardware [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.373 182717 DEBUG nova.virt.libvirt.vif [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:44:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1964291827',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1964291827',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(10),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1964291827',id=7,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=10,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMc2S1xHovsuR3v2I5l9J02AovvG576qE+PLZ2JMDE3YonbQUTzxCwL7O2BjXojm40p7I5K0N5rNZT68qJMNaZU2vWBXKRORRm5xx7rGTVYFpWqb/Ex26+4ExPg1IszR1Q==',key_name='tempest-keypair-2129534703',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2046237dff224b498e14ad59b5822ac1',ramdisk_id='',reservation_id='r-6wca9wxj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1888253139',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1888253139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:44:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a5dd6b4c1bdc4bdd90966604b07abf04',uuid=5a9ebed0-dcef-427e-a805-574905569389,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "address": "fa:16:3e:42:f2:4a", "network": {"id": "b48e4868-bf7a-4c2f-b6e3-2d98c35b44de", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-746822329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2046237dff224b498e14ad59b5822ac1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f1f8d1-f4", "ovs_interfaceid": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.373 182717 DEBUG nova.network.os_vif_util [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Converting VIF {"id": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "address": "fa:16:3e:42:f2:4a", "network": {"id": "b48e4868-bf7a-4c2f-b6e3-2d98c35b44de", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-746822329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2046237dff224b498e14ad59b5822ac1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f1f8d1-f4", "ovs_interfaceid": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.375 182717 DEBUG nova.network.os_vif_util [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:f2:4a,bridge_name='br-int',has_traffic_filtering=True,id=37f1f8d1-f4ea-416d-ba73-6fbc611802be,network=Network(b48e4868-bf7a-4c2f-b6e3-2d98c35b44de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f1f8d1-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.378 182717 DEBUG nova.objects.instance [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5a9ebed0-dcef-427e-a805-574905569389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.395 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:44:17 compute-1 nova_compute[182713]:   <uuid>5a9ebed0-dcef-427e-a805-574905569389</uuid>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   <name>instance-00000007</name>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-1964291827</nova:name>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:44:17</nova:creationTime>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <nova:flavor name="tempest-flavor_with_ephemeral_0-176066592">
Jan 21 23:44:17 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:44:17 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:44:17 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:44:17 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:44:17 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:44:17 compute-1 nova_compute[182713]:         <nova:user uuid="a5dd6b4c1bdc4bdd90966604b07abf04">tempest-ServersWithSpecificFlavorTestJSON-1888253139-project-member</nova:user>
Jan 21 23:44:17 compute-1 nova_compute[182713]:         <nova:project uuid="2046237dff224b498e14ad59b5822ac1">tempest-ServersWithSpecificFlavorTestJSON-1888253139</nova:project>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:44:17 compute-1 nova_compute[182713]:         <nova:port uuid="37f1f8d1-f4ea-416d-ba73-6fbc611802be">
Jan 21 23:44:17 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <system>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <entry name="serial">5a9ebed0-dcef-427e-a805-574905569389</entry>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <entry name="uuid">5a9ebed0-dcef-427e-a805-574905569389</entry>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     </system>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   <os>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   </os>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   <features>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   </features>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk.config"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:42:f2:4a"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <target dev="tap37f1f8d1-f4"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/console.log" append="off"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <video>
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     </video>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:44:17 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:44:17 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:44:17 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:44:17 compute-1 nova_compute[182713]: </domain>
Jan 21 23:44:17 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.402 182717 DEBUG nova.compute.manager [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Preparing to wait for external event network-vif-plugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.402 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Acquiring lock "5a9ebed0-dcef-427e-a805-574905569389-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.403 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.405 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.407 182717 DEBUG nova.virt.libvirt.vif [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:44:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1964291827',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1964291827',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(10),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1964291827',id=7,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=10,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMc2S1xHovsuR3v2I5l9J02AovvG576qE+PLZ2JMDE3YonbQUTzxCwL7O2BjXojm40p7I5K0N5rNZT68qJMNaZU2vWBXKRORRm5xx7rGTVYFpWqb/Ex26+4ExPg1IszR1Q==',key_name='tempest-keypair-2129534703',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2046237dff224b498e14ad59b5822ac1',ramdisk_id='',reservation_id='r-6wca9wxj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1888253139',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1888253139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:44:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a5dd6b4c1bdc4bdd90966604b07abf04',uuid=5a9ebed0-dcef-427e-a805-574905569389,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "address": "fa:16:3e:42:f2:4a", "network": {"id": "b48e4868-bf7a-4c2f-b6e3-2d98c35b44de", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-746822329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2046237dff224b498e14ad59b5822ac1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f1f8d1-f4", "ovs_interfaceid": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.408 182717 DEBUG nova.network.os_vif_util [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Converting VIF {"id": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "address": "fa:16:3e:42:f2:4a", "network": {"id": "b48e4868-bf7a-4c2f-b6e3-2d98c35b44de", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-746822329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2046237dff224b498e14ad59b5822ac1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f1f8d1-f4", "ovs_interfaceid": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.410 182717 DEBUG nova.network.os_vif_util [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:f2:4a,bridge_name='br-int',has_traffic_filtering=True,id=37f1f8d1-f4ea-416d-ba73-6fbc611802be,network=Network(b48e4868-bf7a-4c2f-b6e3-2d98c35b44de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f1f8d1-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.412 182717 DEBUG os_vif [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:f2:4a,bridge_name='br-int',has_traffic_filtering=True,id=37f1f8d1-f4ea-416d-ba73-6fbc611802be,network=Network(b48e4868-bf7a-4c2f-b6e3-2d98c35b44de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f1f8d1-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.413 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.414 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.415 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.420 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.420 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37f1f8d1-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.421 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap37f1f8d1-f4, col_values=(('external_ids', {'iface-id': '37f1f8d1-f4ea-416d-ba73-6fbc611802be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:f2:4a', 'vm-uuid': '5a9ebed0-dcef-427e-a805-574905569389'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.423 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:17 compute-1 NetworkManager[54952]: <info>  [1769039057.4249] manager: (tap37f1f8d1-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.427 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.430 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.431 182717 INFO os_vif [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:f2:4a,bridge_name='br-int',has_traffic_filtering=True,id=37f1f8d1-f4ea-416d-ba73-6fbc611802be,network=Network(b48e4868-bf7a-4c2f-b6e3-2d98c35b44de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f1f8d1-f4')
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.504 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-6f64b039-da3a-47ef-9a52-b259b890a077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.505 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-6f64b039-da3a-47ef-9a52-b259b890a077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.505 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.505 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6f64b039-da3a-47ef-9a52-b259b890a077 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.516 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.517 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.517 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] No VIF found with MAC fa:16:3e:42:f2:4a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.518 182717 INFO nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Using config drive
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.907 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039042.9059649, 1b4a8e44-319b-431f-b842-ebb9dd2413fe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.908 182717 INFO nova.compute.manager [-] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] VM Stopped (Lifecycle Event)
Jan 21 23:44:17 compute-1 nova_compute[182713]: 2026-01-21 23:44:17.927 182717 DEBUG nova.compute.manager [None req-e3c6a381-c3ea-4516-8d27-9d6a4c506cae - - - - - -] [instance: 1b4a8e44-319b-431f-b842-ebb9dd2413fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.183 182717 INFO nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Creating config drive at /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk.config
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.191 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmgxem840 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.332 182717 DEBUG oslo_concurrency.processutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmgxem840" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:18 compute-1 NetworkManager[54952]: <info>  [1769039058.3917] manager: (tap37f1f8d1-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Jan 21 23:44:18 compute-1 kernel: tap37f1f8d1-f4: entered promiscuous mode
Jan 21 23:44:18 compute-1 ovn_controller[94841]: 2026-01-21T23:44:18Z|00032|binding|INFO|Claiming lport 37f1f8d1-f4ea-416d-ba73-6fbc611802be for this chassis.
Jan 21 23:44:18 compute-1 systemd-udevd[211770]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:44:18 compute-1 ovn_controller[94841]: 2026-01-21T23:44:18Z|00033|binding|INFO|37f1f8d1-f4ea-416d-ba73-6fbc611802be: Claiming fa:16:3e:42:f2:4a 10.100.0.9
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.394 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.399 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.407 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:f2:4a 10.100.0.9'], port_security=['fa:16:3e:42:f2:4a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5a9ebed0-dcef-427e-a805-574905569389', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2046237dff224b498e14ad59b5822ac1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4bf21f67-1fb4-4de2-8a56-49bd370458b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db648dcf-ceff-4628-b6a4-f9104e5f375b, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=37f1f8d1-f4ea-416d-ba73-6fbc611802be) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.408 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 37f1f8d1-f4ea-416d-ba73-6fbc611802be in datapath b48e4868-bf7a-4c2f-b6e3-2d98c35b44de bound to our chassis
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.410 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b48e4868-bf7a-4c2f-b6e3-2d98c35b44de
Jan 21 23:44:18 compute-1 NetworkManager[54952]: <info>  [1769039058.4131] device (tap37f1f8d1-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:44:18 compute-1 NetworkManager[54952]: <info>  [1769039058.4149] device (tap37f1f8d1-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.429 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6e99f18e-9e8b-41e0-936f-b43ee94fa266]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.430 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb48e4868-b1 in ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.431 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb48e4868-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.431 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[95da2790-8ba3-414d-8549-521699bc7f33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.431 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bfdfa6a5-4033-4d6b-a3a3-30b7b6b1aa10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 systemd-machined[153970]: New machine qemu-4-instance-00000007.
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.450 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:18 compute-1 ovn_controller[94841]: 2026-01-21T23:44:18Z|00034|binding|INFO|Setting lport 37f1f8d1-f4ea-416d-ba73-6fbc611802be ovn-installed in OVS
Jan 21 23:44:18 compute-1 ovn_controller[94841]: 2026-01-21T23:44:18Z|00035|binding|INFO|Setting lport 37f1f8d1-f4ea-416d-ba73-6fbc611802be up in Southbound
Jan 21 23:44:18 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.455 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.468 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[c12c6a17-9982-49d9-b3c7-0069c9e52417]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.497 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b36a0b75-41e1-4432-963d-d48a77b832c6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.527 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[1bda5ba5-0c75-44ed-a856-366805a31f1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 NetworkManager[54952]: <info>  [1769039058.5337] manager: (tapb48e4868-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.535 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d7e9d2d9-741c-4230-a08a-7033cc58b372]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.566 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a4561f93-0bbc-4d75-b8c4-06f378f1c751]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.570 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[114f5996-6a3a-4527-a1ba-08f5a7122cc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 NetworkManager[54952]: <info>  [1769039058.5953] device (tapb48e4868-b0): carrier: link connected
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.603 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[bb5d920b-f69d-432a-8d56-7076b09f63bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.624 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9183347f-5f81-4a43-8ccf-0ee6be9ac5af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb48e4868-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:60:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363124, 'reachable_time': 28835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211886, 'error': None, 'target': 'ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.644 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[86b795b8-8f9a-46fd-8ee1-1705ef60402c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feee:60be'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363124, 'tstamp': 363124}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211887, 'error': None, 'target': 'ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.667 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c33da202-9f51-41ea-b494-f7b8f4cccc1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb48e4868-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:60:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363124, 'reachable_time': 28835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211888, 'error': None, 'target': 'ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.706 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6f76e56b-b0a2-4228-83a5-0ccd4be7ec69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.771 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[703953d0-0470-4c35-9c3b-246091bf3997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.773 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb48e4868-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.773 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.774 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb48e4868-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.816 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:18 compute-1 kernel: tapb48e4868-b0: entered promiscuous mode
Jan 21 23:44:18 compute-1 NetworkManager[54952]: <info>  [1769039058.8177] manager: (tapb48e4868-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.821 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb48e4868-b0, col_values=(('external_ids', {'iface-id': 'a608e132-8265-4348-bc67-024ad89753c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:18 compute-1 ovn_controller[94841]: 2026-01-21T23:44:18Z|00036|binding|INFO|Releasing lport a608e132-8265-4348-bc67-024ad89753c9 from this chassis (sb_readonly=0)
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.828 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039058.8233304, 5a9ebed0-dcef-427e-a805-574905569389 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.829 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] VM Started (Lifecycle Event)
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.831 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b48e4868-bf7a-4c2f-b6e3-2d98c35b44de.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b48e4868-bf7a-4c2f-b6e3-2d98c35b44de.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.832 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6db90bb1-30cc-4b99-8ca6-186434ae7214]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.833 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/b48e4868-bf7a-4c2f-b6e3-2d98c35b44de.pid.haproxy
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID b48e4868-bf7a-4c2f-b6e3-2d98c35b44de
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:18.834 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de', 'env', 'PROCESS_TAG=haproxy-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b48e4868-bf7a-4c2f-b6e3-2d98c35b44de.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.836 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.842 182717 DEBUG nova.compute.manager [req-600e35e6-7328-43a6-90d6-88d04cde434e req-0527466d-06e7-4a76-a35c-202d3dc8b260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Received event network-vif-plugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.843 182717 DEBUG oslo_concurrency.lockutils [req-600e35e6-7328-43a6-90d6-88d04cde434e req-0527466d-06e7-4a76-a35c-202d3dc8b260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5a9ebed0-dcef-427e-a805-574905569389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.844 182717 DEBUG oslo_concurrency.lockutils [req-600e35e6-7328-43a6-90d6-88d04cde434e req-0527466d-06e7-4a76-a35c-202d3dc8b260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.845 182717 DEBUG oslo_concurrency.lockutils [req-600e35e6-7328-43a6-90d6-88d04cde434e req-0527466d-06e7-4a76-a35c-202d3dc8b260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.845 182717 DEBUG nova.compute.manager [req-600e35e6-7328-43a6-90d6-88d04cde434e req-0527466d-06e7-4a76-a35c-202d3dc8b260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Processing event network-vif-plugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.846 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.848 182717 DEBUG nova.compute.manager [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.853 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.855 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.868 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.873 182717 INFO nova.virt.libvirt.driver [-] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Instance spawned successfully.
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.874 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.900 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.900 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039058.826332, 5a9ebed0-dcef-427e-a805-574905569389 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.901 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] VM Paused (Lifecycle Event)
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.907 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.907 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.909 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.910 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.910 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.911 182717 DEBUG nova.virt.libvirt.driver [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.934 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.938 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039058.8536837, 5a9ebed0-dcef-427e-a805-574905569389 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.939 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] VM Resumed (Lifecycle Event)
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.971 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:18 compute-1 nova_compute[182713]: 2026-01-21 23:44:18.986 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.010 182717 INFO nova.compute.manager [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Took 6.83 seconds to spawn the instance on the hypervisor.
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.011 182717 DEBUG nova.compute.manager [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.020 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.044 182717 DEBUG oslo_concurrency.lockutils [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "6f64b039-da3a-47ef-9a52-b259b890a077" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.045 182717 DEBUG oslo_concurrency.lockutils [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.045 182717 DEBUG oslo_concurrency.lockutils [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.046 182717 DEBUG oslo_concurrency.lockutils [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.046 182717 DEBUG oslo_concurrency.lockutils [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.062 182717 INFO nova.compute.manager [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Terminating instance
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.076 182717 DEBUG nova.compute.manager [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:44:19 compute-1 kernel: tap3299b15c-00 (unregistering): left promiscuous mode
Jan 21 23:44:19 compute-1 NetworkManager[54952]: <info>  [1769039059.1030] device (tap3299b15c-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.113 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:19 compute-1 ovn_controller[94841]: 2026-01-21T23:44:19Z|00037|binding|INFO|Releasing lport 3299b15c-00f0-4c59-9f02-44cb8d762ae2 from this chassis (sb_readonly=0)
Jan 21 23:44:19 compute-1 ovn_controller[94841]: 2026-01-21T23:44:19Z|00038|binding|INFO|Setting lport 3299b15c-00f0-4c59-9f02-44cb8d762ae2 down in Southbound
Jan 21 23:44:19 compute-1 ovn_controller[94841]: 2026-01-21T23:44:19Z|00039|binding|INFO|Removing iface tap3299b15c-00 ovn-installed in OVS
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.118 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:19 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:19.123 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:33:0b 10.1.0.40 fdfe:381f:8400::25e'], port_security=['fa:16:3e:61:33:0b 10.1.0.40 fdfe:381f:8400::25e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.40/26 fdfe:381f:8400::25e/64', 'neutron:device_id': '6f64b039-da3a-47ef-9a52-b259b890a077', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8981554bfb65485a9218dab7f347822d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e4bb4842-7cc7-47df-ad92-e426d20758f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=343b670f-2d8d-4f56-9cb9-7d9682347428, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=3299b15c-00f0-4c59-9f02-44cb8d762ae2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.128 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.136 182717 INFO nova.compute.manager [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Took 7.95 seconds to build instance.
Jan 21 23:44:19 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 21 23:44:19 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000002.scope: Consumed 6.548s CPU time.
Jan 21 23:44:19 compute-1 systemd-machined[153970]: Machine qemu-3-instance-00000002 terminated.
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.160 182717 DEBUG oslo_concurrency.lockutils [None req-e6d05f9f-3aa6-4a1a-bccb-195d8854ac9a a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.231 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.308 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.320 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.348 182717 INFO nova.virt.libvirt.driver [-] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Instance destroyed successfully.
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.349 182717 DEBUG nova.objects.instance [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lazy-loading 'resources' on Instance uuid 6f64b039-da3a-47ef-9a52-b259b890a077 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.378 182717 DEBUG nova.virt.libvirt.vif [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:43:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1377062952-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1377062952-1',id=2,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:44:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8981554bfb65485a9218dab7f347822d',ramdisk_id='',reservation_id='r-ourrdd3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-1853609216',owner_user_name='tempest-AutoAllocateNetworkTest-1853609216-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:44:13Z,user_data=None,user_id='f92dd0c2072346c6b7e7588673443ff2',uuid=6f64b039-da3a-47ef-9a52-b259b890a077,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "address": "fa:16:3e:61:33:0b", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::25e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3299b15c-00", "ovs_interfaceid": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.379 182717 DEBUG nova.network.os_vif_util [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converting VIF {"id": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "address": "fa:16:3e:61:33:0b", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::25e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3299b15c-00", "ovs_interfaceid": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.380 182717 DEBUG nova.network.os_vif_util [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:33:0b,bridge_name='br-int',has_traffic_filtering=True,id=3299b15c-00f0-4c59-9f02-44cb8d762ae2,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3299b15c-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.381 182717 DEBUG os_vif [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:33:0b,bridge_name='br-int',has_traffic_filtering=True,id=3299b15c-00f0-4c59-9f02-44cb8d762ae2,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3299b15c-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.384 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.385 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3299b15c-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.388 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.392 182717 INFO os_vif [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:33:0b,bridge_name='br-int',has_traffic_filtering=True,id=3299b15c-00f0-4c59-9f02-44cb8d762ae2,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3299b15c-00')
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.393 182717 INFO nova.virt.libvirt.driver [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Deleting instance files /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077_del
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.395 182717 INFO nova.virt.libvirt.driver [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Deletion of /var/lib/nova/instances/6f64b039-da3a-47ef-9a52-b259b890a077_del complete
Jan 21 23:44:19 compute-1 podman[211931]: 2026-01-21 23:44:19.398613621 +0000 UTC m=+0.087152542 container create f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:44:19 compute-1 systemd[1]: Started libpod-conmon-f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286.scope.
Jan 21 23:44:19 compute-1 podman[211931]: 2026-01-21 23:44:19.350487517 +0000 UTC m=+0.039026518 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:44:19 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:44:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4d9834db4d7325d8c9b2b5172da1a0fac65cb11a11b30a1584f2d7d340944e8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.496 182717 INFO nova.compute.manager [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.498 182717 DEBUG oslo.service.loopingcall [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.499 182717 DEBUG nova.compute.manager [-] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.499 182717 DEBUG nova.network.neutron [-] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:44:19 compute-1 podman[211931]: 2026-01-21 23:44:19.502665841 +0000 UTC m=+0.191204852 container init f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 23:44:19 compute-1 podman[211931]: 2026-01-21 23:44:19.507752166 +0000 UTC m=+0.196291117 container start f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 23:44:19 compute-1 neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de[211962]: [NOTICE]   (211966) : New worker (211968) forked
Jan 21 23:44:19 compute-1 neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de[211962]: [NOTICE]   (211966) : Loading success.
Jan 21 23:44:19 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:19.583 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 3299b15c-00f0-4c59-9f02-44cb8d762ae2 in datapath 48de92c9-2a56-4dfe-a16e-fe0d52617564 unbound from our chassis
Jan 21 23:44:19 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:19.584 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48de92c9-2a56-4dfe-a16e-fe0d52617564, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:44:19 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:19.585 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[238652aa-9e00-47e8-a4ed-807725063aba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:19 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:19.586 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564 namespace which is not needed anymore
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.608 182717 DEBUG nova.network.neutron [req-a71bd7e4-6b6d-427a-bbad-58b7d1852e2d req-48da06c6-0310-4056-8459-45b36dbad8ac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Updated VIF entry in instance network info cache for port 37f1f8d1-f4ea-416d-ba73-6fbc611802be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.609 182717 DEBUG nova.network.neutron [req-a71bd7e4-6b6d-427a-bbad-58b7d1852e2d req-48da06c6-0310-4056-8459-45b36dbad8ac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Updating instance_info_cache with network_info: [{"id": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "address": "fa:16:3e:42:f2:4a", "network": {"id": "b48e4868-bf7a-4c2f-b6e3-2d98c35b44de", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-746822329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2046237dff224b498e14ad59b5822ac1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f1f8d1-f4", "ovs_interfaceid": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.631 182717 DEBUG oslo_concurrency.lockutils [req-a71bd7e4-6b6d-427a-bbad-58b7d1852e2d req-48da06c6-0310-4056-8459-45b36dbad8ac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-5a9ebed0-dcef-427e-a805-574905569389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:44:19 compute-1 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[211834]: [NOTICE]   (211838) : haproxy version is 2.8.14-c23fe91
Jan 21 23:44:19 compute-1 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[211834]: [NOTICE]   (211838) : path to executable is /usr/sbin/haproxy
Jan 21 23:44:19 compute-1 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[211834]: [ALERT]    (211838) : Current worker (211840) exited with code 143 (Terminated)
Jan 21 23:44:19 compute-1 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[211834]: [WARNING]  (211838) : All workers exited. Exiting... (0)
Jan 21 23:44:19 compute-1 systemd[1]: libpod-f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895.scope: Deactivated successfully.
Jan 21 23:44:19 compute-1 podman[211994]: 2026-01-21 23:44:19.757517379 +0000 UTC m=+0.043626298 container died f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 23:44:19 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895-userdata-shm.mount: Deactivated successfully.
Jan 21 23:44:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-3a4c145370d51ad3a8b50c02d89ba624b251bc35c11fac7e86c83cd353233bcb-merged.mount: Deactivated successfully.
Jan 21 23:44:19 compute-1 podman[211994]: 2026-01-21 23:44:19.801538229 +0000 UTC m=+0.087647148 container cleanup f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 23:44:19 compute-1 systemd[1]: libpod-conmon-f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895.scope: Deactivated successfully.
Jan 21 23:44:19 compute-1 podman[212010]: 2026-01-21 23:44:19.828764893 +0000 UTC m=+0.051405231 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 23:44:19 compute-1 podman[212019]: 2026-01-21 23:44:19.844664739 +0000 UTC m=+0.063254285 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:44:19 compute-1 podman[212047]: 2026-01-21 23:44:19.878148317 +0000 UTC m=+0.045270391 container remove f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 21 23:44:19 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:19.884 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6b5246f5-d7cd-4b16-9d64-bdc4567077da]: (4, ('Wed Jan 21 11:44:19 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564 (f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895)\nf575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895\nWed Jan 21 11:44:19 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564 (f575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895)\nf575efb7fdfbfe4150ccb7d9be2c10bcb2504413a33012e93cc65a3d7d4bb895\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:19 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:19.886 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f76460da-b7e0-4ec5-8cd3-cb26634034d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:19 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:19.887 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48de92c9-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.933 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:19 compute-1 kernel: tap48de92c9-20: left promiscuous mode
Jan 21 23:44:19 compute-1 nova_compute[182713]: 2026-01-21 23:44:19.957 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:19 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:19.960 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fdebd21a-98ae-4d64-80b7-8a19706c2e42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:19 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:19.980 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[925839e7-5807-4e07-a44c-14d8db998694]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:19 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:19.981 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[651cb2fd-46ff-4de1-9570-116c49f1fdff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:20.003 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[48f7a154-334f-4653-ade9-44e1774fb894]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362884, 'reachable_time': 17745, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212077, 'error': None, 'target': 'ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:20 compute-1 systemd[1]: run-netns-ovnmeta\x2d48de92c9\x2d2a56\x2d4dfe\x2da16e\x2dfe0d52617564.mount: Deactivated successfully.
Jan 21 23:44:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:20.019 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:44:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:20.021 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[07bf120d-dbe4-43b0-b318-6f8edfcd8de3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.861 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Updating instance_info_cache with network_info: [{"id": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "address": "fa:16:3e:61:33:0b", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::25e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3299b15c-00", "ovs_interfaceid": "3299b15c-00f0-4c59-9f02-44cb8d762ae2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.894 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-6f64b039-da3a-47ef-9a52-b259b890a077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.895 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.896 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.897 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.897 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.898 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.899 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.900 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.900 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.900 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.930 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.931 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.931 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.932 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.975 182717 DEBUG nova.compute.manager [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Received event network-vif-plugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.976 182717 DEBUG oslo_concurrency.lockutils [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5a9ebed0-dcef-427e-a805-574905569389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.977 182717 DEBUG oslo_concurrency.lockutils [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.978 182717 DEBUG oslo_concurrency.lockutils [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.978 182717 DEBUG nova.compute.manager [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] No waiting events found dispatching network-vif-plugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.979 182717 WARNING nova.compute.manager [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Received unexpected event network-vif-plugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be for instance with vm_state active and task_state None.
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.979 182717 DEBUG nova.compute.manager [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Received event network-vif-unplugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.980 182717 DEBUG oslo_concurrency.lockutils [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.980 182717 DEBUG oslo_concurrency.lockutils [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.981 182717 DEBUG oslo_concurrency.lockutils [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.981 182717 DEBUG nova.compute.manager [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] No waiting events found dispatching network-vif-unplugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.982 182717 DEBUG nova.compute.manager [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Received event network-vif-unplugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.982 182717 DEBUG nova.compute.manager [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Received event network-vif-plugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.983 182717 DEBUG oslo_concurrency.lockutils [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.983 182717 DEBUG oslo_concurrency.lockutils [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.984 182717 DEBUG oslo_concurrency.lockutils [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.984 182717 DEBUG nova.compute.manager [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] No waiting events found dispatching network-vif-plugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:44:20 compute-1 nova_compute[182713]: 2026-01-21 23:44:20.985 182717 WARNING nova.compute.manager [req-8c40d1c8-955b-4196-bdc2-a8637b462dae req-c1380884-bbcb-4e81-86d4-f35f1c6d05c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Received unexpected event network-vif-plugged-3299b15c-00f0-4c59-9f02-44cb8d762ae2 for instance with vm_state active and task_state deleting.
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.050 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.121 182717 DEBUG nova.network.neutron [-] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.139 182717 INFO nova.compute.manager [-] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Took 1.64 seconds to deallocate network for instance.
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.146 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.147 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.218 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.393 182717 DEBUG nova.compute.manager [req-4120fb00-d655-4885-a2d8-c627529b83e4 req-bb1e5135-4cde-465d-ba46-d9b5257ba87f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Received event network-vif-deleted-3299b15c-00f0-4c59-9f02-44cb8d762ae2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.396 182717 DEBUG oslo_concurrency.lockutils [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.397 182717 DEBUG oslo_concurrency.lockutils [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.433 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.435 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5524MB free_disk=73.3822250366211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.435 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.528 182717 DEBUG nova.compute.provider_tree [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.546 182717 DEBUG nova.scheduler.client.report [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.580 182717 DEBUG oslo_concurrency.lockutils [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.582 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.619 182717 INFO nova.scheduler.client.report [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Deleted allocations for instance 6f64b039-da3a-47ef-9a52-b259b890a077
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.699 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 5a9ebed0-dcef-427e-a805-574905569389 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.699 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.700 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.773 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.847 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.870 182717 DEBUG oslo_concurrency.lockutils [None req-4cd816e0-3cbd-4d6d-952a-83b176cb306b f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "6f64b039-da3a-47ef-9a52-b259b890a077" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.917 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:44:21 compute-1 nova_compute[182713]: 2026-01-21 23:44:21.917 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:23 compute-1 nova_compute[182713]: 2026-01-21 23:44:23.341 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:23 compute-1 NetworkManager[54952]: <info>  [1769039063.3445] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/29)
Jan 21 23:44:23 compute-1 NetworkManager[54952]: <info>  [1769039063.3459] device (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:44:23 compute-1 NetworkManager[54952]: <warn>  [1769039063.3461] device (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:23.345 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}bc9b7ecb2af6ac629fc2448a7d596d0bfe459cc5ba04b0213d7aa35f343c0a3b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 21 23:44:23 compute-1 NetworkManager[54952]: <info>  [1769039063.3479] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/30)
Jan 21 23:44:23 compute-1 NetworkManager[54952]: <info>  [1769039063.3486] device (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:44:23 compute-1 NetworkManager[54952]: <warn>  [1769039063.3487] device (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:44:23 compute-1 NetworkManager[54952]: <info>  [1769039063.3502] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 21 23:44:23 compute-1 NetworkManager[54952]: <info>  [1769039063.3512] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 21 23:44:23 compute-1 NetworkManager[54952]: <info>  [1769039063.3520] device (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 21 23:44:23 compute-1 NetworkManager[54952]: <info>  [1769039063.3526] device (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 21 23:44:23 compute-1 nova_compute[182713]: 2026-01-21 23:44:23.390 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:23 compute-1 ovn_controller[94841]: 2026-01-21T23:44:23Z|00040|binding|INFO|Releasing lport a608e132-8265-4348-bc67-024ad89753c9 from this chassis (sb_readonly=0)
Jan 21 23:44:23 compute-1 nova_compute[182713]: 2026-01-21 23:44:23.403 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:23.732 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1181 Content-Type: application/json Date: Wed, 21 Jan 2026 23:44:23 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-94af51ec-aafb-4b69-bc28-e931a306987e x-openstack-request-id: req-94af51ec-aafb-4b69-bc28-e931a306987e _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 21 23:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:23.732 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1628087693", "name": "tempest-flavor_with_ephemeral_1-1039478167", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1628087693"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1628087693"}]}, {"id": "17063997", "name": "tempest-flavor_with_ephemeral_0-176066592", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/17063997"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/17063997"}]}, {"id": "c3389c03-89c4-4ff5-9e03-1a99d41713d4", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}]}, {"id": "ff01ccba-ad51-439f-9037-926190d6dc0f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 21 23:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:23.732 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-94af51ec-aafb-4b69-bc28-e931a306987e request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 21 23:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:23.735 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/17063997 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}bc9b7ecb2af6ac629fc2448a7d596d0bfe459cc5ba04b0213d7aa35f343c0a3b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 21 23:44:23 compute-1 nova_compute[182713]: 2026-01-21 23:44:23.946 182717 DEBUG nova.compute.manager [req-fc7fdda7-f340-4907-8881-1793acb52b91 req-82ece5ee-a30f-499e-b3ee-1f53d8605f73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Received event network-changed-37f1f8d1-f4ea-416d-ba73-6fbc611802be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:23 compute-1 nova_compute[182713]: 2026-01-21 23:44:23.947 182717 DEBUG nova.compute.manager [req-fc7fdda7-f340-4907-8881-1793acb52b91 req-82ece5ee-a30f-499e-b3ee-1f53d8605f73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Refreshing instance network info cache due to event network-changed-37f1f8d1-f4ea-416d-ba73-6fbc611802be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:44:23 compute-1 nova_compute[182713]: 2026-01-21 23:44:23.947 182717 DEBUG oslo_concurrency.lockutils [req-fc7fdda7-f340-4907-8881-1793acb52b91 req-82ece5ee-a30f-499e-b3ee-1f53d8605f73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-5a9ebed0-dcef-427e-a805-574905569389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:44:23 compute-1 nova_compute[182713]: 2026-01-21 23:44:23.948 182717 DEBUG oslo_concurrency.lockutils [req-fc7fdda7-f340-4907-8881-1793acb52b91 req-82ece5ee-a30f-499e-b3ee-1f53d8605f73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-5a9ebed0-dcef-427e-a805-574905569389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:44:23 compute-1 nova_compute[182713]: 2026-01-21 23:44:23.948 182717 DEBUG nova.network.neutron [req-fc7fdda7-f340-4907-8881-1793acb52b91 req-82ece5ee-a30f-499e-b3ee-1f53d8605f73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Refreshing network info cache for port 37f1f8d1-f4ea-416d-ba73-6fbc611802be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:44:24 compute-1 nova_compute[182713]: 2026-01-21 23:44:24.234 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:24 compute-1 nova_compute[182713]: 2026-01-21 23:44:24.420 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.621 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 445 Content-Type: application/json Date: Wed, 21 Jan 2026 23:44:23 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-84109f85-62a1-470f-9f35-282b2875169d x-openstack-request-id: req-84109f85-62a1-470f-9f35-282b2875169d _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.621 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "17063997", "name": "tempest-flavor_with_ephemeral_0-176066592", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/17063997"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/17063997"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.622 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/17063997 used request id req-84109f85-62a1-470f-9f35-282b2875169d request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.624 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5a9ebed0-dcef-427e-a805-574905569389', 'name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2046237dff224b498e14ad59b5822ac1', 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'hostId': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.625 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.629 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5a9ebed0-dcef-427e-a805-574905569389 / tap37f1f8d1-f4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.629 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a08e0b1-cdbf-4eea-95cb-7353cb913b77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': 'instance-00000007-5a9ebed0-dcef-427e-a805-574905569389-tap37f1f8d1-f4', 'timestamp': '2026-01-21T23:44:24.625429', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'tap37f1f8d1-f4', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:f2:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37f1f8d1-f4'}, 'message_id': '1dea7e58-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.33289993, 'message_signature': '89b77858e3a2c80a281f0cfb9ff34bcd902a0294c24a25a7d77566ef8bbf0ecf'}]}, 'timestamp': '2026-01-21 23:44:24.631633', '_unique_id': 'b1437a8265e246ea9baef4219aa6d16a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.642 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.648 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.687 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.688 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df4901f9-f976-48ba-8dfa-4568604a578e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-vda', 'timestamp': '2026-01-21T23:44:24.648811', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1df3436c-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': '9fb42e2728511c0c0f02b284cf64080acce3af17a1eba4cd272aa2accc2098aa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': 
'2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-sda', 'timestamp': '2026-01-21T23:44:24.648811', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1df3523a-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': '3440c3d31f2602069555904e0c74285cd63f70157f675f7803a8d4e32d8c47fd'}]}, 'timestamp': '2026-01-21 23:44:24.688883', '_unique_id': '88c6028f6e81432abf8c547ec4cbcf97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.689 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.690 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.707 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.708 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cde735d-bc48-4cad-b50d-8bda6c701a5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-vda', 'timestamp': '2026-01-21T23:44:24.690985', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1df64990-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.398341665, 'message_signature': '76259fe7a8468cfd929a9bd21bc73732fb7dd0ad9aadefc2d59997c68bb3c59b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': 
'2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-sda', 'timestamp': '2026-01-21T23:44:24.690985', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1df655ca-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.398341665, 'message_signature': '501bbe374992be2fb72d29e61d99fec1be2742acfdd24eb8c82865505a5fb30c'}]}, 'timestamp': '2026-01-21 23:44:24.708600', '_unique_id': '9ce468e6d9c247398d7acfa890c4f5b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.709 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.710 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.710 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.710 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0ee9811-eabe-491c-8960-e02032dcf108', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-vda', 'timestamp': '2026-01-21T23:44:24.710338', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1df6a6ba-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.398341665, 'message_signature': 'ad0c43151a07457c7ed4ce871e17dbab7448d79cab60c9b9eff6cd361e53acf7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-sda', 'timestamp': '2026-01-21T23:44:24.710338', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1df6b1e6-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.398341665, 'message_signature': '502d0df4d293d632b877cacfb524c08104cbe0bb392aef31e59d7fc6e4f6f789'}]}, 'timestamp': '2026-01-21 23:44:24.710972', '_unique_id': '86373d152dc240d7ad3cde4284ca8f90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.711 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.712 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.712 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d1d35a8-9322-4c2b-a2f1-1629966dfd47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': 'instance-00000007-5a9ebed0-dcef-427e-a805-574905569389-tap37f1f8d1-f4', 'timestamp': '2026-01-21T23:44:24.712533', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'tap37f1f8d1-f4', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:f2:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37f1f8d1-f4'}, 'message_id': '1df6fc50-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.33289993, 'message_signature': '8844f0e502c3042dc2430938e3d76584817853f80c6600f63a0d76eb99e9065f'}]}, 'timestamp': '2026-01-21 23:44:24.712897', '_unique_id': 'b4ac545df6bf43bfbbe65952a754bb72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.713 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.714 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.751 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/cpu volume: 5540000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63d89f7b-8f31-4351-98a3-0c8e81b7c6fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5540000000, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389', 'timestamp': '2026-01-21T23:44:24.714387', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '1dfd0924-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.458397917, 'message_signature': 'cec5f1cf2cecf49b86bbfd18042747b556ec430fa1ae216f0324f8e27c01531a'}]}, 'timestamp': '2026-01-21 23:44:24.752712', '_unique_id': '573217ee32e54b4daa905256d3409d69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.754 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.755 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.756 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ce3adc5-c40b-47ef-9e0a-e50ce0f75c7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-vda', 'timestamp': '2026-01-21T23:44:24.755841', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1dfd9cf4-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': 'ac1c5d8e8f95bd021e65ee35638e7119c85fb80ffce2510da0b140d2ece47ab5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-sda', 'timestamp': '2026-01-21T23:44:24.755841', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1dfdaf3c-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': '21b9ce5d5ac25825dac766c4f71b34ff696a73345d404eb98de5d198229fbb16'}]}, 'timestamp': '2026-01-21 23:44:24.756837', '_unique_id': '4a6071dacc194947847112a9d1ff1e52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.757 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.759 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.759 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03f16a78-2f5b-49a0-b1ec-c1d4c00e245e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': 'instance-00000007-5a9ebed0-dcef-427e-a805-574905569389-tap37f1f8d1-f4', 'timestamp': '2026-01-21T23:44:24.759833', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'tap37f1f8d1-f4', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:f2:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37f1f8d1-f4'}, 'message_id': '1dfe4230-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.33289993, 'message_signature': 'aee27023d73d555c8eb4f4a00966c9344ce1ce80969a22960815dea4e01de6f5'}]}, 'timestamp': '2026-01-21 23:44:24.760642', '_unique_id': '5f0e6ee29adc41d09695b52c4dd65cec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.761 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.763 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.763 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '760eb82e-e4c5-45b3-a024-2fac380744c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': 'instance-00000007-5a9ebed0-dcef-427e-a805-574905569389-tap37f1f8d1-f4', 'timestamp': '2026-01-21T23:44:24.763294', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'tap37f1f8d1-f4', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:f2:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37f1f8d1-f4'}, 'message_id': '1dfebe4a-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.33289993, 'message_signature': '13073baae7837e7877fd1981f174567ba87943e50f5993f5ded485cc10ccc8c3'}]}, 'timestamp': '2026-01-21 23:44:24.763880', '_unique_id': '643e8ab4495546e688c181a2eb47e9a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.764 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.766 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.766 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41072196-8a3d-487c-b966-0cbe0da51550', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': 'instance-00000007-5a9ebed0-dcef-427e-a805-574905569389-tap37f1f8d1-f4', 'timestamp': '2026-01-21T23:44:24.766430', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'tap37f1f8d1-f4', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:f2:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37f1f8d1-f4'}, 'message_id': '1dff38b6-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.33289993, 'message_signature': '67aa7092e39a41026a50d48840d6109fcd59948e69e11088a72104d9f86fa0d9'}]}, 'timestamp': '2026-01-21 23:44:24.766971', '_unique_id': '5b40974afc8049d3b568283d098b3f69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.767 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.768 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.769 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.769 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2e5a152-b280-493c-bc84-0ecaddd8e886', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-vda', 'timestamp': '2026-01-21T23:44:24.768954', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1dff98ba-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': '40281929977090bb8378b684c8790293f14c3da47bd1409de2db82e6e0572c8d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-sda', 'timestamp': '2026-01-21T23:44:24.768954', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1dffa4b8-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': '2892f0afd7f7f460adff588c1a1d816a93ecddea3d2cf56b4dc184c6f6157ae6'}]}, 'timestamp': '2026-01-21 23:44:24.769595', '_unique_id': 'e9ed6d08e8a643bfb82d3fe29d9d4b16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.770 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.771 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1730736-4fb2-4572-be06-0d1304fc633d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': 'instance-00000007-5a9ebed0-dcef-427e-a805-574905569389-tap37f1f8d1-f4', 'timestamp': '2026-01-21T23:44:24.771202', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'tap37f1f8d1-f4', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:f2:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37f1f8d1-f4'}, 'message_id': '1dffefcc-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.33289993, 'message_signature': 'adae3ee12525bcc74ffde74aa47726fe542594162ec5df0ecde05f229d1f4edb'}]}, 'timestamp': '2026-01-21 23:44:24.771542', '_unique_id': 'c8a2f44d022546d2a5d06b982e248610'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.772 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.773 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad043c1f-215d-491f-8552-bbe8bfbb03c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': 'instance-00000007-5a9ebed0-dcef-427e-a805-574905569389-tap37f1f8d1-f4', 'timestamp': '2026-01-21T23:44:24.773281', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'tap37f1f8d1-f4', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:f2:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37f1f8d1-f4'}, 'message_id': '1e00412a-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.33289993, 'message_signature': '77fd1b74ab02dc8b02ed6b6d1bf0d8b07042238e3c8a840016e866a119f93c60'}]}, 'timestamp': '2026-01-21 23:44:24.773619', '_unique_id': '8ea89afb8c4e4659843542be9ea4b1e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.774 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.775 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.775 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1964291827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1964291827>]
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.776 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.776 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1964291827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1964291827>]
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.776 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.776 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.776 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 5a9ebed0-dcef-427e-a805-574905569389: ceilometer.compute.pollsters.NoVolumeException
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.777 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.777 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fda509f-95f5-4a57-92c0-8b334598859b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-vda', 'timestamp': '2026-01-21T23:44:24.776978', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e00d144-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': 'ea6cba682645bf36e45fd4ee6c8ca8d0fadfa4a7d3f5796f695957746283a93e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-sda', 'timestamp': '2026-01-21T23:44:24.776978', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e00dd42-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': 'd39a8c803204f04463d4a89a26fb4e16a00b0e4ba6f7666942b1b374ef36034e'}]}, 'timestamp': '2026-01-21 23:44:24.777593', '_unique_id': '3ef2e37e994b4fa59d5cec2e85e8c7d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.778 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.779 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5662de7-ea01-4e51-9251-82acc8602032', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': 'instance-00000007-5a9ebed0-dcef-427e-a805-574905569389-tap37f1f8d1-f4', 'timestamp': '2026-01-21T23:44:24.779243', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'tap37f1f8d1-f4', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:f2:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37f1f8d1-f4'}, 'message_id': '1e0129dc-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.33289993, 'message_signature': '8883a13b71a0af2ceb2d3de88d6850d8946ae8616f33cba0bd730bc4edc700e3'}]}, 'timestamp': '2026-01-21 23:44:24.779579', '_unique_id': '3c77ae20ac834fc49c5b2300f35022c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.780 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.781 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.781 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.781 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1964291827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1964291827>]
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.781 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.781 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1a91b5e-8bd9-49a2-983a-64f5d9f2f8aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': 'instance-00000007-5a9ebed0-dcef-427e-a805-574905569389-tap37f1f8d1-f4', 'timestamp': '2026-01-21T23:44:24.781608', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'tap37f1f8d1-f4', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:f2:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37f1f8d1-f4'}, 'message_id': '1e01863e-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.33289993, 'message_signature': 'ac0df08409325c5b3aeea73e6806bd249bd7ba007166dfdaaa017b44ca92f856'}]}, 'timestamp': '2026-01-21 23:44:24.781966', '_unique_id': '40f3ec9d34ba4d9a95c00ed6c3ad24d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.782 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.783 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.783 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.784 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97c615ca-abe2-4c70-bccb-1232abadb4bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-vda', 'timestamp': '2026-01-21T23:44:24.783614', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e01d62a-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.398341665, 'message_signature': '1bbf26876a501ae14d57a2d287703fea036a12c8e55eea5a5271dfad2c8f5f4f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-sda', 'timestamp': '2026-01-21T23:44:24.783614', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e01e32c-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.398341665, 'message_signature': 'e81991ba4499f0a1a96cc19a8d0fd7a3386a1cbe3d50937476aa21841354b50a'}]}, 'timestamp': '2026-01-21 23:44:24.784301', '_unique_id': '5b3182e765044bd9aecf287ebd7dac46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.785 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.read.latency volume: 150956911 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.786 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.read.latency volume: 690393 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96b40f3b-3b37-4cc7-a302-2945c95eb63b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 150956911, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-vda', 'timestamp': '2026-01-21T23:44:24.785924', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e022ec2-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': 'd6e38371907434737d48c96666af5376fcf0065bd5ddd5a4ad7f612027bd1cc4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 690393, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-sda', 'timestamp': '2026-01-21T23:44:24.785924', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e023ba6-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': 'ede6c70a25827b64e5e4d7fd2c2efcf4646ab32884bec9c5eb2242de04175ffa'}]}, 'timestamp': '2026-01-21 23:44:24.786620', '_unique_id': '95a096274bca4382961c342a0bdc625a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.787 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.788 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.788 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1964291827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1964291827>]
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.788 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.789 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da003307-b270-44bd-9225-77965096018e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-vda', 'timestamp': '2026-01-21T23:44:24.788805', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e02a172-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': 'cd80a6e7e41888ed1639d7e26f30cbd9de3924c5d058223b73665597fc27a7d6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': '5a9ebed0-dcef-427e-a805-574905569389-sda', 'timestamp': '2026-01-21T23:44:24.788805', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'instance-00000007', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e02acf8-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.356337871, 'message_signature': 'a2ae157e427532ac51d3cf96fe8a255558c519eece3ffe2386a8d2c4fd9901b2'}]}, 'timestamp': '2026-01-21 23:44:24.789464', '_unique_id': 'ed8326c21a284183b23ed756fed4be3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.790 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.791 12 DEBUG ceilometer.compute.pollsters [-] 5a9ebed0-dcef-427e-a805-574905569389/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01c7b5f5-c926-4548-b778-158037af7125', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a5dd6b4c1bdc4bdd90966604b07abf04', 'user_name': None, 'project_id': '2046237dff224b498e14ad59b5822ac1', 'project_name': None, 'resource_id': 'instance-00000007-5a9ebed0-dcef-427e-a805-574905569389-tap37f1f8d1-f4', 'timestamp': '2026-01-21T23:44:24.791080', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1964291827', 'name': 'tap37f1f8d1-f4', 'instance_id': '5a9ebed0-dcef-427e-a805-574905569389', 'instance_type': 'tempest-flavor_with_ephemeral_0-176066592', 'host': '30ab4b404f7384b3bbb3fea7fde7de717ac622f6bb64457f2f491842', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '17063997', 'name': 'tempest-flavor_with_ephemeral_0-176066592', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:f2:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37f1f8d1-f4'}, 'message_id': '1e02f91a-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3637.33289993, 'message_signature': '5e53a05c39e34be6e643c887b6d75c76e30a0c8a31686f7b19acdb09719e75b8'}]}, 'timestamp': '2026-01-21 23:44:24.791439', '_unique_id': '2d14c3db42434c3e8de9a7dc53e1a0f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:44:24.792 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:26 compute-1 nova_compute[182713]: 2026-01-21 23:44:26.217 182717 DEBUG nova.network.neutron [req-fc7fdda7-f340-4907-8881-1793acb52b91 req-82ece5ee-a30f-499e-b3ee-1f53d8605f73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Updated VIF entry in instance network info cache for port 37f1f8d1-f4ea-416d-ba73-6fbc611802be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:44:26 compute-1 nova_compute[182713]: 2026-01-21 23:44:26.218 182717 DEBUG nova.network.neutron [req-fc7fdda7-f340-4907-8881-1793acb52b91 req-82ece5ee-a30f-499e-b3ee-1f53d8605f73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Updating instance_info_cache with network_info: [{"id": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "address": "fa:16:3e:42:f2:4a", "network": {"id": "b48e4868-bf7a-4c2f-b6e3-2d98c35b44de", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-746822329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2046237dff224b498e14ad59b5822ac1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f1f8d1-f4", "ovs_interfaceid": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:26 compute-1 nova_compute[182713]: 2026-01-21 23:44:26.242 182717 DEBUG oslo_concurrency.lockutils [req-fc7fdda7-f340-4907-8881-1793acb52b91 req-82ece5ee-a30f-499e-b3ee-1f53d8605f73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-5a9ebed0-dcef-427e-a805-574905569389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:44:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:27.670 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:44:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:27.671 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:44:27 compute-1 nova_compute[182713]: 2026-01-21 23:44:27.672 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:29 compute-1 nova_compute[182713]: 2026-01-21 23:44:29.424 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:29 compute-1 nova_compute[182713]: 2026-01-21 23:44:29.791 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:30 compute-1 podman[212104]: 2026-01-21 23:44:30.577164252 +0000 UTC m=+0.075415610 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:44:30 compute-1 ovn_controller[94841]: 2026-01-21T23:44:30Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:42:f2:4a 10.100.0.9
Jan 21 23:44:30 compute-1 ovn_controller[94841]: 2026-01-21T23:44:30Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:42:f2:4a 10.100.0.9
Jan 21 23:44:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:31.673 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:33 compute-1 ovn_controller[94841]: 2026-01-21T23:44:33Z|00041|binding|INFO|Releasing lport a608e132-8265-4348-bc67-024ad89753c9 from this chassis (sb_readonly=0)
Jan 21 23:44:33 compute-1 nova_compute[182713]: 2026-01-21 23:44:33.207 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:33 compute-1 podman[212125]: 2026-01-21 23:44:33.613177121 +0000 UTC m=+0.107439631 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, release=1755695350, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 23:44:34 compute-1 nova_compute[182713]: 2026-01-21 23:44:34.346 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039059.3458834, 6f64b039-da3a-47ef-9a52-b259b890a077 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:34 compute-1 nova_compute[182713]: 2026-01-21 23:44:34.347 182717 INFO nova.compute.manager [-] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] VM Stopped (Lifecycle Event)
Jan 21 23:44:34 compute-1 nova_compute[182713]: 2026-01-21 23:44:34.371 182717 DEBUG nova.compute.manager [None req-b26bf741-2aeb-4eba-bca5-5484361ef1fd - - - - - -] [instance: 6f64b039-da3a-47ef-9a52-b259b890a077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:34 compute-1 nova_compute[182713]: 2026-01-21 23:44:34.428 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:34 compute-1 nova_compute[182713]: 2026-01-21 23:44:34.793 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:39 compute-1 nova_compute[182713]: 2026-01-21 23:44:39.432 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:39 compute-1 nova_compute[182713]: 2026-01-21 23:44:39.796 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:40 compute-1 nova_compute[182713]: 2026-01-21 23:44:40.795 182717 DEBUG oslo_concurrency.lockutils [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Acquiring lock "5a9ebed0-dcef-427e-a805-574905569389" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:40 compute-1 nova_compute[182713]: 2026-01-21 23:44:40.797 182717 DEBUG oslo_concurrency.lockutils [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:40 compute-1 nova_compute[182713]: 2026-01-21 23:44:40.797 182717 DEBUG oslo_concurrency.lockutils [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Acquiring lock "5a9ebed0-dcef-427e-a805-574905569389-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:40 compute-1 nova_compute[182713]: 2026-01-21 23:44:40.798 182717 DEBUG oslo_concurrency.lockutils [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:40 compute-1 nova_compute[182713]: 2026-01-21 23:44:40.798 182717 DEBUG oslo_concurrency.lockutils [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:40 compute-1 nova_compute[182713]: 2026-01-21 23:44:40.848 182717 INFO nova.compute.manager [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Terminating instance
Jan 21 23:44:40 compute-1 nova_compute[182713]: 2026-01-21 23:44:40.993 182717 DEBUG nova.compute.manager [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:44:41 compute-1 kernel: tap37f1f8d1-f4 (unregistering): left promiscuous mode
Jan 21 23:44:41 compute-1 NetworkManager[54952]: <info>  [1769039081.0180] device (tap37f1f8d1-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.027 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:41 compute-1 ovn_controller[94841]: 2026-01-21T23:44:41Z|00042|binding|INFO|Releasing lport 37f1f8d1-f4ea-416d-ba73-6fbc611802be from this chassis (sb_readonly=0)
Jan 21 23:44:41 compute-1 ovn_controller[94841]: 2026-01-21T23:44:41Z|00043|binding|INFO|Setting lport 37f1f8d1-f4ea-416d-ba73-6fbc611802be down in Southbound
Jan 21 23:44:41 compute-1 ovn_controller[94841]: 2026-01-21T23:44:41Z|00044|binding|INFO|Removing iface tap37f1f8d1-f4 ovn-installed in OVS
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.030 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.049 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:f2:4a 10.100.0.9'], port_security=['fa:16:3e:42:f2:4a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5a9ebed0-dcef-427e-a805-574905569389', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2046237dff224b498e14ad59b5822ac1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4bf21f67-1fb4-4de2-8a56-49bd370458b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db648dcf-ceff-4628-b6a4-f9104e5f375b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=37f1f8d1-f4ea-416d-ba73-6fbc611802be) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.049 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.054 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 37f1f8d1-f4ea-416d-ba73-6fbc611802be in datapath b48e4868-bf7a-4c2f-b6e3-2d98c35b44de unbound from our chassis
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.058 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b48e4868-bf7a-4c2f-b6e3-2d98c35b44de, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:44:41 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 21 23:44:41 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 12.425s CPU time.
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.061 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[90736e12-bcc4-4a53-800a-22b0a68802e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.063 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de namespace which is not needed anymore
Jan 21 23:44:41 compute-1 systemd-machined[153970]: Machine qemu-4-instance-00000007 terminated.
Jan 21 23:44:41 compute-1 podman[212151]: 2026-01-21 23:44:41.132365987 +0000 UTC m=+0.071346069 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:44:41 compute-1 podman[212148]: 2026-01-21 23:44:41.172006074 +0000 UTC m=+0.109570949 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 21 23:44:41 compute-1 neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de[211962]: [NOTICE]   (211966) : haproxy version is 2.8.14-c23fe91
Jan 21 23:44:41 compute-1 neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de[211962]: [NOTICE]   (211966) : path to executable is /usr/sbin/haproxy
Jan 21 23:44:41 compute-1 neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de[211962]: [WARNING]  (211966) : Exiting Master process...
Jan 21 23:44:41 compute-1 neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de[211962]: [ALERT]    (211966) : Current worker (211968) exited with code 143 (Terminated)
Jan 21 23:44:41 compute-1 neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de[211962]: [WARNING]  (211966) : All workers exited. Exiting... (0)
Jan 21 23:44:41 compute-1 systemd[1]: libpod-f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286.scope: Deactivated successfully.
Jan 21 23:44:41 compute-1 podman[212215]: 2026-01-21 23:44:41.250583807 +0000 UTC m=+0.075857046 container died f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.272 182717 INFO nova.virt.libvirt.driver [-] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Instance destroyed successfully.
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.273 182717 DEBUG nova.objects.instance [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lazy-loading 'resources' on Instance uuid 5a9ebed0-dcef-427e-a805-574905569389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.288 182717 DEBUG nova.virt.libvirt.vif [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:44:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1964291827',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1964291827',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(10),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1964291827',id=7,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=10,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMc2S1xHovsuR3v2I5l9J02AovvG576qE+PLZ2JMDE3YonbQUTzxCwL7O2BjXojm40p7I5K0N5rNZT68qJMNaZU2vWBXKRORRm5xx7rGTVYFpWqb/Ex26+4ExPg1IszR1Q==',key_name='tempest-keypair-2129534703',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:44:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2046237dff224b498e14ad59b5822ac1',ramdisk_id='',reservation_id='r-6wca9wxj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1888253139',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1888253139-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:44:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a5dd6b4c1bdc4bdd90966604b07abf04',uuid=5a9ebed0-dcef-427e-a805-574905569389,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "address": "fa:16:3e:42:f2:4a", "network": {"id": "b48e4868-bf7a-4c2f-b6e3-2d98c35b44de", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-746822329-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2046237dff224b498e14ad59b5822ac1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f1f8d1-f4", "ovs_interfaceid": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.288 182717 DEBUG nova.network.os_vif_util [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Converting VIF {"id": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "address": "fa:16:3e:42:f2:4a", "network": {"id": "b48e4868-bf7a-4c2f-b6e3-2d98c35b44de", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-746822329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2046237dff224b498e14ad59b5822ac1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f1f8d1-f4", "ovs_interfaceid": "37f1f8d1-f4ea-416d-ba73-6fbc611802be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.289 182717 DEBUG nova.network.os_vif_util [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:42:f2:4a,bridge_name='br-int',has_traffic_filtering=True,id=37f1f8d1-f4ea-416d-ba73-6fbc611802be,network=Network(b48e4868-bf7a-4c2f-b6e3-2d98c35b44de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f1f8d1-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.290 182717 DEBUG os_vif [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:f2:4a,bridge_name='br-int',has_traffic_filtering=True,id=37f1f8d1-f4ea-416d-ba73-6fbc611802be,network=Network(b48e4868-bf7a-4c2f-b6e3-2d98c35b44de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f1f8d1-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.293 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.294 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37f1f8d1-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:41 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286-userdata-shm.mount: Deactivated successfully.
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.297 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.298 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:44:41 compute-1 systemd[1]: var-lib-containers-storage-overlay-b4d9834db4d7325d8c9b2b5172da1a0fac65cb11a11b30a1584f2d7d340944e8-merged.mount: Deactivated successfully.
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.299 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:41 compute-1 podman[212215]: 2026-01-21 23:44:41.300813958 +0000 UTC m=+0.126087197 container cleanup f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.307 182717 INFO os_vif [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:f2:4a,bridge_name='br-int',has_traffic_filtering=True,id=37f1f8d1-f4ea-416d-ba73-6fbc611802be,network=Network(b48e4868-bf7a-4c2f-b6e3-2d98c35b44de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f1f8d1-f4')
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.308 182717 INFO nova.virt.libvirt.driver [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Deleting instance files /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389_del
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.309 182717 INFO nova.virt.libvirt.driver [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Deletion of /var/lib/nova/instances/5a9ebed0-dcef-427e-a805-574905569389_del complete
Jan 21 23:44:41 compute-1 systemd[1]: libpod-conmon-f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286.scope: Deactivated successfully.
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.522 182717 INFO nova.compute.manager [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Took 0.53 seconds to destroy the instance on the hypervisor.
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.523 182717 DEBUG oslo.service.loopingcall [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.523 182717 DEBUG nova.compute.manager [-] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.524 182717 DEBUG nova.network.neutron [-] [instance: 5a9ebed0-dcef-427e-a805-574905569389] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.536 182717 DEBUG nova.compute.manager [req-a5d9b0a0-ae8f-44cc-8fc3-4dd91bbb3364 req-23b93b01-8834-4e6a-b81b-de7730d19f81 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Received event network-vif-unplugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.537 182717 DEBUG oslo_concurrency.lockutils [req-a5d9b0a0-ae8f-44cc-8fc3-4dd91bbb3364 req-23b93b01-8834-4e6a-b81b-de7730d19f81 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5a9ebed0-dcef-427e-a805-574905569389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.537 182717 DEBUG oslo_concurrency.lockutils [req-a5d9b0a0-ae8f-44cc-8fc3-4dd91bbb3364 req-23b93b01-8834-4e6a-b81b-de7730d19f81 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.537 182717 DEBUG oslo_concurrency.lockutils [req-a5d9b0a0-ae8f-44cc-8fc3-4dd91bbb3364 req-23b93b01-8834-4e6a-b81b-de7730d19f81 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.537 182717 DEBUG nova.compute.manager [req-a5d9b0a0-ae8f-44cc-8fc3-4dd91bbb3364 req-23b93b01-8834-4e6a-b81b-de7730d19f81 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] No waiting events found dispatching network-vif-unplugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.538 182717 DEBUG nova.compute.manager [req-a5d9b0a0-ae8f-44cc-8fc3-4dd91bbb3364 req-23b93b01-8834-4e6a-b81b-de7730d19f81 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Received event network-vif-unplugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:44:41 compute-1 podman[212263]: 2026-01-21 23:44:41.722283269 +0000 UTC m=+0.396937364 container remove f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.729 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f660d7cb-f037-4649-9597-bc095cb751b1]: (4, ('Wed Jan 21 11:44:41 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de (f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286)\nf62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286\nWed Jan 21 11:44:41 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de (f62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286)\nf62e7fda55a47ad0180b22c8ebd22638d7f19e4ab1c4ea78d95639ebbbcd6286\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.731 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[929c8e53-f1b8-47e2-bd22-71f749a07bf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.732 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb48e4868-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.734 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:41 compute-1 kernel: tapb48e4868-b0: left promiscuous mode
Jan 21 23:44:41 compute-1 nova_compute[182713]: 2026-01-21 23:44:41.758 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.763 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[07383255-c34d-40b6-843a-d5ef4640207d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.778 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ed2f89dc-d19c-4293-a92e-7dfc12a2800e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.780 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe6d93a-f768-4123-b68d-84fd97b6920c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.796 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[76bb8f9c-c020-4640-93ce-989801592e0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363117, 'reachable_time': 20252, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212278, 'error': None, 'target': 'ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:41 compute-1 systemd[1]: run-netns-ovnmeta\x2db48e4868\x2dbf7a\x2d4c2f\x2db6e3\x2d2d98c35b44de.mount: Deactivated successfully.
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.799 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b48e4868-bf7a-4c2f-b6e3-2d98c35b44de deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:44:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:44:41.800 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[da157bc1-6b3a-4c61-bded-d7e083061a11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:42 compute-1 nova_compute[182713]: 2026-01-21 23:44:42.867 182717 DEBUG nova.network.neutron [-] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:42 compute-1 nova_compute[182713]: 2026-01-21 23:44:42.906 182717 INFO nova.compute.manager [-] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Took 1.38 seconds to deallocate network for instance.
Jan 21 23:44:42 compute-1 nova_compute[182713]: 2026-01-21 23:44:42.992 182717 DEBUG oslo_concurrency.lockutils [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:42 compute-1 nova_compute[182713]: 2026-01-21 23:44:42.993 182717 DEBUG oslo_concurrency.lockutils [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.064 182717 DEBUG nova.compute.provider_tree [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.082 182717 DEBUG nova.scheduler.client.report [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.110 182717 DEBUG oslo_concurrency.lockutils [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.142 182717 INFO nova.scheduler.client.report [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Deleted allocations for instance 5a9ebed0-dcef-427e-a805-574905569389
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.224 182717 DEBUG oslo_concurrency.lockutils [None req-83f2dc2e-9a3c-4baf-a68c-2237649cb231 a5dd6b4c1bdc4bdd90966604b07abf04 2046237dff224b498e14ad59b5822ac1 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.428s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.827 182717 DEBUG nova.compute.manager [req-142ed5c4-2975-4cc2-a9c7-825ddd08ff58 req-e9f9faf2-ce24-47b7-9a84-b8a9d5f9b2e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Received event network-vif-plugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.828 182717 DEBUG oslo_concurrency.lockutils [req-142ed5c4-2975-4cc2-a9c7-825ddd08ff58 req-e9f9faf2-ce24-47b7-9a84-b8a9d5f9b2e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5a9ebed0-dcef-427e-a805-574905569389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.828 182717 DEBUG oslo_concurrency.lockutils [req-142ed5c4-2975-4cc2-a9c7-825ddd08ff58 req-e9f9faf2-ce24-47b7-9a84-b8a9d5f9b2e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.829 182717 DEBUG oslo_concurrency.lockutils [req-142ed5c4-2975-4cc2-a9c7-825ddd08ff58 req-e9f9faf2-ce24-47b7-9a84-b8a9d5f9b2e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5a9ebed0-dcef-427e-a805-574905569389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.829 182717 DEBUG nova.compute.manager [req-142ed5c4-2975-4cc2-a9c7-825ddd08ff58 req-e9f9faf2-ce24-47b7-9a84-b8a9d5f9b2e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] No waiting events found dispatching network-vif-plugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.830 182717 WARNING nova.compute.manager [req-142ed5c4-2975-4cc2-a9c7-825ddd08ff58 req-e9f9faf2-ce24-47b7-9a84-b8a9d5f9b2e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Received unexpected event network-vif-plugged-37f1f8d1-f4ea-416d-ba73-6fbc611802be for instance with vm_state deleted and task_state None.
Jan 21 23:44:43 compute-1 nova_compute[182713]: 2026-01-21 23:44:43.830 182717 DEBUG nova.compute.manager [req-142ed5c4-2975-4cc2-a9c7-825ddd08ff58 req-e9f9faf2-ce24-47b7-9a84-b8a9d5f9b2e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Received event network-vif-deleted-37f1f8d1-f4ea-416d-ba73-6fbc611802be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:44 compute-1 nova_compute[182713]: 2026-01-21 23:44:44.837 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:46 compute-1 nova_compute[182713]: 2026-01-21 23:44:46.298 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:49 compute-1 nova_compute[182713]: 2026-01-21 23:44:49.839 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:50 compute-1 podman[212280]: 2026-01-21 23:44:50.618079329 +0000 UTC m=+0.093193837 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:44:50 compute-1 podman[212279]: 2026-01-21 23:44:50.619217717 +0000 UTC m=+0.097556690 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:44:51 compute-1 nova_compute[182713]: 2026-01-21 23:44:51.302 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:54 compute-1 nova_compute[182713]: 2026-01-21 23:44:54.840 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:55 compute-1 nova_compute[182713]: 2026-01-21 23:44:55.368 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:56 compute-1 nova_compute[182713]: 2026-01-21 23:44:56.271 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039081.2692566, 5a9ebed0-dcef-427e-a805-574905569389 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:56 compute-1 nova_compute[182713]: 2026-01-21 23:44:56.273 182717 INFO nova.compute.manager [-] [instance: 5a9ebed0-dcef-427e-a805-574905569389] VM Stopped (Lifecycle Event)
Jan 21 23:44:56 compute-1 nova_compute[182713]: 2026-01-21 23:44:56.299 182717 DEBUG nova.compute.manager [None req-d9fa741e-8ceb-466b-9334-af06debdcdce - - - - - -] [instance: 5a9ebed0-dcef-427e-a805-574905569389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:56 compute-1 nova_compute[182713]: 2026-01-21 23:44:56.305 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:56 compute-1 nova_compute[182713]: 2026-01-21 23:44:56.935 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Acquiring lock "32005fa1-7813-416f-90c7-0fb025a2c743" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:56 compute-1 nova_compute[182713]: 2026-01-21 23:44:56.936 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "32005fa1-7813-416f-90c7-0fb025a2c743" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:56 compute-1 nova_compute[182713]: 2026-01-21 23:44:56.958 182717 DEBUG nova.compute.manager [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.137 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.137 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.143 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.143 182717 INFO nova.compute.claims [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.368 182717 DEBUG nova.compute.provider_tree [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.388 182717 DEBUG nova.scheduler.client.report [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.420 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.421 182717 DEBUG nova.compute.manager [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.495 182717 DEBUG nova.compute.manager [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.496 182717 DEBUG nova.network.neutron [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.517 182717 INFO nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.549 182717 DEBUG nova.compute.manager [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.714 182717 DEBUG nova.compute.manager [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.716 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.716 182717 INFO nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Creating image(s)
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.717 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Acquiring lock "/var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.717 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "/var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.718 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "/var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.734 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.805 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.806 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.806 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.816 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.845 182717 DEBUG nova.network.neutron [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.846 182717 DEBUG nova.compute.manager [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.871 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:57 compute-1 nova_compute[182713]: 2026-01-21 23:44:57.872 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.005 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk 1073741824" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.007 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.007 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.077 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.079 182717 DEBUG nova.virt.disk.api [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Checking if we can resize image /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.080 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.139 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.141 182717 DEBUG nova.virt.disk.api [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Cannot resize image /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.142 182717 DEBUG nova.objects.instance [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lazy-loading 'migration_context' on Instance uuid 32005fa1-7813-416f-90c7-0fb025a2c743 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.164 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.165 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Ensure instance console log exists: /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.166 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.166 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.167 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.170 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.176 182717 WARNING nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.182 182717 DEBUG nova.virt.libvirt.host [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.183 182717 DEBUG nova.virt.libvirt.host [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.187 182717 DEBUG nova.virt.libvirt.host [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.187 182717 DEBUG nova.virt.libvirt.host [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.190 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.190 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.191 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.191 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.192 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.192 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.193 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.193 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.194 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.194 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.194 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.195 182717 DEBUG nova.virt.hardware [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.202 182717 DEBUG nova.objects.instance [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 32005fa1-7813-416f-90c7-0fb025a2c743 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.235 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:44:58 compute-1 nova_compute[182713]:   <uuid>32005fa1-7813-416f-90c7-0fb025a2c743</uuid>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   <name>instance-00000009</name>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerDiagnosticsTest-server-1202960472</nova:name>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:44:58</nova:creationTime>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:44:58 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:44:58 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:44:58 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:44:58 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:44:58 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:44:58 compute-1 nova_compute[182713]:         <nova:user uuid="d8df5485aa24472aa69d3b4afc7df2e1">tempest-ServerDiagnosticsTest-92790615-project-member</nova:user>
Jan 21 23:44:58 compute-1 nova_compute[182713]:         <nova:project uuid="866276d2b90c48bbb7b03ec864f892e4">tempest-ServerDiagnosticsTest-92790615</nova:project>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <system>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <entry name="serial">32005fa1-7813-416f-90c7-0fb025a2c743</entry>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <entry name="uuid">32005fa1-7813-416f-90c7-0fb025a2c743</entry>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     </system>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   <os>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   </os>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   <features>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   </features>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk.config"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/console.log" append="off"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <video>
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     </video>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:44:58 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:44:58 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:44:58 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:44:58 compute-1 nova_compute[182713]: </domain>
Jan 21 23:44:58 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.300 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.300 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.301 182717 INFO nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Using config drive
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.574 182717 INFO nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Creating config drive at /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk.config
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.579 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt1yzgxfm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.720 182717 DEBUG oslo_concurrency.processutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt1yzgxfm" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:58 compute-1 systemd-machined[153970]: New machine qemu-5-instance-00000009.
Jan 21 23:44:58 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-00000009.
Jan 21 23:44:58 compute-1 nova_compute[182713]: 2026-01-21 23:44:58.943 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.310 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039099.309805, 32005fa1-7813-416f-90c7-0fb025a2c743 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.310 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] VM Resumed (Lifecycle Event)
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.313 182717 DEBUG nova.compute.manager [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.313 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.316 182717 INFO nova.virt.libvirt.driver [-] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Instance spawned successfully.
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.316 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.339 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.346 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.349 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.349 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.350 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.350 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.350 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.351 182717 DEBUG nova.virt.libvirt.driver [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.378 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.379 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039099.3120365, 32005fa1-7813-416f-90c7-0fb025a2c743 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.379 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] VM Started (Lifecycle Event)
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.412 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.416 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.439 182717 INFO nova.compute.manager [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Took 1.72 seconds to spawn the instance on the hypervisor.
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.440 182717 DEBUG nova.compute.manager [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.446 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.517 182717 INFO nova.compute.manager [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Took 2.44 seconds to build instance.
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.542 182717 DEBUG oslo_concurrency.lockutils [None req-22c90f05-2714-474e-af72-3f5c9b3435cd d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "32005fa1-7813-416f-90c7-0fb025a2c743" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:59 compute-1 nova_compute[182713]: 2026-01-21 23:44:59.842 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:01 compute-1 nova_compute[182713]: 2026-01-21 23:45:01.308 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:01 compute-1 podman[212364]: 2026-01-21 23:45:01.597986129 +0000 UTC m=+0.086240882 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 23:45:01 compute-1 nova_compute[182713]: 2026-01-21 23:45:01.772 182717 DEBUG nova.compute.manager [None req-4f87af8b-b499-4c33-bd6d-678d3f43481b 618010a0282c4b2a9550483b92b43cd3 43ce42e3be7f4d8f8231d877bfe1196e - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:01 compute-1 nova_compute[182713]: 2026-01-21 23:45:01.780 182717 INFO nova.compute.manager [None req-4f87af8b-b499-4c33-bd6d-678d3f43481b 618010a0282c4b2a9550483b92b43cd3 43ce42e3be7f4d8f8231d877bfe1196e - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Retrieving diagnostics
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.115 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Acquiring lock "32005fa1-7813-416f-90c7-0fb025a2c743" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.116 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "32005fa1-7813-416f-90c7-0fb025a2c743" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.117 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Acquiring lock "32005fa1-7813-416f-90c7-0fb025a2c743-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.117 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "32005fa1-7813-416f-90c7-0fb025a2c743-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.117 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "32005fa1-7813-416f-90c7-0fb025a2c743-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.131 182717 INFO nova.compute.manager [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Terminating instance
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.139 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Acquiring lock "refresh_cache-32005fa1-7813-416f-90c7-0fb025a2c743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.139 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Acquired lock "refresh_cache-32005fa1-7813-416f-90c7-0fb025a2c743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.140 182717 DEBUG nova.network.neutron [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.494 182717 DEBUG nova.network.neutron [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.718 182717 DEBUG nova.network.neutron [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.739 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Releasing lock "refresh_cache-32005fa1-7813-416f-90c7-0fb025a2c743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.740 182717 DEBUG nova.compute.manager [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:45:02 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 21 23:45:02 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Consumed 3.969s CPU time.
Jan 21 23:45:02 compute-1 systemd-machined[153970]: Machine qemu-5-instance-00000009 terminated.
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.986 182717 INFO nova.virt.libvirt.driver [-] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Instance destroyed successfully.
Jan 21 23:45:02 compute-1 nova_compute[182713]: 2026-01-21 23:45:02.987 182717 DEBUG nova.objects.instance [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lazy-loading 'resources' on Instance uuid 32005fa1-7813-416f-90c7-0fb025a2c743 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:45:02.993 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:45:02.993 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:45:02.993 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.004 182717 INFO nova.virt.libvirt.driver [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Deleting instance files /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743_del
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.005 182717 INFO nova.virt.libvirt.driver [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Deletion of /var/lib/nova/instances/32005fa1-7813-416f-90c7-0fb025a2c743_del complete
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.124 182717 INFO nova.compute.manager [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.125 182717 DEBUG oslo.service.loopingcall [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.126 182717 DEBUG nova.compute.manager [-] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.126 182717 DEBUG nova.network.neutron [-] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.282 182717 DEBUG nova.network.neutron [-] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.309 182717 DEBUG nova.network.neutron [-] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.325 182717 INFO nova.compute.manager [-] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Took 0.20 seconds to deallocate network for instance.
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.438 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.439 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.516 182717 DEBUG nova.compute.provider_tree [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.540 182717 DEBUG nova.scheduler.client.report [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.574 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.634 182717 INFO nova.scheduler.client.report [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Deleted allocations for instance 32005fa1-7813-416f-90c7-0fb025a2c743
Jan 21 23:45:03 compute-1 nova_compute[182713]: 2026-01-21 23:45:03.740 182717 DEBUG oslo_concurrency.lockutils [None req-a013f38c-6579-463f-ab53-5114f646fe37 d8df5485aa24472aa69d3b4afc7df2e1 866276d2b90c48bbb7b03ec864f892e4 - - default default] Lock "32005fa1-7813-416f-90c7-0fb025a2c743" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:04 compute-1 podman[212394]: 2026-01-21 23:45:04.560640405 +0000 UTC m=+0.054496951 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, config_id=openstack_network_exporter, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64)
Jan 21 23:45:04 compute-1 nova_compute[182713]: 2026-01-21 23:45:04.845 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:05 compute-1 nova_compute[182713]: 2026-01-21 23:45:05.609 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.042 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Acquiring lock "fbff60b0-209e-4bdb-bc7a-f22348c41e4a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.042 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "fbff60b0-209e-4bdb-bc7a-f22348c41e4a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.080 182717 DEBUG nova.compute.manager [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.175 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.176 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.185 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.186 182717 INFO nova.compute.claims [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.312 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.324 182717 DEBUG nova.compute.provider_tree [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.341 182717 DEBUG nova.scheduler.client.report [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.389 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.390 182717 DEBUG nova.compute.manager [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.451 182717 DEBUG nova.compute.manager [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.452 182717 DEBUG nova.network.neutron [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.478 182717 INFO nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.500 182717 DEBUG nova.compute.manager [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.621 182717 DEBUG nova.compute.manager [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.623 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.624 182717 INFO nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Creating image(s)
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.625 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Acquiring lock "/var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.626 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "/var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.627 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "/var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.660 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.731 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.733 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.734 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.757 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.849 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.850 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.893 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.896 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.896 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.984 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.986 182717 DEBUG nova.virt.disk.api [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Checking if we can resize image /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:45:06 compute-1 nova_compute[182713]: 2026-01-21 23:45:06.987 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.051 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.053 182717 DEBUG nova.virt.disk.api [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Cannot resize image /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.054 182717 DEBUG nova.objects.instance [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lazy-loading 'migration_context' on Instance uuid fbff60b0-209e-4bdb-bc7a-f22348c41e4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.084 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.084 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Ensure instance console log exists: /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.085 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.086 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.087 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.542 182717 DEBUG nova.network.neutron [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.543 182717 DEBUG nova.compute.manager [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.546 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.552 182717 WARNING nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.557 182717 DEBUG nova.virt.libvirt.host [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.558 182717 DEBUG nova.virt.libvirt.host [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.561 182717 DEBUG nova.virt.libvirt.host [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.562 182717 DEBUG nova.virt.libvirt.host [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.563 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.564 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.565 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.565 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.565 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.565 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.566 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.566 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.566 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.567 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.567 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.567 182717 DEBUG nova.virt.hardware [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.572 182717 DEBUG nova.objects.instance [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lazy-loading 'pci_devices' on Instance uuid fbff60b0-209e-4bdb-bc7a-f22348c41e4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.591 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:45:07 compute-1 nova_compute[182713]:   <uuid>fbff60b0-209e-4bdb-bc7a-f22348c41e4a</uuid>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   <name>instance-0000000a</name>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerDiagnosticsNegativeTest-server-444868802</nova:name>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:45:07</nova:creationTime>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:45:07 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:45:07 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:45:07 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:45:07 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:45:07 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:45:07 compute-1 nova_compute[182713]:         <nova:user uuid="9a864b8e66b942ce83a24a041833e808">tempest-ServerDiagnosticsNegativeTest-1836608143-project-member</nova:user>
Jan 21 23:45:07 compute-1 nova_compute[182713]:         <nova:project uuid="e130d5a628fc48bfaf6e82d512f6073d">tempest-ServerDiagnosticsNegativeTest-1836608143</nova:project>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <system>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <entry name="serial">fbff60b0-209e-4bdb-bc7a-f22348c41e4a</entry>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <entry name="uuid">fbff60b0-209e-4bdb-bc7a-f22348c41e4a</entry>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     </system>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   <os>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   </os>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   <features>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   </features>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk.config"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/console.log" append="off"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <video>
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     </video>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:45:07 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:45:07 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:45:07 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:45:07 compute-1 nova_compute[182713]: </domain>
Jan 21 23:45:07 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.670 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.671 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.672 182717 INFO nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Using config drive
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.879 182717 INFO nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Creating config drive at /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk.config
Jan 21 23:45:07 compute-1 nova_compute[182713]: 2026-01-21 23:45:07.884 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpehdyfc27 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.005 182717 DEBUG oslo_concurrency.processutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpehdyfc27" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:08 compute-1 systemd-machined[153970]: New machine qemu-6-instance-0000000a.
Jan 21 23:45:08 compute-1 systemd[1]: Started Virtual Machine qemu-6-instance-0000000a.
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.465 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039108.4649055, fbff60b0-209e-4bdb-bc7a-f22348c41e4a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.466 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] VM Resumed (Lifecycle Event)
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.469 182717 DEBUG nova.compute.manager [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.470 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.476 182717 INFO nova.virt.libvirt.driver [-] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Instance spawned successfully.
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.477 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.493 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.502 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.507 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.508 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.508 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.509 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.510 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.511 182717 DEBUG nova.virt.libvirt.driver [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.542 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.543 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039108.4685729, fbff60b0-209e-4bdb-bc7a-f22348c41e4a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.543 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] VM Started (Lifecycle Event)
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.577 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.583 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.616 182717 INFO nova.compute.manager [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Took 1.99 seconds to spawn the instance on the hypervisor.
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.617 182717 DEBUG nova.compute.manager [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.620 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.719 182717 INFO nova.compute.manager [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Took 2.58 seconds to build instance.
Jan 21 23:45:08 compute-1 nova_compute[182713]: 2026-01-21 23:45:08.759 182717 DEBUG oslo_concurrency.lockutils [None req-e02ad10f-448d-4092-8949-8f02335b6925 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "fbff60b0-209e-4bdb-bc7a-f22348c41e4a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.716s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:09 compute-1 nova_compute[182713]: 2026-01-21 23:45:09.305 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Acquiring lock "fbff60b0-209e-4bdb-bc7a-f22348c41e4a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:09 compute-1 nova_compute[182713]: 2026-01-21 23:45:09.306 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "fbff60b0-209e-4bdb-bc7a-f22348c41e4a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:09 compute-1 nova_compute[182713]: 2026-01-21 23:45:09.306 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Acquiring lock "fbff60b0-209e-4bdb-bc7a-f22348c41e4a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:09 compute-1 nova_compute[182713]: 2026-01-21 23:45:09.307 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "fbff60b0-209e-4bdb-bc7a-f22348c41e4a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:09 compute-1 nova_compute[182713]: 2026-01-21 23:45:09.307 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "fbff60b0-209e-4bdb-bc7a-f22348c41e4a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:09 compute-1 nova_compute[182713]: 2026-01-21 23:45:09.321 182717 INFO nova.compute.manager [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Terminating instance
Jan 21 23:45:09 compute-1 nova_compute[182713]: 2026-01-21 23:45:09.335 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Acquiring lock "refresh_cache-fbff60b0-209e-4bdb-bc7a-f22348c41e4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:45:09 compute-1 nova_compute[182713]: 2026-01-21 23:45:09.336 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Acquired lock "refresh_cache-fbff60b0-209e-4bdb-bc7a-f22348c41e4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:45:09 compute-1 nova_compute[182713]: 2026-01-21 23:45:09.336 182717 DEBUG nova.network.neutron [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:45:09 compute-1 nova_compute[182713]: 2026-01-21 23:45:09.536 182717 DEBUG nova.network.neutron [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:45:09 compute-1 nova_compute[182713]: 2026-01-21 23:45:09.850 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:10 compute-1 nova_compute[182713]: 2026-01-21 23:45:10.534 182717 DEBUG nova.network.neutron [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:10 compute-1 nova_compute[182713]: 2026-01-21 23:45:10.560 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Releasing lock "refresh_cache-fbff60b0-209e-4bdb-bc7a-f22348c41e4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:45:10 compute-1 nova_compute[182713]: 2026-01-21 23:45:10.561 182717 DEBUG nova.compute.manager [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:45:10 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 21 23:45:10 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Consumed 2.437s CPU time.
Jan 21 23:45:10 compute-1 systemd-machined[153970]: Machine qemu-6-instance-0000000a terminated.
Jan 21 23:45:10 compute-1 nova_compute[182713]: 2026-01-21 23:45:10.824 182717 INFO nova.virt.libvirt.driver [-] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Instance destroyed successfully.
Jan 21 23:45:10 compute-1 nova_compute[182713]: 2026-01-21 23:45:10.825 182717 DEBUG nova.objects.instance [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lazy-loading 'resources' on Instance uuid fbff60b0-209e-4bdb-bc7a-f22348c41e4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:10 compute-1 nova_compute[182713]: 2026-01-21 23:45:10.841 182717 INFO nova.virt.libvirt.driver [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Deleting instance files /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a_del
Jan 21 23:45:10 compute-1 nova_compute[182713]: 2026-01-21 23:45:10.843 182717 INFO nova.virt.libvirt.driver [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Deletion of /var/lib/nova/instances/fbff60b0-209e-4bdb-bc7a-f22348c41e4a_del complete
Jan 21 23:45:10 compute-1 nova_compute[182713]: 2026-01-21 23:45:10.929 182717 INFO nova.compute.manager [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 21 23:45:10 compute-1 nova_compute[182713]: 2026-01-21 23:45:10.930 182717 DEBUG oslo.service.loopingcall [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:45:10 compute-1 nova_compute[182713]: 2026-01-21 23:45:10.930 182717 DEBUG nova.compute.manager [-] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:45:10 compute-1 nova_compute[182713]: 2026-01-21 23:45:10.930 182717 DEBUG nova.network.neutron [-] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:45:11 compute-1 nova_compute[182713]: 2026-01-21 23:45:11.352 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:11 compute-1 nova_compute[182713]: 2026-01-21 23:45:11.535 182717 DEBUG nova.network.neutron [-] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:45:11 compute-1 nova_compute[182713]: 2026-01-21 23:45:11.556 182717 DEBUG nova.network.neutron [-] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:11 compute-1 nova_compute[182713]: 2026-01-21 23:45:11.571 182717 INFO nova.compute.manager [-] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Took 0.64 seconds to deallocate network for instance.
Jan 21 23:45:11 compute-1 podman[212467]: 2026-01-21 23:45:11.602920654 +0000 UTC m=+0.072909136 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:45:11 compute-1 podman[212466]: 2026-01-21 23:45:11.611185129 +0000 UTC m=+0.092448508 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:45:11 compute-1 nova_compute[182713]: 2026-01-21 23:45:11.658 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:11 compute-1 nova_compute[182713]: 2026-01-21 23:45:11.659 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:11 compute-1 nova_compute[182713]: 2026-01-21 23:45:11.740 182717 DEBUG nova.compute.provider_tree [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:45:11 compute-1 nova_compute[182713]: 2026-01-21 23:45:11.767 182717 DEBUG nova.scheduler.client.report [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:45:11 compute-1 nova_compute[182713]: 2026-01-21 23:45:11.808 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:11 compute-1 nova_compute[182713]: 2026-01-21 23:45:11.841 182717 INFO nova.scheduler.client.report [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Deleted allocations for instance fbff60b0-209e-4bdb-bc7a-f22348c41e4a
Jan 21 23:45:11 compute-1 nova_compute[182713]: 2026-01-21 23:45:11.957 182717 DEBUG oslo_concurrency.lockutils [None req-db9e9609-2cf9-4e57-99ab-31e0bbda3307 9a864b8e66b942ce83a24a041833e808 e130d5a628fc48bfaf6e82d512f6073d - - default default] Lock "fbff60b0-209e-4bdb-bc7a-f22348c41e4a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:12 compute-1 nova_compute[182713]: 2026-01-21 23:45:12.438 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:14 compute-1 nova_compute[182713]: 2026-01-21 23:45:14.867 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.324 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Acquiring lock "6154084a-6bed-48b4-9c3c-2592cfc58fb1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.325 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "6154084a-6bed-48b4-9c3c-2592cfc58fb1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.346 182717 DEBUG nova.compute.manager [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.354 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.466 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.467 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.478 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.478 182717 INFO nova.compute.claims [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.698 182717 DEBUG nova.compute.provider_tree [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.735 182717 DEBUG nova.scheduler.client.report [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.773 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.774 182717 DEBUG nova.compute.manager [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.854 182717 DEBUG nova.compute.manager [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.875 182717 INFO nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:45:16 compute-1 nova_compute[182713]: 2026-01-21 23:45:16.896 182717 DEBUG nova.compute.manager [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.055 182717 DEBUG nova.compute.manager [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.056 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.057 182717 INFO nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Creating image(s)
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.057 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Acquiring lock "/var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.058 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "/var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.059 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "/var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.073 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.161 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.164 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.165 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.191 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.246 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.247 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.303 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.305 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.306 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.393 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.395 182717 DEBUG nova.virt.disk.api [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Checking if we can resize image /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.395 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.454 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.455 182717 DEBUG nova.virt.disk.api [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Cannot resize image /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.455 182717 DEBUG nova.objects.instance [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lazy-loading 'migration_context' on Instance uuid 6154084a-6bed-48b4-9c3c-2592cfc58fb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.473 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.473 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Ensure instance console log exists: /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.474 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.475 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.475 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.478 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.486 182717 WARNING nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.498 182717 DEBUG nova.virt.libvirt.host [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.499 182717 DEBUG nova.virt.libvirt.host [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.503 182717 DEBUG nova.virt.libvirt.host [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.504 182717 DEBUG nova.virt.libvirt.host [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.507 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.507 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.508 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.508 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.509 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.509 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.510 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.510 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.511 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.511 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.512 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.512 182717 DEBUG nova.virt.hardware [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.519 182717 DEBUG nova.objects.instance [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lazy-loading 'pci_devices' on Instance uuid 6154084a-6bed-48b4-9c3c-2592cfc58fb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.536 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:45:17 compute-1 nova_compute[182713]:   <uuid>6154084a-6bed-48b4-9c3c-2592cfc58fb1</uuid>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   <name>instance-0000000b</name>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerDiagnosticsV248Test-server-1744202673</nova:name>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:45:17</nova:creationTime>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:45:17 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:45:17 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:45:17 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:45:17 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:45:17 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:45:17 compute-1 nova_compute[182713]:         <nova:user uuid="e542cae206074b80ba3fce2fae031917">tempest-ServerDiagnosticsV248Test-1860487327-project-member</nova:user>
Jan 21 23:45:17 compute-1 nova_compute[182713]:         <nova:project uuid="503e70549cf444b1bb5cde0daa90b93c">tempest-ServerDiagnosticsV248Test-1860487327</nova:project>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <system>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <entry name="serial">6154084a-6bed-48b4-9c3c-2592cfc58fb1</entry>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <entry name="uuid">6154084a-6bed-48b4-9c3c-2592cfc58fb1</entry>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     </system>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   <os>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   </os>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   <features>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   </features>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk.config"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/console.log" append="off"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <video>
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     </video>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:45:17 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:45:17 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:45:17 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:45:17 compute-1 nova_compute[182713]: </domain>
Jan 21 23:45:17 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.597 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.597 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.601 182717 INFO nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Using config drive
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.850 182717 INFO nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Creating config drive at /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk.config
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.861 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpenm91_dd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.985 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039102.9837368, 32005fa1-7813-416f-90c7-0fb025a2c743 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:17 compute-1 nova_compute[182713]: 2026-01-21 23:45:17.986 182717 INFO nova.compute.manager [-] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] VM Stopped (Lifecycle Event)
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.003 182717 DEBUG oslo_concurrency.processutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpenm91_dd" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.020 182717 DEBUG nova.compute.manager [None req-8603e25d-2884-4c44-88ea-7671eaffccad - - - - - -] [instance: 32005fa1-7813-416f-90c7-0fb025a2c743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:18 compute-1 systemd-machined[153970]: New machine qemu-7-instance-0000000b.
Jan 21 23:45:18 compute-1 systemd[1]: Started Virtual Machine qemu-7-instance-0000000b.
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.640 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039118.639551, 6154084a-6bed-48b4-9c3c-2592cfc58fb1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.640 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] VM Resumed (Lifecycle Event)
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.643 182717 DEBUG nova.compute.manager [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.643 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.647 182717 INFO nova.virt.libvirt.driver [-] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Instance spawned successfully.
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.648 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.678 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.685 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.688 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.689 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.689 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.690 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.690 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.691 182717 DEBUG nova.virt.libvirt.driver [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.733 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.734 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039118.642599, 6154084a-6bed-48b4-9c3c-2592cfc58fb1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.734 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] VM Started (Lifecycle Event)
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.763 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.766 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.815 182717 INFO nova.compute.manager [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Took 1.76 seconds to spawn the instance on the hypervisor.
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.815 182717 DEBUG nova.compute.manager [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.834 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.908 182717 INFO nova.compute.manager [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Took 2.48 seconds to build instance.
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.913 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.913 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.938 182717 DEBUG oslo_concurrency.lockutils [None req-8ef05d51-73ac-4086-abe2-a999ed31fa0f e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "6154084a-6bed-48b4-9c3c-2592cfc58fb1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.940 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.941 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:45:18 compute-1 nova_compute[182713]: 2026-01-21 23:45:18.941 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.107 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-6154084a-6bed-48b4-9c3c-2592cfc58fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.108 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-6154084a-6bed-48b4-9c3c-2592cfc58fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.108 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.109 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6154084a-6bed-48b4-9c3c-2592cfc58fb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.359 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.657 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.724 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.743 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-6154084a-6bed-48b4-9c3c-2592cfc58fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.744 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.745 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.745 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.746 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.746 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.747 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.747 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.824 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.824 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.825 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.825 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.870 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.881 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:19 compute-1 nova_compute[182713]: 2026-01-21 23:45:19.920 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.020 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.022 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.119 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.285 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.286 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5641MB free_disk=73.38225555419922GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.286 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.286 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.362 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 6154084a-6bed-48b4-9c3c-2592cfc58fb1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.363 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.364 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.420 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.456 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.488 182717 DEBUG nova.compute.manager [None req-9143f5e9-cdfd-4e52-9f75-81dd9693b6d4 575d7c5cd7f3405095b350b748f9291e 196008547b5a471a857bce2650bdb761 - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.491 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.491 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.496 182717 INFO nova.compute.manager [None req-9143f5e9-cdfd-4e52-9f75-81dd9693b6d4 575d7c5cd7f3405095b350b748f9291e 196008547b5a471a857bce2650bdb761 - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Retrieving diagnostics
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.603 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:20 compute-1 nova_compute[182713]: 2026-01-21 23:45:20.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:21 compute-1 nova_compute[182713]: 2026-01-21 23:45:21.358 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:21 compute-1 podman[212565]: 2026-01-21 23:45:21.634562707 +0000 UTC m=+0.106843062 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 21 23:45:21 compute-1 podman[212566]: 2026-01-21 23:45:21.63920112 +0000 UTC m=+0.106687966 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 23:45:24 compute-1 nova_compute[182713]: 2026-01-21 23:45:24.874 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:25 compute-1 nova_compute[182713]: 2026-01-21 23:45:25.821 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039110.8196652, fbff60b0-209e-4bdb-bc7a-f22348c41e4a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:25 compute-1 nova_compute[182713]: 2026-01-21 23:45:25.822 182717 INFO nova.compute.manager [-] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] VM Stopped (Lifecycle Event)
Jan 21 23:45:25 compute-1 nova_compute[182713]: 2026-01-21 23:45:25.848 182717 DEBUG nova.compute.manager [None req-7238273a-c98b-4f12-8385-2c3684d08ff2 - - - - - -] [instance: fbff60b0-209e-4bdb-bc7a-f22348c41e4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:26 compute-1 nova_compute[182713]: 2026-01-21 23:45:26.366 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:30 compute-1 nova_compute[182713]: 2026-01-21 23:45:30.104 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:30 compute-1 nova_compute[182713]: 2026-01-21 23:45:30.795 182717 DEBUG nova.compute.manager [None req-450961bf-7fa4-471b-8fed-e5feeef513ee 575d7c5cd7f3405095b350b748f9291e 196008547b5a471a857bce2650bdb761 - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:30 compute-1 nova_compute[182713]: 2026-01-21 23:45:30.800 182717 INFO nova.compute.manager [None req-450961bf-7fa4-471b-8fed-e5feeef513ee 575d7c5cd7f3405095b350b748f9291e 196008547b5a471a857bce2650bdb761 - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Retrieving diagnostics
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.053 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Acquiring lock "6154084a-6bed-48b4-9c3c-2592cfc58fb1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.054 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "6154084a-6bed-48b4-9c3c-2592cfc58fb1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.054 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Acquiring lock "6154084a-6bed-48b4-9c3c-2592cfc58fb1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.055 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "6154084a-6bed-48b4-9c3c-2592cfc58fb1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.055 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "6154084a-6bed-48b4-9c3c-2592cfc58fb1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.070 182717 INFO nova.compute.manager [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Terminating instance
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.083 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Acquiring lock "refresh_cache-6154084a-6bed-48b4-9c3c-2592cfc58fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.083 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Acquired lock "refresh_cache-6154084a-6bed-48b4-9c3c-2592cfc58fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.083 182717 DEBUG nova.network.neutron [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.305 182717 DEBUG nova.network.neutron [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.368 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.585 182717 DEBUG nova.network.neutron [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.610 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Releasing lock "refresh_cache-6154084a-6bed-48b4-9c3c-2592cfc58fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.611 182717 DEBUG nova.compute.manager [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:45:31 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 21 23:45:31 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Consumed 12.040s CPU time.
Jan 21 23:45:31 compute-1 systemd-machined[153970]: Machine qemu-7-instance-0000000b terminated.
Jan 21 23:45:31 compute-1 podman[212620]: 2026-01-21 23:45:31.750036212 +0000 UTC m=+0.084001588 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.879 182717 INFO nova.virt.libvirt.driver [-] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Instance destroyed successfully.
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.880 182717 DEBUG nova.objects.instance [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lazy-loading 'resources' on Instance uuid 6154084a-6bed-48b4-9c3c-2592cfc58fb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.906 182717 INFO nova.virt.libvirt.driver [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Deleting instance files /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1_del
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.907 182717 INFO nova.virt.libvirt.driver [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Deletion of /var/lib/nova/instances/6154084a-6bed-48b4-9c3c-2592cfc58fb1_del complete
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.992 182717 INFO nova.compute.manager [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.993 182717 DEBUG oslo.service.loopingcall [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.993 182717 DEBUG nova.compute.manager [-] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:45:31 compute-1 nova_compute[182713]: 2026-01-21 23:45:31.993 182717 DEBUG nova.network.neutron [-] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:45:32 compute-1 nova_compute[182713]: 2026-01-21 23:45:32.218 182717 DEBUG nova.network.neutron [-] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:45:32 compute-1 nova_compute[182713]: 2026-01-21 23:45:32.231 182717 DEBUG nova.network.neutron [-] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:32 compute-1 nova_compute[182713]: 2026-01-21 23:45:32.245 182717 INFO nova.compute.manager [-] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Took 0.25 seconds to deallocate network for instance.
Jan 21 23:45:32 compute-1 nova_compute[182713]: 2026-01-21 23:45:32.387 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:32 compute-1 nova_compute[182713]: 2026-01-21 23:45:32.387 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:32 compute-1 nova_compute[182713]: 2026-01-21 23:45:32.457 182717 DEBUG nova.compute.provider_tree [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:45:32 compute-1 nova_compute[182713]: 2026-01-21 23:45:32.480 182717 DEBUG nova.scheduler.client.report [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:45:32 compute-1 nova_compute[182713]: 2026-01-21 23:45:32.508 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:32 compute-1 nova_compute[182713]: 2026-01-21 23:45:32.543 182717 INFO nova.scheduler.client.report [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Deleted allocations for instance 6154084a-6bed-48b4-9c3c-2592cfc58fb1
Jan 21 23:45:32 compute-1 nova_compute[182713]: 2026-01-21 23:45:32.645 182717 DEBUG oslo_concurrency.lockutils [None req-a9b3180e-b779-4a18-80d7-18efab36509a e542cae206074b80ba3fce2fae031917 503e70549cf444b1bb5cde0daa90b93c - - default default] Lock "6154084a-6bed-48b4-9c3c-2592cfc58fb1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:34 compute-1 nova_compute[182713]: 2026-01-21 23:45:34.877 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:45:35.008 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:45:35 compute-1 nova_compute[182713]: 2026-01-21 23:45:35.009 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:45:35.010 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:45:35 compute-1 podman[212649]: 2026-01-21 23:45:35.584533052 +0000 UTC m=+0.072689879 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git)
Jan 21 23:45:36 compute-1 nova_compute[182713]: 2026-01-21 23:45:36.412 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:39 compute-1 nova_compute[182713]: 2026-01-21 23:45:39.921 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:41 compute-1 nova_compute[182713]: 2026-01-21 23:45:41.417 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:42 compute-1 podman[212672]: 2026-01-21 23:45:42.620810098 +0000 UTC m=+0.093026996 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:45:42 compute-1 podman[212671]: 2026-01-21 23:45:42.673571183 +0000 UTC m=+0.148471684 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 23:45:44 compute-1 nova_compute[182713]: 2026-01-21 23:45:44.926 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:45:45.012 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:45:46 compute-1 nova_compute[182713]: 2026-01-21 23:45:46.420 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:46 compute-1 nova_compute[182713]: 2026-01-21 23:45:46.879 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039131.8761973, 6154084a-6bed-48b4-9c3c-2592cfc58fb1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:46 compute-1 nova_compute[182713]: 2026-01-21 23:45:46.879 182717 INFO nova.compute.manager [-] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] VM Stopped (Lifecycle Event)
Jan 21 23:45:46 compute-1 nova_compute[182713]: 2026-01-21 23:45:46.916 182717 DEBUG nova.compute.manager [None req-db93f26b-4735-4b6f-8c4f-ae564176e351 - - - - - -] [instance: 6154084a-6bed-48b4-9c3c-2592cfc58fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:48 compute-1 nova_compute[182713]: 2026-01-21 23:45:48.714 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Acquiring lock "b9bdcec2-f5a7-4696-ad27-2f26dcf41843" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:48 compute-1 nova_compute[182713]: 2026-01-21 23:45:48.715 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "b9bdcec2-f5a7-4696-ad27-2f26dcf41843" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:48 compute-1 nova_compute[182713]: 2026-01-21 23:45:48.755 182717 DEBUG nova.compute.manager [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:45:48 compute-1 nova_compute[182713]: 2026-01-21 23:45:48.940 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:48 compute-1 nova_compute[182713]: 2026-01-21 23:45:48.941 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:48 compute-1 nova_compute[182713]: 2026-01-21 23:45:48.956 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:45:48 compute-1 nova_compute[182713]: 2026-01-21 23:45:48.956 182717 INFO nova.compute.claims [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.229 182717 DEBUG nova.compute.provider_tree [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.303 182717 DEBUG nova.scheduler.client.report [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.401 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.460s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.403 182717 DEBUG nova.compute.manager [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.537 182717 DEBUG nova.compute.manager [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.537 182717 DEBUG nova.network.neutron [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.598 182717 INFO nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.638 182717 DEBUG nova.compute.manager [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.879 182717 DEBUG nova.compute.manager [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.880 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.881 182717 INFO nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Creating image(s)
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.882 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Acquiring lock "/var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.882 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "/var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.883 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "/var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.907 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.928 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.985 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.986 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:49 compute-1 nova_compute[182713]: 2026-01-21 23:45:49.986 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.001 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.073 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.073 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.109 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.110 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.110 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.164 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.167 182717 DEBUG nova.virt.disk.api [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Checking if we can resize image /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.168 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.254 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.256 182717 DEBUG nova.virt.disk.api [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Cannot resize image /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.257 182717 DEBUG nova.objects.instance [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lazy-loading 'migration_context' on Instance uuid b9bdcec2-f5a7-4696-ad27-2f26dcf41843 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.317 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.318 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Ensure instance console log exists: /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.319 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.321 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.321 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.807 182717 DEBUG nova.network.neutron [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.807 182717 DEBUG nova.compute.manager [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.809 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.814 182717 WARNING nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.821 182717 DEBUG nova.virt.libvirt.host [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.821 182717 DEBUG nova.virt.libvirt.host [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.824 182717 DEBUG nova.virt.libvirt.host [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.825 182717 DEBUG nova.virt.libvirt.host [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.826 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.827 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.827 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.827 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.828 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.828 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.829 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.829 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.829 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.829 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.830 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.830 182717 DEBUG nova.virt.hardware [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.835 182717 DEBUG nova.objects.instance [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lazy-loading 'pci_devices' on Instance uuid b9bdcec2-f5a7-4696-ad27-2f26dcf41843 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:50 compute-1 nova_compute[182713]: 2026-01-21 23:45:50.889 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:45:50 compute-1 nova_compute[182713]:   <uuid>b9bdcec2-f5a7-4696-ad27-2f26dcf41843</uuid>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   <name>instance-0000000e</name>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerExternalEventsTest-server-50132590</nova:name>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:45:50</nova:creationTime>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:45:50 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:45:50 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:45:50 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:45:50 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:45:50 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:45:50 compute-1 nova_compute[182713]:         <nova:user uuid="fff24668a81c4abaabc1ef80c01dd16a">tempest-ServerExternalEventsTest-253376490-project-member</nova:user>
Jan 21 23:45:50 compute-1 nova_compute[182713]:         <nova:project uuid="f501f52c8dad4a8491327ce3f5b6b271">tempest-ServerExternalEventsTest-253376490</nova:project>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <system>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <entry name="serial">b9bdcec2-f5a7-4696-ad27-2f26dcf41843</entry>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <entry name="uuid">b9bdcec2-f5a7-4696-ad27-2f26dcf41843</entry>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     </system>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   <os>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   </os>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   <features>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   </features>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk.config"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/console.log" append="off"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <video>
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     </video>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:45:50 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:45:50 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:45:50 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:45:50 compute-1 nova_compute[182713]: </domain>
Jan 21 23:45:50 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:45:51 compute-1 nova_compute[182713]: 2026-01-21 23:45:51.120 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:45:51 compute-1 nova_compute[182713]: 2026-01-21 23:45:51.120 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:45:51 compute-1 nova_compute[182713]: 2026-01-21 23:45:51.121 182717 INFO nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Using config drive
Jan 21 23:45:51 compute-1 nova_compute[182713]: 2026-01-21 23:45:51.423 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:51 compute-1 nova_compute[182713]: 2026-01-21 23:45:51.573 182717 INFO nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Creating config drive at /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk.config
Jan 21 23:45:51 compute-1 nova_compute[182713]: 2026-01-21 23:45:51.581 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprx64eb6a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:51 compute-1 nova_compute[182713]: 2026-01-21 23:45:51.722 182717 DEBUG oslo_concurrency.processutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprx64eb6a" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:51 compute-1 systemd-machined[153970]: New machine qemu-8-instance-0000000e.
Jan 21 23:45:51 compute-1 systemd[1]: Started Virtual Machine qemu-8-instance-0000000e.
Jan 21 23:45:51 compute-1 podman[212745]: 2026-01-21 23:45:51.844966217 +0000 UTC m=+0.075182333 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:45:51 compute-1 podman[212740]: 2026-01-21 23:45:51.845018568 +0000 UTC m=+0.079074004 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.280 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039152.279674, b9bdcec2-f5a7-4696-ad27-2f26dcf41843 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.282 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] VM Resumed (Lifecycle Event)
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.288 182717 DEBUG nova.compute.manager [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.289 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.294 182717 INFO nova.virt.libvirt.driver [-] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Instance spawned successfully.
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.294 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.335 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.342 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.348 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.349 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.350 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.350 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.351 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.352 182717 DEBUG nova.virt.libvirt.driver [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.389 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.390 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039152.2830765, b9bdcec2-f5a7-4696-ad27-2f26dcf41843 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.390 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] VM Started (Lifecycle Event)
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.425 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.429 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.454 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.481 182717 INFO nova.compute.manager [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Took 2.60 seconds to spawn the instance on the hypervisor.
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.482 182717 DEBUG nova.compute.manager [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.596 182717 INFO nova.compute.manager [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Took 3.72 seconds to build instance.
Jan 21 23:45:52 compute-1 nova_compute[182713]: 2026-01-21 23:45:52.643 182717 DEBUG oslo_concurrency.lockutils [None req-78e068c7-4328-48f5-9c4b-c3b06ccd7d1d fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "b9bdcec2-f5a7-4696-ad27-2f26dcf41843" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.530 182717 DEBUG nova.compute.manager [None req-82323f18-270a-4a90-b89e-f974dbf8af67 6d2cdfa2f4c340a7b16d13c38184f9f4 16e1c0a4972f4b3ea48313b46120db94 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Received event network-changed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.531 182717 DEBUG nova.compute.manager [None req-82323f18-270a-4a90-b89e-f974dbf8af67 6d2cdfa2f4c340a7b16d13c38184f9f4 16e1c0a4972f4b3ea48313b46120db94 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Refreshing instance network info cache due to event network-changed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.532 182717 DEBUG oslo_concurrency.lockutils [None req-82323f18-270a-4a90-b89e-f974dbf8af67 6d2cdfa2f4c340a7b16d13c38184f9f4 16e1c0a4972f4b3ea48313b46120db94 - - default default] Acquiring lock "refresh_cache-b9bdcec2-f5a7-4696-ad27-2f26dcf41843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.532 182717 DEBUG oslo_concurrency.lockutils [None req-82323f18-270a-4a90-b89e-f974dbf8af67 6d2cdfa2f4c340a7b16d13c38184f9f4 16e1c0a4972f4b3ea48313b46120db94 - - default default] Acquired lock "refresh_cache-b9bdcec2-f5a7-4696-ad27-2f26dcf41843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.533 182717 DEBUG nova.network.neutron [None req-82323f18-270a-4a90-b89e-f974dbf8af67 6d2cdfa2f4c340a7b16d13c38184f9f4 16e1c0a4972f4b3ea48313b46120db94 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.854 182717 DEBUG nova.network.neutron [None req-82323f18-270a-4a90-b89e-f974dbf8af67 6d2cdfa2f4c340a7b16d13c38184f9f4 16e1c0a4972f4b3ea48313b46120db94 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.877 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Acquiring lock "b9bdcec2-f5a7-4696-ad27-2f26dcf41843" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.878 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "b9bdcec2-f5a7-4696-ad27-2f26dcf41843" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.879 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Acquiring lock "b9bdcec2-f5a7-4696-ad27-2f26dcf41843-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.879 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "b9bdcec2-f5a7-4696-ad27-2f26dcf41843-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.879 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "b9bdcec2-f5a7-4696-ad27-2f26dcf41843-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.898 182717 INFO nova.compute.manager [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Terminating instance
Jan 21 23:45:53 compute-1 nova_compute[182713]: 2026-01-21 23:45:53.915 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Acquiring lock "refresh_cache-b9bdcec2-f5a7-4696-ad27-2f26dcf41843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:45:54 compute-1 ovn_controller[94841]: 2026-01-21T23:45:54Z|00045|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 21 23:45:54 compute-1 nova_compute[182713]: 2026-01-21 23:45:54.321 182717 DEBUG nova.network.neutron [None req-82323f18-270a-4a90-b89e-f974dbf8af67 6d2cdfa2f4c340a7b16d13c38184f9f4 16e1c0a4972f4b3ea48313b46120db94 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:54 compute-1 nova_compute[182713]: 2026-01-21 23:45:54.341 182717 DEBUG oslo_concurrency.lockutils [None req-82323f18-270a-4a90-b89e-f974dbf8af67 6d2cdfa2f4c340a7b16d13c38184f9f4 16e1c0a4972f4b3ea48313b46120db94 - - default default] Releasing lock "refresh_cache-b9bdcec2-f5a7-4696-ad27-2f26dcf41843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:45:54 compute-1 nova_compute[182713]: 2026-01-21 23:45:54.342 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Acquired lock "refresh_cache-b9bdcec2-f5a7-4696-ad27-2f26dcf41843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:45:54 compute-1 nova_compute[182713]: 2026-01-21 23:45:54.342 182717 DEBUG nova.network.neutron [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:45:54 compute-1 nova_compute[182713]: 2026-01-21 23:45:54.929 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:55 compute-1 rsyslogd[1003]: imjournal: 4204 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 21 23:45:55 compute-1 nova_compute[182713]: 2026-01-21 23:45:55.359 182717 DEBUG nova.network.neutron [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:45:55 compute-1 nova_compute[182713]: 2026-01-21 23:45:55.888 182717 DEBUG nova.network.neutron [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:55 compute-1 nova_compute[182713]: 2026-01-21 23:45:55.918 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Releasing lock "refresh_cache-b9bdcec2-f5a7-4696-ad27-2f26dcf41843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:45:55 compute-1 nova_compute[182713]: 2026-01-21 23:45:55.919 182717 DEBUG nova.compute.manager [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:45:55 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 21 23:45:55 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000e.scope: Consumed 4.123s CPU time.
Jan 21 23:45:55 compute-1 systemd-machined[153970]: Machine qemu-8-instance-0000000e terminated.
Jan 21 23:45:56 compute-1 nova_compute[182713]: 2026-01-21 23:45:56.179 182717 INFO nova.virt.libvirt.driver [-] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Instance destroyed successfully.
Jan 21 23:45:56 compute-1 nova_compute[182713]: 2026-01-21 23:45:56.180 182717 DEBUG nova.objects.instance [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lazy-loading 'resources' on Instance uuid b9bdcec2-f5a7-4696-ad27-2f26dcf41843 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:56 compute-1 nova_compute[182713]: 2026-01-21 23:45:56.197 182717 INFO nova.virt.libvirt.driver [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Deleting instance files /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843_del
Jan 21 23:45:56 compute-1 nova_compute[182713]: 2026-01-21 23:45:56.199 182717 INFO nova.virt.libvirt.driver [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Deletion of /var/lib/nova/instances/b9bdcec2-f5a7-4696-ad27-2f26dcf41843_del complete
Jan 21 23:45:56 compute-1 nova_compute[182713]: 2026-01-21 23:45:56.330 182717 INFO nova.compute.manager [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 21 23:45:56 compute-1 nova_compute[182713]: 2026-01-21 23:45:56.331 182717 DEBUG oslo.service.loopingcall [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:45:56 compute-1 nova_compute[182713]: 2026-01-21 23:45:56.331 182717 DEBUG nova.compute.manager [-] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:45:56 compute-1 nova_compute[182713]: 2026-01-21 23:45:56.331 182717 DEBUG nova.network.neutron [-] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:45:56 compute-1 nova_compute[182713]: 2026-01-21 23:45:56.452 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:56 compute-1 nova_compute[182713]: 2026-01-21 23:45:56.968 182717 DEBUG nova.network.neutron [-] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:45:56 compute-1 nova_compute[182713]: 2026-01-21 23:45:56.987 182717 DEBUG nova.network.neutron [-] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:57 compute-1 nova_compute[182713]: 2026-01-21 23:45:57.009 182717 INFO nova.compute.manager [-] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Took 0.68 seconds to deallocate network for instance.
Jan 21 23:45:57 compute-1 nova_compute[182713]: 2026-01-21 23:45:57.157 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:57 compute-1 nova_compute[182713]: 2026-01-21 23:45:57.158 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:57 compute-1 nova_compute[182713]: 2026-01-21 23:45:57.226 182717 DEBUG nova.compute.provider_tree [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:45:57 compute-1 nova_compute[182713]: 2026-01-21 23:45:57.248 182717 DEBUG nova.scheduler.client.report [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:45:57 compute-1 nova_compute[182713]: 2026-01-21 23:45:57.267 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:57 compute-1 nova_compute[182713]: 2026-01-21 23:45:57.316 182717 INFO nova.scheduler.client.report [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Deleted allocations for instance b9bdcec2-f5a7-4696-ad27-2f26dcf41843
Jan 21 23:45:57 compute-1 nova_compute[182713]: 2026-01-21 23:45:57.415 182717 DEBUG oslo_concurrency.lockutils [None req-c9f409e8-1916-4286-8568-7364204ce138 fff24668a81c4abaabc1ef80c01dd16a f501f52c8dad4a8491327ce3f5b6b271 - - default default] Lock "b9bdcec2-f5a7-4696-ad27-2f26dcf41843" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:59 compute-1 nova_compute[182713]: 2026-01-21 23:45:59.968 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:01 compute-1 nova_compute[182713]: 2026-01-21 23:46:01.454 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:02 compute-1 podman[212815]: 2026-01-21 23:46:02.623245617 +0000 UTC m=+0.110246969 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 21 23:46:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:02.993 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:02.994 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:02.994 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:04 compute-1 nova_compute[182713]: 2026-01-21 23:46:04.971 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:06 compute-1 nova_compute[182713]: 2026-01-21 23:46:06.457 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:06 compute-1 podman[212833]: 2026-01-21 23:46:06.609428233 +0000 UTC m=+0.093458353 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Jan 21 23:46:10 compute-1 nova_compute[182713]: 2026-01-21 23:46:10.016 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:11 compute-1 nova_compute[182713]: 2026-01-21 23:46:11.177 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039156.1762295, b9bdcec2-f5a7-4696-ad27-2f26dcf41843 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:11 compute-1 nova_compute[182713]: 2026-01-21 23:46:11.178 182717 INFO nova.compute.manager [-] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] VM Stopped (Lifecycle Event)
Jan 21 23:46:11 compute-1 nova_compute[182713]: 2026-01-21 23:46:11.219 182717 DEBUG nova.compute.manager [None req-3864483a-73af-4207-84bf-0d5fa304a489 - - - - - -] [instance: b9bdcec2-f5a7-4696-ad27-2f26dcf41843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:11 compute-1 nova_compute[182713]: 2026-01-21 23:46:11.460 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:13 compute-1 podman[212856]: 2026-01-21 23:46:13.592033846 +0000 UTC m=+0.073912572 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:46:13 compute-1 podman[212855]: 2026-01-21 23:46:13.635156306 +0000 UTC m=+0.122261476 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:46:15 compute-1 nova_compute[182713]: 2026-01-21 23:46:15.017 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:16 compute-1 nova_compute[182713]: 2026-01-21 23:46:16.463 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:16 compute-1 nova_compute[182713]: 2026-01-21 23:46:16.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:16 compute-1 nova_compute[182713]: 2026-01-21 23:46:16.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:46:17 compute-1 nova_compute[182713]: 2026-01-21 23:46:17.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:17 compute-1 nova_compute[182713]: 2026-01-21 23:46:17.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:18 compute-1 nova_compute[182713]: 2026-01-21 23:46:18.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:18 compute-1 nova_compute[182713]: 2026-01-21 23:46:18.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:18 compute-1 nova_compute[182713]: 2026-01-21 23:46:18.896 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:18 compute-1 nova_compute[182713]: 2026-01-21 23:46:18.897 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:18 compute-1 nova_compute[182713]: 2026-01-21 23:46:18.897 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:18 compute-1 nova_compute[182713]: 2026-01-21 23:46:18.898 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:46:19 compute-1 nova_compute[182713]: 2026-01-21 23:46:19.156 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:46:19 compute-1 nova_compute[182713]: 2026-01-21 23:46:19.159 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5745MB free_disk=73.38105392456055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:46:19 compute-1 nova_compute[182713]: 2026-01-21 23:46:19.159 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:19 compute-1 nova_compute[182713]: 2026-01-21 23:46:19.160 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:19 compute-1 nova_compute[182713]: 2026-01-21 23:46:19.292 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:46:19 compute-1 nova_compute[182713]: 2026-01-21 23:46:19.293 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:46:19 compute-1 nova_compute[182713]: 2026-01-21 23:46:19.335 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:46:19 compute-1 nova_compute[182713]: 2026-01-21 23:46:19.391 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:46:19 compute-1 nova_compute[182713]: 2026-01-21 23:46:19.450 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:46:19 compute-1 nova_compute[182713]: 2026-01-21 23:46:19.451 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:20 compute-1 nova_compute[182713]: 2026-01-21 23:46:20.019 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:20 compute-1 nova_compute[182713]: 2026-01-21 23:46:20.451 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:20 compute-1 nova_compute[182713]: 2026-01-21 23:46:20.452 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:46:20 compute-1 nova_compute[182713]: 2026-01-21 23:46:20.452 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:46:20 compute-1 nova_compute[182713]: 2026-01-21 23:46:20.477 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:46:20 compute-1 nova_compute[182713]: 2026-01-21 23:46:20.478 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:20 compute-1 nova_compute[182713]: 2026-01-21 23:46:20.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:21 compute-1 nova_compute[182713]: 2026-01-21 23:46:21.467 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:21 compute-1 nova_compute[182713]: 2026-01-21 23:46:21.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:22 compute-1 podman[212906]: 2026-01-21 23:46:22.57497368 +0000 UTC m=+0.064529289 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 23:46:22 compute-1 podman[212907]: 2026-01-21 23:46:22.599661623 +0000 UTC m=+0.074981617 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:46:22.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:25 compute-1 nova_compute[182713]: 2026-01-21 23:46:25.020 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:26 compute-1 nova_compute[182713]: 2026-01-21 23:46:26.469 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:30 compute-1 nova_compute[182713]: 2026-01-21 23:46:30.023 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:31 compute-1 nova_compute[182713]: 2026-01-21 23:46:31.473 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:32 compute-1 nova_compute[182713]: 2026-01-21 23:46:32.933 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:32 compute-1 nova_compute[182713]: 2026-01-21 23:46:32.933 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:32 compute-1 nova_compute[182713]: 2026-01-21 23:46:32.957 182717 DEBUG nova.compute.manager [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.078 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.079 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.087 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.088 182717 INFO nova.compute.claims [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.484 182717 DEBUG nova.compute.provider_tree [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.510 182717 DEBUG nova.scheduler.client.report [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.549 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.550 182717 DEBUG nova.compute.manager [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:46:33 compute-1 podman[212947]: 2026-01-21 23:46:33.596972305 +0000 UTC m=+0.087791287 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.617 182717 DEBUG nova.compute.manager [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.617 182717 DEBUG nova.network.neutron [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.638 182717 INFO nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.662 182717 DEBUG nova.compute.manager [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.776 182717 DEBUG nova.compute.manager [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.779 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.779 182717 INFO nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Creating image(s)
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.780 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "/var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.781 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "/var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.782 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "/var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.804 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.880 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.882 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.883 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.900 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.981 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:33 compute-1 nova_compute[182713]: 2026-01-21 23:46:33.983 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.024 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.025 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.026 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.114 182717 DEBUG nova.policy [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.126 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.127 182717 DEBUG nova.virt.disk.api [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Checking if we can resize image /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.127 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.220 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.221 182717 DEBUG nova.virt.disk.api [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Cannot resize image /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.222 182717 DEBUG nova.objects.instance [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lazy-loading 'migration_context' on Instance uuid b1080912-4a1f-4504-ae59-a0ad89963886 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.245 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.246 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Ensure instance console log exists: /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.246 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.247 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:34 compute-1 nova_compute[182713]: 2026-01-21 23:46:34.247 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:35 compute-1 nova_compute[182713]: 2026-01-21 23:46:35.025 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:35.223 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:46:35 compute-1 nova_compute[182713]: 2026-01-21 23:46:35.224 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:35.225 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:46:35 compute-1 nova_compute[182713]: 2026-01-21 23:46:35.730 182717 DEBUG nova.network.neutron [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Successfully updated port: c16d8d18-6610-45c3-8172-54b8b99474ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:46:35 compute-1 nova_compute[182713]: 2026-01-21 23:46:35.771 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:46:35 compute-1 nova_compute[182713]: 2026-01-21 23:46:35.772 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquired lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:46:35 compute-1 nova_compute[182713]: 2026-01-21 23:46:35.772 182717 DEBUG nova.network.neutron [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:46:35 compute-1 nova_compute[182713]: 2026-01-21 23:46:35.872 182717 DEBUG nova.compute.manager [req-c90c7529-e3bc-46f0-8ad0-459eae0c3824 req-484f5f75-b598-402b-b58a-34806d90bbfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-changed-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:35 compute-1 nova_compute[182713]: 2026-01-21 23:46:35.872 182717 DEBUG nova.compute.manager [req-c90c7529-e3bc-46f0-8ad0-459eae0c3824 req-484f5f75-b598-402b-b58a-34806d90bbfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Refreshing instance network info cache due to event network-changed-c16d8d18-6610-45c3-8172-54b8b99474ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:46:35 compute-1 nova_compute[182713]: 2026-01-21 23:46:35.873 182717 DEBUG oslo_concurrency.lockutils [req-c90c7529-e3bc-46f0-8ad0-459eae0c3824 req-484f5f75-b598-402b-b58a-34806d90bbfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:46:36 compute-1 nova_compute[182713]: 2026-01-21 23:46:36.108 182717 DEBUG nova.network.neutron [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:46:36 compute-1 nova_compute[182713]: 2026-01-21 23:46:36.476 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.440 182717 DEBUG nova.network.neutron [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Updating instance_info_cache with network_info: [{"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.463 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Releasing lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.463 182717 DEBUG nova.compute.manager [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Instance network_info: |[{"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.464 182717 DEBUG oslo_concurrency.lockutils [req-c90c7529-e3bc-46f0-8ad0-459eae0c3824 req-484f5f75-b598-402b-b58a-34806d90bbfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.465 182717 DEBUG nova.network.neutron [req-c90c7529-e3bc-46f0-8ad0-459eae0c3824 req-484f5f75-b598-402b-b58a-34806d90bbfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Refreshing network info cache for port c16d8d18-6610-45c3-8172-54b8b99474ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.470 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Start _get_guest_xml network_info=[{"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.475 182717 WARNING nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.481 182717 DEBUG nova.virt.libvirt.host [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.482 182717 DEBUG nova.virt.libvirt.host [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.486 182717 DEBUG nova.virt.libvirt.host [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.487 182717 DEBUG nova.virt.libvirt.host [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.489 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.490 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.490 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.491 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.491 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.492 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.492 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.492 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.493 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.493 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.494 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.494 182717 DEBUG nova.virt.hardware [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.501 182717 DEBUG nova.virt.libvirt.vif [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:46:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1283276848',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1283276848',id=17,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1298204af0f241dc8b63851b2046cf5c',ramdisk_id='',reservation_id='r-b25ryamg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1063342224',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-106334222
4-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:46:33Z,user_data=None,user_id='553fdc065acf4000a185abac43878ab4',uuid=b1080912-4a1f-4504-ae59-a0ad89963886,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.501 182717 DEBUG nova.network.os_vif_util [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Converting VIF {"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.503 182717 DEBUG nova.network.os_vif_util [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.504 182717 DEBUG nova.objects.instance [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lazy-loading 'pci_devices' on Instance uuid b1080912-4a1f-4504-ae59-a0ad89963886 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.523 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:46:37 compute-1 nova_compute[182713]:   <uuid>b1080912-4a1f-4504-ae59-a0ad89963886</uuid>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   <name>instance-00000011</name>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1283276848</nova:name>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:46:37</nova:creationTime>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:46:37 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:46:37 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:46:37 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:46:37 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:46:37 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:46:37 compute-1 nova_compute[182713]:         <nova:user uuid="553fdc065acf4000a185abac43878ab4">tempest-LiveAutoBlockMigrationV225Test-1063342224-project-member</nova:user>
Jan 21 23:46:37 compute-1 nova_compute[182713]:         <nova:project uuid="1298204af0f241dc8b63851b2046cf5c">tempest-LiveAutoBlockMigrationV225Test-1063342224</nova:project>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:46:37 compute-1 nova_compute[182713]:         <nova:port uuid="c16d8d18-6610-45c3-8172-54b8b99474ae">
Jan 21 23:46:37 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <system>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <entry name="serial">b1080912-4a1f-4504-ae59-a0ad89963886</entry>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <entry name="uuid">b1080912-4a1f-4504-ae59-a0ad89963886</entry>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     </system>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   <os>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   </os>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   <features>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   </features>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.config"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:d8:d5:91"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <target dev="tapc16d8d18-66"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/console.log" append="off"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <video>
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     </video>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:46:37 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:46:37 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:46:37 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:46:37 compute-1 nova_compute[182713]: </domain>
Jan 21 23:46:37 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.525 182717 DEBUG nova.compute.manager [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Preparing to wait for external event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.525 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.526 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.526 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.528 182717 DEBUG nova.virt.libvirt.vif [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:46:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1283276848',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1283276848',id=17,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1298204af0f241dc8b63851b2046cf5c',ramdisk_id='',reservation_id='r-b25ryamg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1063342224',owner_user_name='tempest-LiveAutoBlockMigrationV225Test
-1063342224-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:46:33Z,user_data=None,user_id='553fdc065acf4000a185abac43878ab4',uuid=b1080912-4a1f-4504-ae59-a0ad89963886,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.528 182717 DEBUG nova.network.os_vif_util [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Converting VIF {"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.529 182717 DEBUG nova.network.os_vif_util [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.530 182717 DEBUG os_vif [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.531 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.532 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.533 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.542 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.542 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc16d8d18-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.543 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc16d8d18-66, col_values=(('external_ids', {'iface-id': 'c16d8d18-6610-45c3-8172-54b8b99474ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:d5:91', 'vm-uuid': 'b1080912-4a1f-4504-ae59-a0ad89963886'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.546 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:37 compute-1 NetworkManager[54952]: <info>  [1769039197.5485] manager: (tapc16d8d18-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.551 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.556 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.557 182717 INFO os_vif [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66')
Jan 21 23:46:37 compute-1 podman[212982]: 2026-01-21 23:46:37.583187733 +0000 UTC m=+0.079563850 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.633 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.634 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.634 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] No VIF found with MAC fa:16:3e:d8:d5:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:46:37 compute-1 nova_compute[182713]: 2026-01-21 23:46:37.635 182717 INFO nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Using config drive
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.205 182717 INFO nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Creating config drive at /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.config
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.211 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpndzpl8yo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.343 182717 DEBUG oslo_concurrency.processutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpndzpl8yo" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:38 compute-1 kernel: tapc16d8d18-66: entered promiscuous mode
Jan 21 23:46:38 compute-1 NetworkManager[54952]: <info>  [1769039198.4262] manager: (tapc16d8d18-66): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Jan 21 23:46:38 compute-1 ovn_controller[94841]: 2026-01-21T23:46:38Z|00046|binding|INFO|Claiming lport c16d8d18-6610-45c3-8172-54b8b99474ae for this chassis.
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.428 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:38 compute-1 ovn_controller[94841]: 2026-01-21T23:46:38Z|00047|binding|INFO|c16d8d18-6610-45c3-8172-54b8b99474ae: Claiming fa:16:3e:d8:d5:91 10.100.0.4
Jan 21 23:46:38 compute-1 ovn_controller[94841]: 2026-01-21T23:46:38Z|00048|binding|INFO|Claiming lport 32683c17-e027-4757-9a64-36df76fef381 for this chassis.
Jan 21 23:46:38 compute-1 ovn_controller[94841]: 2026-01-21T23:46:38Z|00049|binding|INFO|32683c17-e027-4757-9a64-36df76fef381: Claiming fa:16:3e:51:cc:06 19.80.0.41
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.431 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.434 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:38 compute-1 systemd-machined[153970]: New machine qemu-9-instance-00000011.
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.468 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:d5:91 10.100.0.4'], port_security=['fa:16:3e:d8:d5:91 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1447043042', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b1080912-4a1f-4504-ae59-a0ad89963886', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1447043042', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c50c611d-d348-436f-bd12-bc6add278699, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=c16d8d18-6610-45c3-8172-54b8b99474ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.469 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:cc:06 19.80.0.41'], port_security=['fa:16:3e:51:cc:06 19.80.0.41'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['c16d8d18-6610-45c3-8172-54b8b99474ae'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1528740689', 'neutron:cidrs': '19.80.0.41/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3af949ae-65f7-4e98-9b88-e75f765a8686', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1528740689', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=d251bc29-f047-44fe-b77c-1e7f2007e967, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=32683c17-e027-4757-9a64-36df76fef381) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.470 104184 INFO neutron.agent.ovn.metadata.agent [-] Port c16d8d18-6610-45c3-8172-54b8b99474ae in datapath b7816b8e-52c1-4d60-84f7-524ebe7dfa5c bound to our chassis
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.471 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b7816b8e-52c1-4d60-84f7-524ebe7dfa5c
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.492 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4ef51bd2-34d3-4d78-aabd-b043b472af54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.493 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb7816b8e-51 in ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.495 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb7816b8e-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.495 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[64a85fc5-049e-4547-94cb-dff2310c7405]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.496 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6daa5079-54e0-4fc2-9c17-c614b3b4dec7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.510 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[05a9a446-b461-4c07-8905-eeecdc59d8f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 systemd[1]: Started Virtual Machine qemu-9-instance-00000011.
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.523 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4a1192-ebf9-4cbf-b5a1-5a3a8a8317ae]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_controller[94841]: 2026-01-21T23:46:38Z|00050|binding|INFO|Setting lport c16d8d18-6610-45c3-8172-54b8b99474ae ovn-installed in OVS
Jan 21 23:46:38 compute-1 ovn_controller[94841]: 2026-01-21T23:46:38Z|00051|binding|INFO|Setting lport c16d8d18-6610-45c3-8172-54b8b99474ae up in Southbound
Jan 21 23:46:38 compute-1 ovn_controller[94841]: 2026-01-21T23:46:38Z|00052|binding|INFO|Setting lport 32683c17-e027-4757-9a64-36df76fef381 up in Southbound
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.527 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:38 compute-1 systemd-udevd[213026]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:46:38 compute-1 NetworkManager[54952]: <info>  [1769039198.5517] device (tapc16d8d18-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:46:38 compute-1 NetworkManager[54952]: <info>  [1769039198.5526] device (tapc16d8d18-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.561 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[951895f3-e6de-4279-b46b-11dd1c21b571]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.568 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e181d313-2c29-4b30-85e2-115da8745a6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 NetworkManager[54952]: <info>  [1769039198.5693] manager: (tapb7816b8e-50): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.597 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a81037f9-4e96-43c3-ba20-0619690214f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.600 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c409afab-58b3-4629-8e5b-e8411b999bc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 NetworkManager[54952]: <info>  [1769039198.6308] device (tapb7816b8e-50): carrier: link connected
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.635 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a4d71d0b-6c21-4096-8046-6f6aa89f621f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.655 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e36cdadc-8ebc-4652-ae84-4c2bc0fcde96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb7816b8e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:20:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377128, 'reachable_time': 31977, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213055, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.670 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a2a10222-559c-4f59-9888-a4a125cec1d5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb1:20b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377128, 'tstamp': 377128}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213056, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.685 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7e238957-0c8e-4a14-bf8a-2f9e86b07a7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb7816b8e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:20:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377128, 'reachable_time': 31977, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213057, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.716 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[02d3b131-bbae-433c-b962-61acab0c9335]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.782 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fe719e1f-ff96-46a8-9ca4-ac8d766e3912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.783 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7816b8e-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.783 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.784 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7816b8e-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.785 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:38 compute-1 kernel: tapb7816b8e-50: entered promiscuous mode
Jan 21 23:46:38 compute-1 NetworkManager[54952]: <info>  [1769039198.7867] manager: (tapb7816b8e-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.787 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.789 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb7816b8e-50, col_values=(('external_ids', {'iface-id': 'ecebff42-11cb-48b4-9c3d-966172998a49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.790 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:38 compute-1 ovn_controller[94841]: 2026-01-21T23:46:38Z|00053|binding|INFO|Releasing lport ecebff42-11cb-48b4-9c3d-966172998a49 from this chassis (sb_readonly=0)
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.791 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.792 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.793 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4f2738e0-3536-4050-a8eb-8509a78bb2e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.794 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID b7816b8e-52c1-4d60-84f7-524ebe7dfa5c
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:46:38 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:38.794 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'env', 'PROCESS_TAG=haproxy-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.802 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.832 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039198.831438, b1080912-4a1f-4504-ae59-a0ad89963886 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.832 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] VM Started (Lifecycle Event)
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.871 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.876 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039198.832751, b1080912-4a1f-4504-ae59-a0ad89963886 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.877 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] VM Paused (Lifecycle Event)
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.906 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.910 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:38 compute-1 nova_compute[182713]: 2026-01-21 23:46:38.947 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:46:39 compute-1 podman[213096]: 2026-01-21 23:46:39.205995704 +0000 UTC m=+0.068009408 container create 4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.246 182717 DEBUG nova.network.neutron [req-c90c7529-e3bc-46f0-8ad0-459eae0c3824 req-484f5f75-b598-402b-b58a-34806d90bbfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Updated VIF entry in instance network info cache for port c16d8d18-6610-45c3-8172-54b8b99474ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.247 182717 DEBUG nova.network.neutron [req-c90c7529-e3bc-46f0-8ad0-459eae0c3824 req-484f5f75-b598-402b-b58a-34806d90bbfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Updating instance_info_cache with network_info: [{"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:46:39 compute-1 systemd[1]: Started libpod-conmon-4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc.scope.
Jan 21 23:46:39 compute-1 podman[213096]: 2026-01-21 23:46:39.170304038 +0000 UTC m=+0.032317832 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.270 182717 DEBUG oslo_concurrency.lockutils [req-c90c7529-e3bc-46f0-8ad0-459eae0c3824 req-484f5f75-b598-402b-b58a-34806d90bbfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:46:39 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:46:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6290e61230434e21e718b770a20d2af36540524d64e0d0c8f75e8df9c63a59a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:46:39 compute-1 podman[213096]: 2026-01-21 23:46:39.323236802 +0000 UTC m=+0.185250556 container init 4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:46:39 compute-1 podman[213096]: 2026-01-21 23:46:39.330557721 +0000 UTC m=+0.192571425 container start 4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.339 182717 DEBUG nova.compute.manager [req-3e1fa506-bb62-442e-a108-549b9a13922b req-2f5c7b46-8b08-4997-9e78-0058b756e4aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.339 182717 DEBUG oslo_concurrency.lockutils [req-3e1fa506-bb62-442e-a108-549b9a13922b req-2f5c7b46-8b08-4997-9e78-0058b756e4aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.340 182717 DEBUG oslo_concurrency.lockutils [req-3e1fa506-bb62-442e-a108-549b9a13922b req-2f5c7b46-8b08-4997-9e78-0058b756e4aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.340 182717 DEBUG oslo_concurrency.lockutils [req-3e1fa506-bb62-442e-a108-549b9a13922b req-2f5c7b46-8b08-4997-9e78-0058b756e4aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.340 182717 DEBUG nova.compute.manager [req-3e1fa506-bb62-442e-a108-549b9a13922b req-2f5c7b46-8b08-4997-9e78-0058b756e4aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Processing event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.341 182717 DEBUG nova.compute.manager [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.348 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.349 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039199.3481326, b1080912-4a1f-4504-ae59-a0ad89963886 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.349 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] VM Resumed (Lifecycle Event)
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.355 182717 INFO nova.virt.libvirt.driver [-] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Instance spawned successfully.
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.356 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:46:39 compute-1 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213112]: [NOTICE]   (213116) : New worker (213118) forked
Jan 21 23:46:39 compute-1 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213112]: [NOTICE]   (213116) : Loading success.
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.372 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.380 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.385 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.386 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.386 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.387 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.387 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.387 182717 DEBUG nova.virt.libvirt.driver [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.425 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 32683c17-e027-4757-9a64-36df76fef381 in datapath 3af949ae-65f7-4e98-9b88-e75f765a8686 unbound from our chassis
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.427 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.427 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3af949ae-65f7-4e98-9b88-e75f765a8686
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.443 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[dde08328-f0ca-4193-9ea8-b413634287f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.445 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3af949ae-61 in ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.447 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3af949ae-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.448 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b3286772-0a7e-4778-87dc-ae66077b747b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.449 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f9dccafb-5f97-4c2f-9474-d163fe4129c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.460 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[cfeb13dc-c39f-4e21-93b2-134ac71bb18b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.478 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5afc6d-71f6-48c9-84f2-725923c6ab8d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.484 182717 INFO nova.compute.manager [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Took 5.71 seconds to spawn the instance on the hypervisor.
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.484 182717 DEBUG nova.compute.manager [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.507 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7f12b590-28d5-4ed9-a3c1-e24f9bf75028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.513 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3956d0b7-7efd-43cf-a9df-8abeae820996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 NetworkManager[54952]: <info>  [1769039199.5156] manager: (tap3af949ae-60): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.553 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc48665-c78e-4a0d-9e9a-f5a1e2e223e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.557 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0746c0f8-1509-43fe-99f6-8f89be8be43a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 NetworkManager[54952]: <info>  [1769039199.5783] device (tap3af949ae-60): carrier: link connected
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.585 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e4df5e84-2e9a-4baf-82e2-bc5746cf6fe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.606 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3cdee2b4-480c-428b-8e7c-9ef8b249bebf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3af949ae-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:66:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377223, 'reachable_time': 42118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213137, 'error': None, 'target': 'ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.614 182717 INFO nova.compute.manager [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Took 6.59 seconds to build instance.
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.625 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8a815578-1166-47fb-a706-5860ba6f8a63]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe49:668d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377223, 'tstamp': 377223}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213138, 'error': None, 'target': 'ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.631 182717 DEBUG oslo_concurrency.lockutils [None req-a757ace6-3f57-4f36-89c9-14999d5dd4ba 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.643 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fd8d3578-2a72-4484-8890-50e7af7009dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3af949ae-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:66:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377223, 'reachable_time': 42118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213139, 'error': None, 'target': 'ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.680 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[97bd1707-778e-4244-918d-eb0750fc69ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.755 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e140445c-ab77-410d-b4ce-974a37c6e237]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.757 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3af949ae-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.757 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.758 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3af949ae-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.760 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:39 compute-1 NetworkManager[54952]: <info>  [1769039199.7614] manager: (tap3af949ae-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 21 23:46:39 compute-1 kernel: tap3af949ae-60: entered promiscuous mode
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.766 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3af949ae-60, col_values=(('external_ids', {'iface-id': 'da91e802-47a1-4124-a7f9-83eac4382374'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.767 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:39 compute-1 ovn_controller[94841]: 2026-01-21T23:46:39Z|00054|binding|INFO|Releasing lport da91e802-47a1-4124-a7f9-83eac4382374 from this chassis (sb_readonly=0)
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.770 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3af949ae-65f7-4e98-9b88-e75f765a8686.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3af949ae-65f7-4e98-9b88-e75f765a8686.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.771 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[16dba2fa-f2bc-48b3-8f02-20f1c00b25d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.772 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-3af949ae-65f7-4e98-9b88-e75f765a8686
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/3af949ae-65f7-4e98-9b88-e75f765a8686.pid.haproxy
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 3af949ae-65f7-4e98-9b88-e75f765a8686
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:46:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:39.774 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686', 'env', 'PROCESS_TAG=haproxy-3af949ae-65f7-4e98-9b88-e75f765a8686', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3af949ae-65f7-4e98-9b88-e75f765a8686.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:46:39 compute-1 nova_compute[182713]: 2026-01-21 23:46:39.779 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:40 compute-1 nova_compute[182713]: 2026-01-21 23:46:40.027 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:40 compute-1 podman[213171]: 2026-01-21 23:46:40.200608517 +0000 UTC m=+0.048006043 container create 4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 23:46:40 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:40.228 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:40 compute-1 systemd[1]: Started libpod-conmon-4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56.scope.
Jan 21 23:46:40 compute-1 podman[213171]: 2026-01-21 23:46:40.174029925 +0000 UTC m=+0.021427451 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:46:40 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:46:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86c160f5142ffc103580a7eba7dcffe98b9b475a6426053b5b21dd40d5ae1b6b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:46:40 compute-1 podman[213171]: 2026-01-21 23:46:40.292189781 +0000 UTC m=+0.139587337 container init 4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 23:46:40 compute-1 podman[213171]: 2026-01-21 23:46:40.298758017 +0000 UTC m=+0.146155533 container start 4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:46:40 compute-1 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213187]: [NOTICE]   (213191) : New worker (213193) forked
Jan 21 23:46:40 compute-1 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213187]: [NOTICE]   (213191) : Loading success.
Jan 21 23:46:41 compute-1 nova_compute[182713]: 2026-01-21 23:46:41.495 182717 DEBUG nova.compute.manager [req-fdc265a6-7250-4c8d-a099-119f9249705c req-f993332a-7dca-4ec0-b3bd-97d4f26c9bab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:41 compute-1 nova_compute[182713]: 2026-01-21 23:46:41.496 182717 DEBUG oslo_concurrency.lockutils [req-fdc265a6-7250-4c8d-a099-119f9249705c req-f993332a-7dca-4ec0-b3bd-97d4f26c9bab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:41 compute-1 nova_compute[182713]: 2026-01-21 23:46:41.496 182717 DEBUG oslo_concurrency.lockutils [req-fdc265a6-7250-4c8d-a099-119f9249705c req-f993332a-7dca-4ec0-b3bd-97d4f26c9bab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:41 compute-1 nova_compute[182713]: 2026-01-21 23:46:41.497 182717 DEBUG oslo_concurrency.lockutils [req-fdc265a6-7250-4c8d-a099-119f9249705c req-f993332a-7dca-4ec0-b3bd-97d4f26c9bab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:41 compute-1 nova_compute[182713]: 2026-01-21 23:46:41.497 182717 DEBUG nova.compute.manager [req-fdc265a6-7250-4c8d-a099-119f9249705c req-f993332a-7dca-4ec0-b3bd-97d4f26c9bab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] No waiting events found dispatching network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:41 compute-1 nova_compute[182713]: 2026-01-21 23:46:41.498 182717 WARNING nova.compute.manager [req-fdc265a6-7250-4c8d-a099-119f9249705c req-f993332a-7dca-4ec0-b3bd-97d4f26c9bab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received unexpected event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae for instance with vm_state active and task_state None.
Jan 21 23:46:42 compute-1 nova_compute[182713]: 2026-01-21 23:46:42.548 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:44 compute-1 podman[213203]: 2026-01-21 23:46:44.59113596 +0000 UTC m=+0.074481300 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:46:44 compute-1 podman[213202]: 2026-01-21 23:46:44.649253859 +0000 UTC m=+0.128185642 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:46:45 compute-1 nova_compute[182713]: 2026-01-21 23:46:45.066 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:45 compute-1 nova_compute[182713]: 2026-01-21 23:46:45.266 182717 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Check if temp file /var/lib/nova/instances/tmp43n8huxx exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 23:46:45 compute-1 nova_compute[182713]: 2026-01-21 23:46:45.266 182717 DEBUG nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp43n8huxx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1080912-4a1f-4504-ae59-a0ad89963886',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 23:46:47 compute-1 nova_compute[182713]: 2026-01-21 23:46:47.581 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:47 compute-1 nova_compute[182713]: 2026-01-21 23:46:47.652 182717 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:47 compute-1 nova_compute[182713]: 2026-01-21 23:46:47.741 182717 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:47 compute-1 nova_compute[182713]: 2026-01-21 23:46:47.744 182717 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:47 compute-1 nova_compute[182713]: 2026-01-21 23:46:47.811 182717 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:47 compute-1 nova_compute[182713]: 2026-01-21 23:46:47.814 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:46:47 compute-1 nova_compute[182713]: 2026-01-21 23:46:47.814 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:46:47 compute-1 nova_compute[182713]: 2026-01-21 23:46:47.833 182717 INFO nova.compute.rpcapi [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Jan 21 23:46:47 compute-1 nova_compute[182713]: 2026-01-21 23:46:47.836 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:46:50 compute-1 nova_compute[182713]: 2026-01-21 23:46:50.102 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:50 compute-1 sshd-session[213263]: Accepted publickey for nova from 192.168.122.100 port 59192 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:46:50 compute-1 systemd-logind[796]: New session 27 of user nova.
Jan 21 23:46:50 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 23:46:50 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 23:46:50 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 23:46:50 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 21 23:46:50 compute-1 systemd[213281]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:46:50 compute-1 systemd[213281]: Queued start job for default target Main User Target.
Jan 21 23:46:50 compute-1 systemd[213281]: Created slice User Application Slice.
Jan 21 23:46:50 compute-1 systemd[213281]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:46:50 compute-1 systemd[213281]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:46:50 compute-1 systemd[213281]: Reached target Paths.
Jan 21 23:46:50 compute-1 systemd[213281]: Reached target Timers.
Jan 21 23:46:50 compute-1 systemd[213281]: Starting D-Bus User Message Bus Socket...
Jan 21 23:46:50 compute-1 systemd[213281]: Starting Create User's Volatile Files and Directories...
Jan 21 23:46:50 compute-1 systemd[213281]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:46:50 compute-1 systemd[213281]: Reached target Sockets.
Jan 21 23:46:50 compute-1 systemd[213281]: Finished Create User's Volatile Files and Directories.
Jan 21 23:46:50 compute-1 systemd[213281]: Reached target Basic System.
Jan 21 23:46:50 compute-1 systemd[213281]: Reached target Main User Target.
Jan 21 23:46:50 compute-1 systemd[213281]: Startup finished in 176ms.
Jan 21 23:46:50 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 21 23:46:50 compute-1 systemd[1]: Started Session 27 of User nova.
Jan 21 23:46:50 compute-1 sshd-session[213263]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:46:50 compute-1 sshd-session[213298]: Received disconnect from 192.168.122.100 port 59192:11: disconnected by user
Jan 21 23:46:50 compute-1 sshd-session[213298]: Disconnected from user nova 192.168.122.100 port 59192
Jan 21 23:46:50 compute-1 sshd-session[213263]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:46:50 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Jan 21 23:46:50 compute-1 systemd-logind[796]: Session 27 logged out. Waiting for processes to exit.
Jan 21 23:46:50 compute-1 systemd-logind[796]: Removed session 27.
Jan 21 23:46:52 compute-1 ovn_controller[94841]: 2026-01-21T23:46:52Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d8:d5:91 10.100.0.4
Jan 21 23:46:52 compute-1 ovn_controller[94841]: 2026-01-21T23:46:52Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:d5:91 10.100.0.4
Jan 21 23:46:52 compute-1 nova_compute[182713]: 2026-01-21 23:46:52.478 182717 DEBUG nova.compute.manager [req-86732336-d667-4f39-a1c0-73fac6721d66 req-fc5e219c-f3b1-4001-8a15-d1850c4cb8cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:52 compute-1 nova_compute[182713]: 2026-01-21 23:46:52.480 182717 DEBUG oslo_concurrency.lockutils [req-86732336-d667-4f39-a1c0-73fac6721d66 req-fc5e219c-f3b1-4001-8a15-d1850c4cb8cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:52 compute-1 nova_compute[182713]: 2026-01-21 23:46:52.480 182717 DEBUG oslo_concurrency.lockutils [req-86732336-d667-4f39-a1c0-73fac6721d66 req-fc5e219c-f3b1-4001-8a15-d1850c4cb8cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:52 compute-1 nova_compute[182713]: 2026-01-21 23:46:52.481 182717 DEBUG oslo_concurrency.lockutils [req-86732336-d667-4f39-a1c0-73fac6721d66 req-fc5e219c-f3b1-4001-8a15-d1850c4cb8cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:52 compute-1 nova_compute[182713]: 2026-01-21 23:46:52.481 182717 DEBUG nova.compute.manager [req-86732336-d667-4f39-a1c0-73fac6721d66 req-fc5e219c-f3b1-4001-8a15-d1850c4cb8cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] No waiting events found dispatching network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:52 compute-1 nova_compute[182713]: 2026-01-21 23:46:52.482 182717 DEBUG nova.compute.manager [req-86732336-d667-4f39-a1c0-73fac6721d66 req-fc5e219c-f3b1-4001-8a15-d1850c4cb8cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:46:52 compute-1 nova_compute[182713]: 2026-01-21 23:46:52.586 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:53 compute-1 podman[213300]: 2026-01-21 23:46:53.581548038 +0000 UTC m=+0.062196327 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 21 23:46:53 compute-1 podman[213301]: 2026-01-21 23:46:53.611902487 +0000 UTC m=+0.091028408 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.437 182717 INFO nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Took 6.62 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.438 182717 DEBUG nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.455 182717 DEBUG nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp43n8huxx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1080912-4a1f-4504-ae59-a0ad89963886',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(2a5bd0c2-6fc8-4a77-953d-99e26b2b89b9),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.489 182717 DEBUG nova.objects.instance [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lazy-loading 'migration_context' on Instance uuid b1080912-4a1f-4504-ae59-a0ad89963886 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.490 182717 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.491 182717 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.492 182717 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.525 182717 DEBUG nova.virt.libvirt.vif [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:46:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1283276848',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1283276848',id=17,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:46:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1298204af0f241dc8b63851b2046cf5c',ramdisk_id='',reservation_id='r-b25ryamg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1063342224',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1063342224-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:46:39Z,user_data=None,user_id='553fdc065acf4000a185abac43878ab4',uuid=b1080912-4a1f-4504-ae59-a0ad89963886,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.526 182717 DEBUG nova.network.os_vif_util [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converting VIF {"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.527 182717 DEBUG nova.network.os_vif_util [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.529 182717 DEBUG nova.virt.libvirt.migration [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 23:46:54 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:d8:d5:91"/>
Jan 21 23:46:54 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:46:54 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:46:54 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:46:54 compute-1 nova_compute[182713]:   <target dev="tapc16d8d18-66"/>
Jan 21 23:46:54 compute-1 nova_compute[182713]: </interface>
Jan 21 23:46:54 compute-1 nova_compute[182713]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.530 182717 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.677 182717 DEBUG nova.compute.manager [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.678 182717 DEBUG oslo_concurrency.lockutils [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.678 182717 DEBUG oslo_concurrency.lockutils [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.679 182717 DEBUG oslo_concurrency.lockutils [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.679 182717 DEBUG nova.compute.manager [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] No waiting events found dispatching network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.680 182717 WARNING nova.compute.manager [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received unexpected event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae for instance with vm_state active and task_state migrating.
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.680 182717 DEBUG nova.compute.manager [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-changed-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.680 182717 DEBUG nova.compute.manager [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Refreshing instance network info cache due to event network-changed-c16d8d18-6610-45c3-8172-54b8b99474ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.681 182717 DEBUG oslo_concurrency.lockutils [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.681 182717 DEBUG oslo_concurrency.lockutils [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.682 182717 DEBUG nova.network.neutron [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Refreshing network info cache for port c16d8d18-6610-45c3-8172-54b8b99474ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.995 182717 DEBUG nova.virt.libvirt.migration [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:46:54 compute-1 nova_compute[182713]: 2026-01-21 23:46:54.996 182717 INFO nova.virt.libvirt.migration [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 23:46:55 compute-1 nova_compute[182713]: 2026-01-21 23:46:55.104 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:55 compute-1 nova_compute[182713]: 2026-01-21 23:46:55.141 182717 INFO nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 23:46:55 compute-1 nova_compute[182713]: 2026-01-21 23:46:55.646 182717 DEBUG nova.virt.libvirt.migration [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:46:55 compute-1 nova_compute[182713]: 2026-01-21 23:46:55.647 182717 DEBUG nova.virt.libvirt.migration [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.152 182717 DEBUG nova.virt.libvirt.migration [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.153 182717 DEBUG nova.virt.libvirt.migration [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.414 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039216.4132106, b1080912-4a1f-4504-ae59-a0ad89963886 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.415 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] VM Paused (Lifecycle Event)
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.454 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.462 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.495 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 23:46:56 compute-1 kernel: tapc16d8d18-66 (unregistering): left promiscuous mode
Jan 21 23:46:56 compute-1 NetworkManager[54952]: <info>  [1769039216.6175] device (tapc16d8d18-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:46:56 compute-1 ovn_controller[94841]: 2026-01-21T23:46:56Z|00055|binding|INFO|Releasing lport c16d8d18-6610-45c3-8172-54b8b99474ae from this chassis (sb_readonly=0)
Jan 21 23:46:56 compute-1 ovn_controller[94841]: 2026-01-21T23:46:56Z|00056|binding|INFO|Setting lport c16d8d18-6610-45c3-8172-54b8b99474ae down in Southbound
Jan 21 23:46:56 compute-1 ovn_controller[94841]: 2026-01-21T23:46:56Z|00057|binding|INFO|Releasing lport 32683c17-e027-4757-9a64-36df76fef381 from this chassis (sb_readonly=0)
Jan 21 23:46:56 compute-1 ovn_controller[94841]: 2026-01-21T23:46:56Z|00058|binding|INFO|Setting lport 32683c17-e027-4757-9a64-36df76fef381 down in Southbound
Jan 21 23:46:56 compute-1 ovn_controller[94841]: 2026-01-21T23:46:56Z|00059|binding|INFO|Removing iface tapc16d8d18-66 ovn-installed in OVS
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.665 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:56 compute-1 ovn_controller[94841]: 2026-01-21T23:46:56Z|00060|binding|INFO|Releasing lport ecebff42-11cb-48b4-9c3d-966172998a49 from this chassis (sb_readonly=0)
Jan 21 23:46:56 compute-1 ovn_controller[94841]: 2026-01-21T23:46:56Z|00061|binding|INFO|Releasing lport da91e802-47a1-4124-a7f9-83eac4382374 from this chassis (sb_readonly=0)
Jan 21 23:46:56 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:56.677 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:d5:91 10.100.0.4'], port_security=['fa:16:3e:d8:d5:91 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '7f404a2f-20ba-4b9b-88d6-fa3588630efa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1447043042', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b1080912-4a1f-4504-ae59-a0ad89963886', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1447043042', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c50c611d-d348-436f-bd12-bc6add278699, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=c16d8d18-6610-45c3-8172-54b8b99474ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:46:56 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:56.679 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:cc:06 19.80.0.41'], port_security=['fa:16:3e:51:cc:06 19.80.0.41'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['c16d8d18-6610-45c3-8172-54b8b99474ae'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1528740689', 'neutron:cidrs': '19.80.0.41/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3af949ae-65f7-4e98-9b88-e75f765a8686', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1528740689', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=d251bc29-f047-44fe-b77c-1e7f2007e967, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=32683c17-e027-4757-9a64-36df76fef381) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:46:56 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:56.680 104184 INFO neutron.agent.ovn.metadata.agent [-] Port c16d8d18-6610-45c3-8172-54b8b99474ae in datapath b7816b8e-52c1-4d60-84f7-524ebe7dfa5c unbound from our chassis
Jan 21 23:46:56 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:56.682 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:46:56 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:56.685 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3256248c-09fd-4eb1-a207-813246b882cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:56 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:56.686 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c namespace which is not needed anymore
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.688 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.749 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:56 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000011.scope: Deactivated successfully.
Jan 21 23:46:56 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000011.scope: Consumed 13.071s CPU time.
Jan 21 23:46:56 compute-1 systemd-machined[153970]: Machine qemu-9-instance-00000011 terminated.
Jan 21 23:46:56 compute-1 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213112]: [NOTICE]   (213116) : haproxy version is 2.8.14-c23fe91
Jan 21 23:46:56 compute-1 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213112]: [NOTICE]   (213116) : path to executable is /usr/sbin/haproxy
Jan 21 23:46:56 compute-1 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213112]: [WARNING]  (213116) : Exiting Master process...
Jan 21 23:46:56 compute-1 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213112]: [ALERT]    (213116) : Current worker (213118) exited with code 143 (Terminated)
Jan 21 23:46:56 compute-1 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213112]: [WARNING]  (213116) : All workers exited. Exiting... (0)
Jan 21 23:46:56 compute-1 systemd[1]: libpod-4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc.scope: Deactivated successfully.
Jan 21 23:46:56 compute-1 podman[213370]: 2026-01-21 23:46:56.825928641 +0000 UTC m=+0.052620647 container died 4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:46:56 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc-userdata-shm.mount: Deactivated successfully.
Jan 21 23:46:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-d6290e61230434e21e718b770a20d2af36540524d64e0d0c8f75e8df9c63a59a-merged.mount: Deactivated successfully.
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.859 182717 DEBUG nova.virt.libvirt.guest [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.863 182717 INFO nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Migration operation has completed
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.864 182717 INFO nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] _post_live_migration() is started..
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.867 182717 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.867 182717 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.868 182717 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 23:46:56 compute-1 podman[213370]: 2026-01-21 23:46:56.871298761 +0000 UTC m=+0.097990767 container cleanup 4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:46:56 compute-1 systemd[1]: libpod-conmon-4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc.scope: Deactivated successfully.
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.946 182717 DEBUG nova.network.neutron [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Updated VIF entry in instance network info cache for port c16d8d18-6610-45c3-8172-54b8b99474ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.947 182717 DEBUG nova.network.neutron [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Updating instance_info_cache with network_info: [{"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:46:56 compute-1 podman[213418]: 2026-01-21 23:46:56.961014957 +0000 UTC m=+0.059263045 container remove 4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 23:46:56 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:56.967 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[69fd2d39-b6e3-4401-9fcd-d095f056f3cb]: (4, ('Wed Jan 21 11:46:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c (4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc)\n4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc\nWed Jan 21 11:46:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c (4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc)\n4fb79158bccf762200a7f67094ba5bb790a682de787843391631b51350362adc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:56 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:56.968 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[793ac072-39db-40cb-ae10-9283a803fd81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:56 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:56.970 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7816b8e-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.972 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:56 compute-1 kernel: tapb7816b8e-50: left promiscuous mode
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.975 182717 DEBUG oslo_concurrency.lockutils [req-833a78a6-da47-4c00-b418-85e8c8c17b33 req-2870ee3a-2c70-41f4-b586-a6725eebd7f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:46:56 compute-1 nova_compute[182713]: 2026-01-21 23:46:56.988 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:56 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:56.992 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4f54e85a-ed44-41f8-ac94-01e7033898db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.005 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d7042ebf-0030-4236-8a68-b6a313a50444]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.007 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3d20c281-f898-4d96-aa95-eb71cc016e86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.024 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d7c9e3bf-cbff-410c-9512-d4ca70f5d921]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377120, 'reachable_time': 34003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213437, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.029 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.029 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[60158d47-2a9d-44cf-8340-566340249758]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 systemd[1]: run-netns-ovnmeta\x2db7816b8e\x2d52c1\x2d4d60\x2d84f7\x2d524ebe7dfa5c.mount: Deactivated successfully.
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.030 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 32683c17-e027-4757-9a64-36df76fef381 in datapath 3af949ae-65f7-4e98-9b88-e75f765a8686 unbound from our chassis
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.033 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3af949ae-65f7-4e98-9b88-e75f765a8686, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.034 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[167b5e95-f4c8-4e8f-aa05-68d1c76fff6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.035 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686 namespace which is not needed anymore
Jan 21 23:46:57 compute-1 nova_compute[182713]: 2026-01-21 23:46:57.124 182717 DEBUG nova.compute.manager [req-9d179917-7b05-47a0-98ee-42ae7dac0378 req-1cac98d9-efd4-4fe6-a891-d0052471a80e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:57 compute-1 nova_compute[182713]: 2026-01-21 23:46:57.124 182717 DEBUG oslo_concurrency.lockutils [req-9d179917-7b05-47a0-98ee-42ae7dac0378 req-1cac98d9-efd4-4fe6-a891-d0052471a80e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:57 compute-1 nova_compute[182713]: 2026-01-21 23:46:57.125 182717 DEBUG oslo_concurrency.lockutils [req-9d179917-7b05-47a0-98ee-42ae7dac0378 req-1cac98d9-efd4-4fe6-a891-d0052471a80e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:57 compute-1 nova_compute[182713]: 2026-01-21 23:46:57.125 182717 DEBUG oslo_concurrency.lockutils [req-9d179917-7b05-47a0-98ee-42ae7dac0378 req-1cac98d9-efd4-4fe6-a891-d0052471a80e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:57 compute-1 nova_compute[182713]: 2026-01-21 23:46:57.125 182717 DEBUG nova.compute.manager [req-9d179917-7b05-47a0-98ee-42ae7dac0378 req-1cac98d9-efd4-4fe6-a891-d0052471a80e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] No waiting events found dispatching network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:57 compute-1 nova_compute[182713]: 2026-01-21 23:46:57.126 182717 DEBUG nova.compute.manager [req-9d179917-7b05-47a0-98ee-42ae7dac0378 req-1cac98d9-efd4-4fe6-a891-d0052471a80e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:46:57 compute-1 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213187]: [NOTICE]   (213191) : haproxy version is 2.8.14-c23fe91
Jan 21 23:46:57 compute-1 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213187]: [NOTICE]   (213191) : path to executable is /usr/sbin/haproxy
Jan 21 23:46:57 compute-1 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213187]: [WARNING]  (213191) : Exiting Master process...
Jan 21 23:46:57 compute-1 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213187]: [ALERT]    (213191) : Current worker (213193) exited with code 143 (Terminated)
Jan 21 23:46:57 compute-1 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213187]: [WARNING]  (213191) : All workers exited. Exiting... (0)
Jan 21 23:46:57 compute-1 systemd[1]: libpod-4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56.scope: Deactivated successfully.
Jan 21 23:46:57 compute-1 podman[213455]: 2026-01-21 23:46:57.215324281 +0000 UTC m=+0.048422996 container died 4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:46:57 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56-userdata-shm.mount: Deactivated successfully.
Jan 21 23:46:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-86c160f5142ffc103580a7eba7dcffe98b9b475a6426053b5b21dd40d5ae1b6b-merged.mount: Deactivated successfully.
Jan 21 23:46:57 compute-1 podman[213455]: 2026-01-21 23:46:57.248254911 +0000 UTC m=+0.081353626 container cleanup 4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:46:57 compute-1 systemd[1]: libpod-conmon-4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56.scope: Deactivated successfully.
Jan 21 23:46:57 compute-1 podman[213484]: 2026-01-21 23:46:57.345945827 +0000 UTC m=+0.061164464 container remove 4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.355 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[88d5a09d-ceb4-4ca7-b3a3-fe837d9dd2f5]: (4, ('Wed Jan 21 11:46:57 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686 (4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56)\n4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56\nWed Jan 21 11:46:57 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686 (4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56)\n4aff6ef71bcdcdf74636b4972ad0700d0d616cd5551d5f4e5e76d87d372dcb56\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.357 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7cee74ac-f3bd-4aad-9f18-48eeafdf0576]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.358 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3af949ae-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:57 compute-1 nova_compute[182713]: 2026-01-21 23:46:57.360 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:57 compute-1 kernel: tap3af949ae-60: left promiscuous mode
Jan 21 23:46:57 compute-1 nova_compute[182713]: 2026-01-21 23:46:57.386 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:57 compute-1 nova_compute[182713]: 2026-01-21 23:46:57.388 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.390 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[643879fe-0e92-433b-bfc0-561335684966]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.415 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[76c288b4-82b1-4859-9bd6-607d6303ffe4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.416 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a8058608-c660-4038-a225-4fd5a44cf062]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.442 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bd00b756-6683-4ce0-addf-301c834acd01]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377215, 'reachable_time': 20605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213503, 'error': None, 'target': 'ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.444 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:46:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:46:57.445 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[baa4b3f2-a442-42af-9cfd-7ec618587262]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:57 compute-1 nova_compute[182713]: 2026-01-21 23:46:57.589 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:57 compute-1 systemd[1]: run-netns-ovnmeta\x2d3af949ae\x2d65f7\x2d4e98\x2d9b88\x2de75f765a8686.mount: Deactivated successfully.
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.240 182717 DEBUG nova.compute.manager [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.240 182717 DEBUG oslo_concurrency.lockutils [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.241 182717 DEBUG oslo_concurrency.lockutils [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.242 182717 DEBUG oslo_concurrency.lockutils [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.242 182717 DEBUG nova.compute.manager [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] No waiting events found dispatching network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.242 182717 WARNING nova.compute.manager [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received unexpected event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae for instance with vm_state active and task_state migrating.
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.243 182717 DEBUG nova.compute.manager [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.243 182717 DEBUG oslo_concurrency.lockutils [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.243 182717 DEBUG oslo_concurrency.lockutils [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.244 182717 DEBUG oslo_concurrency.lockutils [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.244 182717 DEBUG nova.compute.manager [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] No waiting events found dispatching network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:59 compute-1 nova_compute[182713]: 2026-01-21 23:46:59.244 182717 DEBUG nova.compute.manager [req-1e13f0dc-052f-4ab1-85b5-5115750148c3 req-9d6ee63e-63f4-4d75-b379-73cb80fd4e0d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:47:00 compute-1 nova_compute[182713]: 2026-01-21 23:47:00.163 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:00 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 23:47:00 compute-1 systemd[213281]: Activating special unit Exit the Session...
Jan 21 23:47:00 compute-1 systemd[213281]: Stopped target Main User Target.
Jan 21 23:47:00 compute-1 systemd[213281]: Stopped target Basic System.
Jan 21 23:47:00 compute-1 systemd[213281]: Stopped target Paths.
Jan 21 23:47:00 compute-1 systemd[213281]: Stopped target Sockets.
Jan 21 23:47:00 compute-1 systemd[213281]: Stopped target Timers.
Jan 21 23:47:00 compute-1 systemd[213281]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:47:00 compute-1 systemd[213281]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:47:00 compute-1 systemd[213281]: Closed D-Bus User Message Bus Socket.
Jan 21 23:47:00 compute-1 systemd[213281]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:47:00 compute-1 systemd[213281]: Removed slice User Application Slice.
Jan 21 23:47:00 compute-1 systemd[213281]: Reached target Shutdown.
Jan 21 23:47:00 compute-1 systemd[213281]: Finished Exit the Session.
Jan 21 23:47:00 compute-1 systemd[213281]: Reached target Exit the Session.
Jan 21 23:47:00 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 23:47:00 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 23:47:01 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 23:47:01 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 23:47:01 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 23:47:01 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 23:47:01 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 23:47:01 compute-1 nova_compute[182713]: 2026-01-21 23:47:01.373 182717 DEBUG nova.compute.manager [req-1b5cfcbc-5ec7-4c79-8fc9-7f1caf27fb95 req-f4b2c74d-e3bc-4702-8191-d8a040b553af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:01 compute-1 nova_compute[182713]: 2026-01-21 23:47:01.374 182717 DEBUG oslo_concurrency.lockutils [req-1b5cfcbc-5ec7-4c79-8fc9-7f1caf27fb95 req-f4b2c74d-e3bc-4702-8191-d8a040b553af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:01 compute-1 nova_compute[182713]: 2026-01-21 23:47:01.374 182717 DEBUG oslo_concurrency.lockutils [req-1b5cfcbc-5ec7-4c79-8fc9-7f1caf27fb95 req-f4b2c74d-e3bc-4702-8191-d8a040b553af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:01 compute-1 nova_compute[182713]: 2026-01-21 23:47:01.374 182717 DEBUG oslo_concurrency.lockutils [req-1b5cfcbc-5ec7-4c79-8fc9-7f1caf27fb95 req-f4b2c74d-e3bc-4702-8191-d8a040b553af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:01 compute-1 nova_compute[182713]: 2026-01-21 23:47:01.375 182717 DEBUG nova.compute.manager [req-1b5cfcbc-5ec7-4c79-8fc9-7f1caf27fb95 req-f4b2c74d-e3bc-4702-8191-d8a040b553af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] No waiting events found dispatching network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:01 compute-1 nova_compute[182713]: 2026-01-21 23:47:01.375 182717 WARNING nova.compute.manager [req-1b5cfcbc-5ec7-4c79-8fc9-7f1caf27fb95 req-f4b2c74d-e3bc-4702-8191-d8a040b553af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received unexpected event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae for instance with vm_state active and task_state migrating.
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.128 182717 DEBUG nova.network.neutron [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Activated binding for port c16d8d18-6610-45c3-8172-54b8b99474ae and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.129 182717 DEBUG nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.131 182717 DEBUG nova.virt.libvirt.vif [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:46:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1283276848',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1283276848',id=17,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:46:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1298204af0f241dc8b63851b2046cf5c',ramdisk_id='',reservation_id='r-b25ryamg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1063342224',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1063342224-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:46:44Z,user_data=None,user_id='553fdc065acf4000a185abac43878ab4',uuid=b1080912-4a1f-4504-ae59-a0ad89963886,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.132 182717 DEBUG nova.network.os_vif_util [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converting VIF {"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.133 182717 DEBUG nova.network.os_vif_util [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.134 182717 DEBUG os_vif [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.141 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.142 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc16d8d18-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.144 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.146 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.155 182717 INFO os_vif [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66')
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.156 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.156 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.157 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.158 182717 DEBUG nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.159 182717 INFO nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Deleting instance files /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886_del
Jan 21 23:47:02 compute-1 nova_compute[182713]: 2026-01-21 23:47:02.161 182717 INFO nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Deletion of /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886_del complete
Jan 21 23:47:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:02.994 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:02.995 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:02.995 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.516 182717 DEBUG nova.compute.manager [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.516 182717 DEBUG oslo_concurrency.lockutils [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.517 182717 DEBUG oslo_concurrency.lockutils [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.517 182717 DEBUG oslo_concurrency.lockutils [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.517 182717 DEBUG nova.compute.manager [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] No waiting events found dispatching network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.518 182717 WARNING nova.compute.manager [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received unexpected event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae for instance with vm_state active and task_state migrating.
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.518 182717 DEBUG nova.compute.manager [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.518 182717 DEBUG oslo_concurrency.lockutils [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.519 182717 DEBUG oslo_concurrency.lockutils [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.519 182717 DEBUG oslo_concurrency.lockutils [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.519 182717 DEBUG nova.compute.manager [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] No waiting events found dispatching network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:03 compute-1 nova_compute[182713]: 2026-01-21 23:47:03.520 182717 WARNING nova.compute.manager [req-2fd36820-5dc9-4c7d-95b5-7ab06da62f7a req-40e7e57c-a984-4b12-a244-730fb94e5caa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received unexpected event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae for instance with vm_state active and task_state migrating.
Jan 21 23:47:04 compute-1 podman[213505]: 2026-01-21 23:47:04.638951179 +0000 UTC m=+0.120058986 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:47:05 compute-1 nova_compute[182713]: 2026-01-21 23:47:05.166 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:07 compute-1 nova_compute[182713]: 2026-01-21 23:47:07.170 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:08 compute-1 podman[213525]: 2026-01-21 23:47:08.616408393 +0000 UTC m=+0.096946683 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 21 23:47:08 compute-1 nova_compute[182713]: 2026-01-21 23:47:08.746 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:08 compute-1 nova_compute[182713]: 2026-01-21 23:47:08.746 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:08 compute-1 nova_compute[182713]: 2026-01-21 23:47:08.779 182717 DEBUG nova.compute.manager [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:47:08 compute-1 nova_compute[182713]: 2026-01-21 23:47:08.918 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:08 compute-1 nova_compute[182713]: 2026-01-21 23:47:08.918 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:08 compute-1 nova_compute[182713]: 2026-01-21 23:47:08.928 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:47:08 compute-1 nova_compute[182713]: 2026-01-21 23:47:08.928 182717 INFO nova.compute.claims [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.121 182717 DEBUG nova.compute.provider_tree [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.141 182717 DEBUG nova.scheduler.client.report [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.171 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.172 182717 DEBUG nova.compute.manager [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.234 182717 DEBUG nova.compute.manager [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.235 182717 DEBUG nova.network.neutron [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.277 182717 INFO nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.312 182717 DEBUG nova.compute.manager [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.624 182717 DEBUG nova.policy [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.738 182717 DEBUG nova.compute.manager [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.740 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.740 182717 INFO nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Creating image(s)
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.742 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.742 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.744 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.770 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.798 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.799 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.799 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.836 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.837 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.837 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.838 182717 DEBUG nova.compute.resource_tracker [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.871 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.872 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.874 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.896 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.983 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:09 compute-1 nova_compute[182713]: 2026-01-21 23:47:09.986 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.027 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.029 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.029 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.114 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.115 182717 DEBUG nova.virt.disk.api [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Checking if we can resize image /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.116 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.167 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.203 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.204 182717 DEBUG nova.virt.disk.api [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Cannot resize image /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.205 182717 DEBUG nova.objects.instance [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lazy-loading 'migration_context' on Instance uuid 5bdecf5d-9113-4584-ac23-44d59770eade obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.239 182717 WARNING nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.240 182717 DEBUG nova.compute.resource_tracker [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5676MB free_disk=73.3809585571289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.241 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.241 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.367 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.369 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Ensure instance console log exists: /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.370 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.370 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.371 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.546 182717 DEBUG nova.compute.resource_tracker [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Migration for instance b1080912-4a1f-4504-ae59-a0ad89963886 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.613 182717 DEBUG nova.compute.resource_tracker [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.642 182717 DEBUG nova.compute.resource_tracker [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Migration 2a5bd0c2-6fc8-4a77-953d-99e26b2b89b9 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.642 182717 DEBUG nova.compute.resource_tracker [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Instance 5bdecf5d-9113-4584-ac23-44d59770eade actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.643 182717 DEBUG nova.compute.resource_tracker [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.644 182717 DEBUG nova.compute.resource_tracker [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.782 182717 DEBUG nova.compute.provider_tree [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.810 182717 DEBUG nova.scheduler.client.report [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.849 182717 DEBUG nova.compute.resource_tracker [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.850 182717 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.877 182717 INFO nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.883 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 23:47:10 compute-1 nova_compute[182713]: 2026-01-21 23:47:10.884 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:11 compute-1 nova_compute[182713]: 2026-01-21 23:47:11.015 182717 INFO nova.scheduler.client.report [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Deleted allocation for migration 2a5bd0c2-6fc8-4a77-953d-99e26b2b89b9
Jan 21 23:47:11 compute-1 nova_compute[182713]: 2026-01-21 23:47:11.016 182717 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 23:47:11 compute-1 nova_compute[182713]: 2026-01-21 23:47:11.153 182717 DEBUG nova.network.neutron [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Successfully created port: df9aa099-aa41-4111-b46c-c8a593762a53 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:47:11 compute-1 nova_compute[182713]: 2026-01-21 23:47:11.859 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039216.8579981, b1080912-4a1f-4504-ae59-a0ad89963886 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:11 compute-1 nova_compute[182713]: 2026-01-21 23:47:11.860 182717 INFO nova.compute.manager [-] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] VM Stopped (Lifecycle Event)
Jan 21 23:47:11 compute-1 nova_compute[182713]: 2026-01-21 23:47:11.891 182717 DEBUG nova.compute.manager [None req-e6a2fb27-d4d6-4165-85ca-67415782df51 - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:12 compute-1 nova_compute[182713]: 2026-01-21 23:47:12.173 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:13 compute-1 nova_compute[182713]: 2026-01-21 23:47:13.364 182717 DEBUG nova.network.neutron [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Successfully updated port: df9aa099-aa41-4111-b46c-c8a593762a53 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:47:13 compute-1 nova_compute[182713]: 2026-01-21 23:47:13.383 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:13 compute-1 nova_compute[182713]: 2026-01-21 23:47:13.384 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquired lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:13 compute-1 nova_compute[182713]: 2026-01-21 23:47:13.385 182717 DEBUG nova.network.neutron [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:47:13 compute-1 nova_compute[182713]: 2026-01-21 23:47:13.477 182717 DEBUG nova.compute.manager [req-f81e91ee-e542-443f-9017-9b40a9edf139 req-1e4c54b3-e12f-40bf-8e54-430920da984d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-changed-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:13 compute-1 nova_compute[182713]: 2026-01-21 23:47:13.477 182717 DEBUG nova.compute.manager [req-f81e91ee-e542-443f-9017-9b40a9edf139 req-1e4c54b3-e12f-40bf-8e54-430920da984d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Refreshing instance network info cache due to event network-changed-df9aa099-aa41-4111-b46c-c8a593762a53. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:47:13 compute-1 nova_compute[182713]: 2026-01-21 23:47:13.477 182717 DEBUG oslo_concurrency.lockutils [req-f81e91ee-e542-443f-9017-9b40a9edf139 req-1e4c54b3-e12f-40bf-8e54-430920da984d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:13 compute-1 nova_compute[182713]: 2026-01-21 23:47:13.570 182717 DEBUG nova.network.neutron [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.169 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.373 182717 DEBUG nova.network.neutron [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating instance_info_cache with network_info: [{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.396 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Releasing lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.397 182717 DEBUG nova.compute.manager [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Instance network_info: |[{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.398 182717 DEBUG oslo_concurrency.lockutils [req-f81e91ee-e542-443f-9017-9b40a9edf139 req-1e4c54b3-e12f-40bf-8e54-430920da984d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.398 182717 DEBUG nova.network.neutron [req-f81e91ee-e542-443f-9017-9b40a9edf139 req-1e4c54b3-e12f-40bf-8e54-430920da984d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Refreshing network info cache for port df9aa099-aa41-4111-b46c-c8a593762a53 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.404 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Start _get_guest_xml network_info=[{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.412 182717 WARNING nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.425 182717 DEBUG nova.virt.libvirt.host [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.426 182717 DEBUG nova.virt.libvirt.host [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.431 182717 DEBUG nova.virt.libvirt.host [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.432 182717 DEBUG nova.virt.libvirt.host [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.435 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.436 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.437 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.437 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.438 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.438 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.439 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.439 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.440 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.440 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.441 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.441 182717 DEBUG nova.virt.hardware [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.450 182717 DEBUG nova.virt.libvirt.vif [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-821021372',display_name='tempest-LiveMigrationTest-server-821021372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-821021372',id=21,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-xa5gd7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member
'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:09Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=5bdecf5d-9113-4584-ac23-44d59770eade,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.450 182717 DEBUG nova.network.os_vif_util [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converting VIF {"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.452 182717 DEBUG nova.network.os_vif_util [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.455 182717 DEBUG nova.objects.instance [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5bdecf5d-9113-4584-ac23-44d59770eade obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.478 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:47:15 compute-1 nova_compute[182713]:   <uuid>5bdecf5d-9113-4584-ac23-44d59770eade</uuid>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   <name>instance-00000015</name>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <nova:name>tempest-LiveMigrationTest-server-821021372</nova:name>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:47:15</nova:creationTime>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:47:15 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:47:15 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:47:15 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:47:15 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:47:15 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:47:15 compute-1 nova_compute[182713]:         <nova:user uuid="d4ff24d8abf8416db9d64c645436c5f1">tempest-LiveMigrationTest-430976321-project-member</nova:user>
Jan 21 23:47:15 compute-1 nova_compute[182713]:         <nova:project uuid="cdcb2f57183e484cace5d5f78dd635a1">tempest-LiveMigrationTest-430976321</nova:project>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:47:15 compute-1 nova_compute[182713]:         <nova:port uuid="df9aa099-aa41-4111-b46c-c8a593762a53">
Jan 21 23:47:15 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <system>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <entry name="serial">5bdecf5d-9113-4584-ac23-44d59770eade</entry>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <entry name="uuid">5bdecf5d-9113-4584-ac23-44d59770eade</entry>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     </system>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   <os>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   </os>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   <features>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   </features>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:8f:4f:85"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <target dev="tapdf9aa099-aa"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/console.log" append="off"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <video>
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     </video>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:47:15 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:47:15 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:47:15 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:47:15 compute-1 nova_compute[182713]: </domain>
Jan 21 23:47:15 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.480 182717 DEBUG nova.compute.manager [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Preparing to wait for external event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.481 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.481 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.482 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.482 182717 DEBUG nova.virt.libvirt.vif [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-821021372',display_name='tempest-LiveMigrationTest-server-821021372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-821021372',id=21,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-xa5gd7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:09Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=5bdecf5d-9113-4584-ac23-44d59770eade,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.483 182717 DEBUG nova.network.os_vif_util [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converting VIF {"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.483 182717 DEBUG nova.network.os_vif_util [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.484 182717 DEBUG os_vif [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.485 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.486 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.487 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.491 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.491 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf9aa099-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.492 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf9aa099-aa, col_values=(('external_ids', {'iface-id': 'df9aa099-aa41-4111-b46c-c8a593762a53', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:4f:85', 'vm-uuid': '5bdecf5d-9113-4584-ac23-44d59770eade'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.495 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:15 compute-1 NetworkManager[54952]: <info>  [1769039235.4965] manager: (tapdf9aa099-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.498 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.504 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.505 182717 INFO os_vif [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa')
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.566 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.566 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.567 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] No VIF found with MAC fa:16:3e:8f:4f:85, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:47:15 compute-1 nova_compute[182713]: 2026-01-21 23:47:15.568 182717 INFO nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Using config drive
Jan 21 23:47:15 compute-1 podman[213562]: 2026-01-21 23:47:15.615571277 +0000 UTC m=+0.104238292 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 23:47:15 compute-1 podman[213564]: 2026-01-21 23:47:15.616011109 +0000 UTC m=+0.097088737 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:47:16 compute-1 nova_compute[182713]: 2026-01-21 23:47:16.485 182717 INFO nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Creating config drive at /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config
Jan 21 23:47:16 compute-1 nova_compute[182713]: 2026-01-21 23:47:16.495 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgfwxp2ev execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:16 compute-1 nova_compute[182713]: 2026-01-21 23:47:16.640 182717 DEBUG oslo_concurrency.processutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgfwxp2ev" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:16 compute-1 kernel: tapdf9aa099-aa: entered promiscuous mode
Jan 21 23:47:16 compute-1 NetworkManager[54952]: <info>  [1769039236.7230] manager: (tapdf9aa099-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Jan 21 23:47:16 compute-1 ovn_controller[94841]: 2026-01-21T23:47:16Z|00062|binding|INFO|Claiming lport df9aa099-aa41-4111-b46c-c8a593762a53 for this chassis.
Jan 21 23:47:16 compute-1 ovn_controller[94841]: 2026-01-21T23:47:16Z|00063|binding|INFO|df9aa099-aa41-4111-b46c-c8a593762a53: Claiming fa:16:3e:8f:4f:85 10.100.0.6
Jan 21 23:47:16 compute-1 nova_compute[182713]: 2026-01-21 23:47:16.725 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:16 compute-1 nova_compute[182713]: 2026-01-21 23:47:16.728 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:16 compute-1 nova_compute[182713]: 2026-01-21 23:47:16.733 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:16 compute-1 nova_compute[182713]: 2026-01-21 23:47:16.736 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.743 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:4f:85 10.100.0.6'], port_security=['fa:16:3e:8f:4f:85 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=df9aa099-aa41-4111-b46c-c8a593762a53) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.745 104184 INFO neutron.agent.ovn.metadata.agent [-] Port df9aa099-aa41-4111-b46c-c8a593762a53 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee bound to our chassis
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.747 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:47:16 compute-1 systemd-udevd[213629]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.767 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9028f8be-7958-4c61-8bc1-f14d18649c85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.769 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2df233d-b1 in ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.771 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2df233d-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.771 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[00fe0213-d5a8-4c6f-b9e3-eca8276b9704]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.772 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[740bb4f0-2790-4d5c-b85f-0c282a68a679]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 systemd-machined[153970]: New machine qemu-10-instance-00000015.
Jan 21 23:47:16 compute-1 NetworkManager[54952]: <info>  [1769039236.7817] device (tapdf9aa099-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:47:16 compute-1 NetworkManager[54952]: <info>  [1769039236.7825] device (tapdf9aa099-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:47:16 compute-1 ovn_controller[94841]: 2026-01-21T23:47:16Z|00064|binding|INFO|Setting lport df9aa099-aa41-4111-b46c-c8a593762a53 ovn-installed in OVS
Jan 21 23:47:16 compute-1 ovn_controller[94841]: 2026-01-21T23:47:16Z|00065|binding|INFO|Setting lport df9aa099-aa41-4111-b46c-c8a593762a53 up in Southbound
Jan 21 23:47:16 compute-1 nova_compute[182713]: 2026-01-21 23:47:16.791 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:16 compute-1 systemd[1]: Started Virtual Machine qemu-10-instance-00000015.
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.796 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[962e6fb9-1dd9-4fe4-bc5f-1fcd7351aef1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.821 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0a4c6ae7-ac13-49ed-a9d4-0951a02c3060]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.848 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7a007ae2-87a3-422e-ac9b-f218156c8291]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 systemd-udevd[213633]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:16 compute-1 NetworkManager[54952]: <info>  [1769039236.8557] manager: (tapb2df233d-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.855 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[237112d8-1e94-4515-a384-5b9e39bb5730]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.886 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[8358fb24-a3cb-4bf3-b1f0-33fc63de6a0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.891 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[86fcc953-af50-4891-b4d2-14872efeb6ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 nova_compute[182713]: 2026-01-21 23:47:16.898 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:16 compute-1 NetworkManager[54952]: <info>  [1769039236.9199] device (tapb2df233d-b0): carrier: link connected
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.929 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[1a09338e-a0fd-431f-bc2c-e0c1b8411ca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.955 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d053a2ca-d8d3-47e4-b5ce-e6b2cc56f006]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2df233d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:e6:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380957, 'reachable_time': 21589, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213663, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.974 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[01fe6c59-60f7-4118-99d6-0fe9dfce0bfb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:e636'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 380957, 'tstamp': 380957}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213664, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:16 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:16.993 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[193d4034-46cf-452b-b9d1-0166c65254d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2df233d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:e6:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380957, 'reachable_time': 21589, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213665, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:17.040 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2e35d764-40cc-4163-b691-d6b33d9fa488]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:17.112 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f313d457-b62d-40d7-8dc4-0eca6a6bf3c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:17.113 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2df233d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:17.114 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:17.114 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2df233d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.165 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:17 compute-1 NetworkManager[54952]: <info>  [1769039237.1656] manager: (tapb2df233d-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Jan 21 23:47:17 compute-1 kernel: tapb2df233d-b0: entered promiscuous mode
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:17.171 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2df233d-b0, col_values=(('external_ids', {'iface-id': '75454af0-da31-4238-b248-a6678c575f51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.172 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:17 compute-1 ovn_controller[94841]: 2026-01-21T23:47:17Z|00066|binding|INFO|Releasing lport 75454af0-da31-4238-b248-a6678c575f51 from this chassis (sb_readonly=0)
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.174 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.175 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.185 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:17.187 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:17.188 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e761a700-5566-44bf-85a6-b92a6940bdf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:17.189 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:47:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:17.189 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'env', 'PROCESS_TAG=haproxy-b2df233d-b255-4dda-925c-3ccab3a032ee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2df233d-b255-4dda-925c-3ccab3a032ee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.202 182717 DEBUG nova.compute.manager [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.611 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.612 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:17 compute-1 podman[213697]: 2026-01-21 23:47:17.614459071 +0000 UTC m=+0.060830424 container create 83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.619 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.619 182717 INFO nova.compute.claims [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.641 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039237.641134, 5bdecf5d-9113-4584-ac23-44d59770eade => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.641 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Started (Lifecycle Event)
Jan 21 23:47:17 compute-1 systemd[1]: Started libpod-conmon-83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c.scope.
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.673 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.679 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039237.6416752, 5bdecf5d-9113-4584-ac23-44d59770eade => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.680 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Paused (Lifecycle Event)
Jan 21 23:47:17 compute-1 podman[213697]: 2026-01-21 23:47:17.589885332 +0000 UTC m=+0.036256695 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:47:17 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:47:17 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45ac7128809ed154e539801e2168e3ae2574a03472072790150e25a8dd0a0de7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.705 182717 DEBUG nova.compute.manager [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.706 182717 DEBUG oslo_concurrency.lockutils [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.706 182717 DEBUG oslo_concurrency.lockutils [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.707 182717 DEBUG oslo_concurrency.lockutils [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.707 182717 DEBUG nova.compute.manager [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Processing event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.708 182717 DEBUG nova.compute.manager [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.708 182717 DEBUG oslo_concurrency.lockutils [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.709 182717 DEBUG oslo_concurrency.lockutils [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.709 182717 DEBUG oslo_concurrency.lockutils [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.710 182717 DEBUG nova.compute.manager [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.710 182717 WARNING nova.compute.manager [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received unexpected event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with vm_state building and task_state spawning.
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.712 182717 DEBUG nova.compute.manager [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.714 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.723 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:47:17 compute-1 podman[213697]: 2026-01-21 23:47:17.723881623 +0000 UTC m=+0.170253056 container init 83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.726 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039237.7203484, 5bdecf5d-9113-4584-ac23-44d59770eade => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.726 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Resumed (Lifecycle Event)
Jan 21 23:47:17 compute-1 podman[213697]: 2026-01-21 23:47:17.736799288 +0000 UTC m=+0.183170671 container start 83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.737 182717 INFO nova.virt.libvirt.driver [-] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Instance spawned successfully.
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.737 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.754 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.762 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.768 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.769 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.770 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.771 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.773 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.774 182717 DEBUG nova.virt.libvirt.driver [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:17 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[213717]: [NOTICE]   (213721) : New worker (213723) forked
Jan 21 23:47:17 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[213717]: [NOTICE]   (213721) : Loading success.
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.782 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.873 182717 INFO nova.compute.manager [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Took 8.14 seconds to spawn the instance on the hypervisor.
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.874 182717 DEBUG nova.compute.manager [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.900 182717 DEBUG nova.compute.provider_tree [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.926 182717 DEBUG nova.scheduler.client.report [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.974 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:17 compute-1 nova_compute[182713]: 2026-01-21 23:47:17.974 182717 DEBUG nova.compute.manager [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.001 182717 INFO nova.compute.manager [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Took 9.12 seconds to build instance.
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.024 182717 DEBUG oslo_concurrency.lockutils [None req-a657ec13-73d8-4f2e-942d-7c1ad4140c2b d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.038 182717 DEBUG nova.compute.manager [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.038 182717 DEBUG nova.network.neutron [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.042 182717 DEBUG nova.network.neutron [req-f81e91ee-e542-443f-9017-9b40a9edf139 req-1e4c54b3-e12f-40bf-8e54-430920da984d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updated VIF entry in instance network info cache for port df9aa099-aa41-4111-b46c-c8a593762a53. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.043 182717 DEBUG nova.network.neutron [req-f81e91ee-e542-443f-9017-9b40a9edf139 req-1e4c54b3-e12f-40bf-8e54-430920da984d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating instance_info_cache with network_info: [{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.069 182717 INFO nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.073 182717 DEBUG oslo_concurrency.lockutils [req-f81e91ee-e542-443f-9017-9b40a9edf139 req-1e4c54b3-e12f-40bf-8e54-430920da984d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.094 182717 DEBUG nova.compute.manager [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.222 182717 DEBUG nova.compute.manager [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.225 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.226 182717 INFO nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Creating image(s)
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.228 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "/var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.228 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.229 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.258 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.334 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.335 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.336 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.355 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.381 182717 DEBUG nova.policy [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4a6034ff39094b6486bac680b7ed5a57', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.428 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.429 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.464 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.465 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.466 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.551 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.553 182717 DEBUG nova.virt.disk.api [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Checking if we can resize image /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.554 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.646 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.648 182717 DEBUG nova.virt.disk.api [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Cannot resize image /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.649 182717 DEBUG nova.objects.instance [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'migration_context' on Instance uuid 044c71d9-3aaf-4e1c-af95-5c0636cf4000 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.667 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.668 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Ensure instance console log exists: /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.669 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.669 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.670 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:18 compute-1 nova_compute[182713]: 2026-01-21 23:47:18.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 23:47:19 compute-1 nova_compute[182713]: 2026-01-21 23:47:19.876 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:19 compute-1 nova_compute[182713]: 2026-01-21 23:47:19.877 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:19 compute-1 nova_compute[182713]: 2026-01-21 23:47:19.906 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:19 compute-1 nova_compute[182713]: 2026-01-21 23:47:19.907 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:19 compute-1 nova_compute[182713]: 2026-01-21 23:47:19.907 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:19 compute-1 nova_compute[182713]: 2026-01-21 23:47:19.907 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:47:19 compute-1 nova_compute[182713]: 2026-01-21 23:47:19.997 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.082 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.084 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.147 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.171 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.371 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.373 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5533MB free_disk=73.37984085083008GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.374 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.374 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.472 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 5bdecf5d-9113-4584-ac23-44d59770eade actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.473 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 044c71d9-3aaf-4e1c-af95-5c0636cf4000 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.473 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.473 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.494 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.499 182717 DEBUG nova.network.neutron [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Successfully created port: 417548ae-4551-4ae2-8160-bafd0974768d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.896 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.923 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.950 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:47:20 compute-1 nova_compute[182713]: 2026-01-21 23:47:20.951 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.412 182717 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Check if temp file /var/lib/nova/instances/tmp9ku_13i8 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.419 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.505 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.508 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.603 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.605 182717 DEBUG nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ku_13i8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.929 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.930 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.930 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.963 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.964 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.964 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.965 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:47:21 compute-1 nova_compute[182713]: 2026-01-21 23:47:21.965 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5bdecf5d-9113-4584-ac23-44d59770eade obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:22 compute-1 nova_compute[182713]: 2026-01-21 23:47:22.665 182717 DEBUG nova.network.neutron [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Successfully updated port: 417548ae-4551-4ae2-8160-bafd0974768d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:47:22 compute-1 nova_compute[182713]: 2026-01-21 23:47:22.682 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "refresh_cache-044c71d9-3aaf-4e1c-af95-5c0636cf4000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:22 compute-1 nova_compute[182713]: 2026-01-21 23:47:22.682 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquired lock "refresh_cache-044c71d9-3aaf-4e1c-af95-5c0636cf4000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:22 compute-1 nova_compute[182713]: 2026-01-21 23:47:22.682 182717 DEBUG nova.network.neutron [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:47:22 compute-1 nova_compute[182713]: 2026-01-21 23:47:22.829 182717 DEBUG nova.compute.manager [req-f4ca0522-2958-4bb6-af4b-8105784020ec req-0f7aa929-7175-4dcb-a249-c4a60cf761ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Received event network-changed-417548ae-4551-4ae2-8160-bafd0974768d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:22 compute-1 nova_compute[182713]: 2026-01-21 23:47:22.829 182717 DEBUG nova.compute.manager [req-f4ca0522-2958-4bb6-af4b-8105784020ec req-0f7aa929-7175-4dcb-a249-c4a60cf761ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Refreshing instance network info cache due to event network-changed-417548ae-4551-4ae2-8160-bafd0974768d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:47:22 compute-1 nova_compute[182713]: 2026-01-21 23:47:22.830 182717 DEBUG oslo_concurrency.lockutils [req-f4ca0522-2958-4bb6-af4b-8105784020ec req-0f7aa929-7175-4dcb-a249-c4a60cf761ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-044c71d9-3aaf-4e1c-af95-5c0636cf4000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:22 compute-1 nova_compute[182713]: 2026-01-21 23:47:22.960 182717 DEBUG nova.network.neutron [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:47:23 compute-1 nova_compute[182713]: 2026-01-21 23:47:23.657 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:23 compute-1 nova_compute[182713]: 2026-01-21 23:47:23.721 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:23 compute-1 nova_compute[182713]: 2026-01-21 23:47:23.722 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:23 compute-1 nova_compute[182713]: 2026-01-21 23:47:23.776 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.087 182717 DEBUG nova.network.neutron [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Updating instance_info_cache with network_info: [{"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.116 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Releasing lock "refresh_cache-044c71d9-3aaf-4e1c-af95-5c0636cf4000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.117 182717 DEBUG nova.compute.manager [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Instance network_info: |[{"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.118 182717 DEBUG oslo_concurrency.lockutils [req-f4ca0522-2958-4bb6-af4b-8105784020ec req-0f7aa929-7175-4dcb-a249-c4a60cf761ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-044c71d9-3aaf-4e1c-af95-5c0636cf4000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.119 182717 DEBUG nova.network.neutron [req-f4ca0522-2958-4bb6-af4b-8105784020ec req-0f7aa929-7175-4dcb-a249-c4a60cf761ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Refreshing network info cache for port 417548ae-4551-4ae2-8160-bafd0974768d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.124 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Start _get_guest_xml network_info=[{"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.131 182717 WARNING nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.137 182717 DEBUG nova.virt.libvirt.host [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.138 182717 DEBUG nova.virt.libvirt.host [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.150 182717 DEBUG nova.virt.libvirt.host [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.151 182717 DEBUG nova.virt.libvirt.host [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.153 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.154 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.154 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.155 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.155 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.156 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.156 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.157 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.157 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.157 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.158 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.158 182717 DEBUG nova.virt.hardware [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.165 182717 DEBUG nova.virt.libvirt.vif [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:47:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-451134642',display_name='tempest-ServersAdminTestJSON-server-451134642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-451134642',id=22,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-3f908x09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:18Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=044c71d9-3aaf-4e1c-af95-5c0636cf4000,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.165 182717 DEBUG nova.network.os_vif_util [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.166 182717 DEBUG nova.network.os_vif_util [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:f2:4d,bridge_name='br-int',has_traffic_filtering=True,id=417548ae-4551-4ae2-8160-bafd0974768d,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap417548ae-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.168 182717 DEBUG nova.objects.instance [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 044c71d9-3aaf-4e1c-af95-5c0636cf4000 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.188 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:47:24 compute-1 nova_compute[182713]:   <uuid>044c71d9-3aaf-4e1c-af95-5c0636cf4000</uuid>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   <name>instance-00000016</name>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <nova:name>tempest-ServersAdminTestJSON-server-451134642</nova:name>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:47:24</nova:creationTime>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:47:24 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:47:24 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:47:24 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:47:24 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:47:24 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:47:24 compute-1 nova_compute[182713]:         <nova:user uuid="4a6034ff39094b6486bac680b7ed5a57">tempest-ServersAdminTestJSON-1815099341-project-member</nova:user>
Jan 21 23:47:24 compute-1 nova_compute[182713]:         <nova:project uuid="4d40fc03fb534b5689415f3d8a3de1fc">tempest-ServersAdminTestJSON-1815099341</nova:project>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:47:24 compute-1 nova_compute[182713]:         <nova:port uuid="417548ae-4551-4ae2-8160-bafd0974768d">
Jan 21 23:47:24 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <system>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <entry name="serial">044c71d9-3aaf-4e1c-af95-5c0636cf4000</entry>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <entry name="uuid">044c71d9-3aaf-4e1c-af95-5c0636cf4000</entry>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     </system>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   <os>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   </os>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   <features>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   </features>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk.config"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:5a:f2:4d"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <target dev="tap417548ae-45"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/console.log" append="off"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <video>
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     </video>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:47:24 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:47:24 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:47:24 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:47:24 compute-1 nova_compute[182713]: </domain>
Jan 21 23:47:24 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.191 182717 DEBUG nova.compute.manager [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Preparing to wait for external event network-vif-plugged-417548ae-4551-4ae2-8160-bafd0974768d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.191 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.192 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.192 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.193 182717 DEBUG nova.virt.libvirt.vif [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:47:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-451134642',display_name='tempest-ServersAdminTestJSON-server-451134642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-451134642',id=22,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-3f908x09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:18Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=044c71d9-3aaf-4e1c-af95-5c0636cf4000,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.194 182717 DEBUG nova.network.os_vif_util [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.195 182717 DEBUG nova.network.os_vif_util [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:f2:4d,bridge_name='br-int',has_traffic_filtering=True,id=417548ae-4551-4ae2-8160-bafd0974768d,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap417548ae-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.195 182717 DEBUG os_vif [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:f2:4d,bridge_name='br-int',has_traffic_filtering=True,id=417548ae-4551-4ae2-8160-bafd0974768d,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap417548ae-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.196 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.197 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.198 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.203 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.204 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap417548ae-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.205 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap417548ae-45, col_values=(('external_ids', {'iface-id': '417548ae-4551-4ae2-8160-bafd0974768d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5a:f2:4d', 'vm-uuid': '044c71d9-3aaf-4e1c-af95-5c0636cf4000'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.207 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:24 compute-1 NetworkManager[54952]: <info>  [1769039244.2086] manager: (tap417548ae-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.210 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.217 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.219 182717 INFO os_vif [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:f2:4d,bridge_name='br-int',has_traffic_filtering=True,id=417548ae-4551-4ae2-8160-bafd0974768d,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap417548ae-45')
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.303 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.303 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.304 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No VIF found with MAC fa:16:3e:5a:f2:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.304 182717 INFO nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Using config drive
Jan 21 23:47:24 compute-1 podman[213768]: 2026-01-21 23:47:24.594580426 +0000 UTC m=+0.078692243 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:47:24 compute-1 podman[213769]: 2026-01-21 23:47:24.615745009 +0000 UTC m=+0.095804798 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.985 182717 INFO nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Creating config drive at /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk.config
Jan 21 23:47:24 compute-1 nova_compute[182713]: 2026-01-21 23:47:24.995 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpchs48swu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.023 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating instance_info_cache with network_info: [{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.088 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.088 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.089 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.089 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.090 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.090 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.134 182717 DEBUG oslo_concurrency.processutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpchs48swu" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.175 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:25 compute-1 kernel: tap417548ae-45: entered promiscuous mode
Jan 21 23:47:25 compute-1 NetworkManager[54952]: <info>  [1769039245.1971] manager: (tap417548ae-45): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.199 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:25 compute-1 ovn_controller[94841]: 2026-01-21T23:47:25Z|00067|binding|INFO|Claiming lport 417548ae-4551-4ae2-8160-bafd0974768d for this chassis.
Jan 21 23:47:25 compute-1 ovn_controller[94841]: 2026-01-21T23:47:25Z|00068|binding|INFO|417548ae-4551-4ae2-8160-bafd0974768d: Claiming fa:16:3e:5a:f2:4d 10.100.0.12
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.212 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:25 compute-1 systemd-machined[153970]: New machine qemu-11-instance-00000016.
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.241 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:f2:4d 10.100.0.12'], port_security=['fa:16:3e:5a:f2:4d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '044c71d9-3aaf-4e1c-af95-5c0636cf4000', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=417548ae-4551-4ae2-8160-bafd0974768d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.242 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 417548ae-4551-4ae2-8160-bafd0974768d in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 bound to our chassis
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.245 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.259 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ced4913d-3c8c-4d56-839c-bc7b0ac7644e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.266 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1530a22a-f1 in ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.268 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1530a22a-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.268 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4937ffa5-f7a0-4e1c-984a-730a7ba0721c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.269 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7753dc90-3793-407c-828c-73589bf41ba9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 systemd[1]: Started Virtual Machine qemu-11-instance-00000016.
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.288 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[31b8eaf2-fba9-4b19-9688-9145d9fa5ebd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 ovn_controller[94841]: 2026-01-21T23:47:25Z|00069|binding|INFO|Setting lport 417548ae-4551-4ae2-8160-bafd0974768d ovn-installed in OVS
Jan 21 23:47:25 compute-1 ovn_controller[94841]: 2026-01-21T23:47:25Z|00070|binding|INFO|Setting lport 417548ae-4551-4ae2-8160-bafd0974768d up in Southbound
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.301 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c30ffd73-bb31-4cbe-81fa-910cf36174c4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.302 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:25 compute-1 systemd-udevd[213835]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:25 compute-1 NetworkManager[54952]: <info>  [1769039245.3239] device (tap417548ae-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:47:25 compute-1 NetworkManager[54952]: <info>  [1769039245.3262] device (tap417548ae-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.339 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[d7bc3468-f022-4ebf-a7a4-106e7fb67ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 systemd-udevd[213842]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.348 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[218bdc16-aed4-4de6-8e4b-fba41bfa94d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 NetworkManager[54952]: <info>  [1769039245.3504] manager: (tap1530a22a-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.388 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ee459c5d-8ca1-4089-a096-a92725cd1a61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.392 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[d3228f46-77f1-42b1-ada4-0ac675256db8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 NetworkManager[54952]: <info>  [1769039245.4165] device (tap1530a22a-f0): carrier: link connected
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.421 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ba00b5-121d-4bcc-9b1a-65c5a5125d0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.439 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8d2342-496b-4fef-8699-a317fa87cd63]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381806, 'reachable_time': 30659, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213863, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.466 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[268d38d1-c05a-4b64-a19c-a60142d27744]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea6:bf13'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 381806, 'tstamp': 381806}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213866, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.490 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[357e2f8a-2ce7-444c-99ee-6ae1b04ad179]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381806, 'reachable_time': 30659, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213871, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.541 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e01a3dd5-4979-4429-8f44-336da676c24b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.560 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039245.5603857, 044c71d9-3aaf-4e1c-af95-5c0636cf4000 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.561 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] VM Started (Lifecycle Event)
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.617 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0db547-7e26-4aea-add7-c8e0a85bade9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.619 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.619 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.620 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1530a22a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.622 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:25 compute-1 kernel: tap1530a22a-f0: entered promiscuous mode
Jan 21 23:47:25 compute-1 NetworkManager[54952]: <info>  [1769039245.6251] manager: (tap1530a22a-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.629 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1530a22a-f0, col_values=(('external_ids', {'iface-id': '1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.631 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:25 compute-1 ovn_controller[94841]: 2026-01-21T23:47:25Z|00071|binding|INFO|Releasing lport 1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd from this chassis (sb_readonly=0)
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.633 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1530a22a-f758-407d-b1aa-fd922904fe07.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1530a22a-f758-407d-b1aa-fd922904fe07.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.634 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[dd3bb782-0acb-4640-9050-133fbd8c7801]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.635 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/1530a22a-f758-407d-b1aa-fd922904fe07.pid.haproxy
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:47:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:25.636 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'env', 'PROCESS_TAG=haproxy-1530a22a-f758-407d-b1aa-fd922904fe07', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1530a22a-f758-407d-b1aa-fd922904fe07.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.648 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.663 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.668 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039245.5612113, 044c71d9-3aaf-4e1c-af95-5c0636cf4000 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.669 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] VM Paused (Lifecycle Event)
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.708 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.713 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:25 compute-1 nova_compute[182713]: 2026-01-21 23:47:25.770 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:47:26 compute-1 podman[213904]: 2026-01-21 23:47:26.115947784 +0000 UTC m=+0.084624478 container create 20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:47:26 compute-1 podman[213904]: 2026-01-21 23:47:26.074436386 +0000 UTC m=+0.043113110 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:47:26 compute-1 systemd[1]: Started libpod-conmon-20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4.scope.
Jan 21 23:47:26 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:47:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9e80373ead48ae8fa08519da60a9521471c6d96e5e16d7711709a60f715b2ac/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:47:26 compute-1 podman[213904]: 2026-01-21 23:47:26.226014906 +0000 UTC m=+0.194691610 container init 20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:47:26 compute-1 podman[213904]: 2026-01-21 23:47:26.236471384 +0000 UTC m=+0.205148078 container start 20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 23:47:26 compute-1 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213920]: [NOTICE]   (213924) : New worker (213926) forked
Jan 21 23:47:26 compute-1 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213920]: [NOTICE]   (213924) : Loading success.
Jan 21 23:47:27 compute-1 nova_compute[182713]: 2026-01-21 23:47:27.159 182717 DEBUG nova.network.neutron [req-f4ca0522-2958-4bb6-af4b-8105784020ec req-0f7aa929-7175-4dcb-a249-c4a60cf761ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Updated VIF entry in instance network info cache for port 417548ae-4551-4ae2-8160-bafd0974768d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:47:27 compute-1 nova_compute[182713]: 2026-01-21 23:47:27.161 182717 DEBUG nova.network.neutron [req-f4ca0522-2958-4bb6-af4b-8105784020ec req-0f7aa929-7175-4dcb-a249-c4a60cf761ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Updating instance_info_cache with network_info: [{"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:27 compute-1 nova_compute[182713]: 2026-01-21 23:47:27.178 182717 DEBUG oslo_concurrency.lockutils [req-f4ca0522-2958-4bb6-af4b-8105784020ec req-0f7aa929-7175-4dcb-a249-c4a60cf761ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-044c71d9-3aaf-4e1c-af95-5c0636cf4000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:27 compute-1 sshd-session[213935]: Accepted publickey for nova from 192.168.122.100 port 36678 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:47:27 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 23:47:27 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 23:47:27 compute-1 systemd-logind[796]: New session 29 of user nova.
Jan 21 23:47:27 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 23:47:27 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 21 23:47:27 compute-1 systemd[213939]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:47:27 compute-1 systemd[213939]: Queued start job for default target Main User Target.
Jan 21 23:47:27 compute-1 systemd[213939]: Created slice User Application Slice.
Jan 21 23:47:27 compute-1 systemd[213939]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:47:27 compute-1 systemd[213939]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:47:27 compute-1 systemd[213939]: Reached target Paths.
Jan 21 23:47:27 compute-1 systemd[213939]: Reached target Timers.
Jan 21 23:47:27 compute-1 systemd[213939]: Starting D-Bus User Message Bus Socket...
Jan 21 23:47:27 compute-1 systemd[213939]: Starting Create User's Volatile Files and Directories...
Jan 21 23:47:27 compute-1 systemd[213939]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:47:27 compute-1 systemd[213939]: Reached target Sockets.
Jan 21 23:47:27 compute-1 systemd[213939]: Finished Create User's Volatile Files and Directories.
Jan 21 23:47:27 compute-1 systemd[213939]: Reached target Basic System.
Jan 21 23:47:27 compute-1 systemd[213939]: Reached target Main User Target.
Jan 21 23:47:27 compute-1 systemd[213939]: Startup finished in 179ms.
Jan 21 23:47:27 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 21 23:47:27 compute-1 systemd[1]: Started Session 29 of User nova.
Jan 21 23:47:27 compute-1 sshd-session[213935]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:47:27 compute-1 sshd-session[213954]: Received disconnect from 192.168.122.100 port 36678:11: disconnected by user
Jan 21 23:47:27 compute-1 sshd-session[213954]: Disconnected from user nova 192.168.122.100 port 36678
Jan 21 23:47:27 compute-1 sshd-session[213935]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:47:27 compute-1 systemd[1]: session-29.scope: Deactivated successfully.
Jan 21 23:47:27 compute-1 systemd-logind[796]: Session 29 logged out. Waiting for processes to exit.
Jan 21 23:47:27 compute-1 systemd-logind[796]: Removed session 29.
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.217 182717 DEBUG nova.compute.manager [req-c3189972-ccef-45da-9f9a-9e8ea919f246 req-f2a985b2-8f16-4cd8-956f-d52ccc0908bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Received event network-vif-plugged-417548ae-4551-4ae2-8160-bafd0974768d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.220 182717 DEBUG oslo_concurrency.lockutils [req-c3189972-ccef-45da-9f9a-9e8ea919f246 req-f2a985b2-8f16-4cd8-956f-d52ccc0908bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.221 182717 DEBUG oslo_concurrency.lockutils [req-c3189972-ccef-45da-9f9a-9e8ea919f246 req-f2a985b2-8f16-4cd8-956f-d52ccc0908bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.221 182717 DEBUG oslo_concurrency.lockutils [req-c3189972-ccef-45da-9f9a-9e8ea919f246 req-f2a985b2-8f16-4cd8-956f-d52ccc0908bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.222 182717 DEBUG nova.compute.manager [req-c3189972-ccef-45da-9f9a-9e8ea919f246 req-f2a985b2-8f16-4cd8-956f-d52ccc0908bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Processing event network-vif-plugged-417548ae-4551-4ae2-8160-bafd0974768d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.222 182717 DEBUG nova.compute.manager [req-c3189972-ccef-45da-9f9a-9e8ea919f246 req-f2a985b2-8f16-4cd8-956f-d52ccc0908bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Received event network-vif-plugged-417548ae-4551-4ae2-8160-bafd0974768d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.222 182717 DEBUG oslo_concurrency.lockutils [req-c3189972-ccef-45da-9f9a-9e8ea919f246 req-f2a985b2-8f16-4cd8-956f-d52ccc0908bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.223 182717 DEBUG oslo_concurrency.lockutils [req-c3189972-ccef-45da-9f9a-9e8ea919f246 req-f2a985b2-8f16-4cd8-956f-d52ccc0908bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.223 182717 DEBUG oslo_concurrency.lockutils [req-c3189972-ccef-45da-9f9a-9e8ea919f246 req-f2a985b2-8f16-4cd8-956f-d52ccc0908bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.223 182717 DEBUG nova.compute.manager [req-c3189972-ccef-45da-9f9a-9e8ea919f246 req-f2a985b2-8f16-4cd8-956f-d52ccc0908bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] No waiting events found dispatching network-vif-plugged-417548ae-4551-4ae2-8160-bafd0974768d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.224 182717 WARNING nova.compute.manager [req-c3189972-ccef-45da-9f9a-9e8ea919f246 req-f2a985b2-8f16-4cd8-956f-d52ccc0908bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Received unexpected event network-vif-plugged-417548ae-4551-4ae2-8160-bafd0974768d for instance with vm_state building and task_state spawning.
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.224 182717 DEBUG nova.compute.manager [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.233 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039248.2330995, 044c71d9-3aaf-4e1c-af95-5c0636cf4000 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.233 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] VM Resumed (Lifecycle Event)
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.235 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.239 182717 INFO nova.virt.libvirt.driver [-] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Instance spawned successfully.
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.240 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.281 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.287 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.291 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.292 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.292 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.293 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.293 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.294 182717 DEBUG nova.virt.libvirt.driver [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.333 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.383 182717 INFO nova.compute.manager [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Took 10.16 seconds to spawn the instance on the hypervisor.
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.384 182717 DEBUG nova.compute.manager [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.476 182717 INFO nova.compute.manager [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Took 11.14 seconds to build instance.
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.507 182717 DEBUG oslo_concurrency.lockutils [None req-351eb237-3255-446d-9453-de10d1961d65 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.678 182717 DEBUG nova.compute.manager [req-29fc0998-6ab4-4f71-b84b-6cfe5c72bdc2 req-96b42d2c-8d7d-47a6-8fbc-0e214413a59e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.679 182717 DEBUG oslo_concurrency.lockutils [req-29fc0998-6ab4-4f71-b84b-6cfe5c72bdc2 req-96b42d2c-8d7d-47a6-8fbc-0e214413a59e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.679 182717 DEBUG oslo_concurrency.lockutils [req-29fc0998-6ab4-4f71-b84b-6cfe5c72bdc2 req-96b42d2c-8d7d-47a6-8fbc-0e214413a59e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.679 182717 DEBUG oslo_concurrency.lockutils [req-29fc0998-6ab4-4f71-b84b-6cfe5c72bdc2 req-96b42d2c-8d7d-47a6-8fbc-0e214413a59e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.680 182717 DEBUG nova.compute.manager [req-29fc0998-6ab4-4f71-b84b-6cfe5c72bdc2 req-96b42d2c-8d7d-47a6-8fbc-0e214413a59e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.680 182717 DEBUG nova.compute.manager [req-29fc0998-6ab4-4f71-b84b-6cfe5c72bdc2 req-96b42d2c-8d7d-47a6-8fbc-0e214413a59e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.976 182717 INFO nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Took 5.20 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Jan 21 23:47:28 compute-1 nova_compute[182713]: 2026-01-21 23:47:28.977 182717 DEBUG nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.008 182717 DEBUG nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ku_13i8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(b99e07dc-0046-41fe-8ff1-33067bd0bd67),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.035 182717 DEBUG nova.objects.instance [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lazy-loading 'migration_context' on Instance uuid 5bdecf5d-9113-4584-ac23-44d59770eade obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.037 182717 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.041 182717 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.042 182717 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.062 182717 DEBUG nova.virt.libvirt.vif [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-821021372',display_name='tempest-LiveMigrationTest-server-821021372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-821021372',id=21,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-xa5gd7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:47:17Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=5bdecf5d-9113-4584-ac23-44d59770eade,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.063 182717 DEBUG nova.network.os_vif_util [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converting VIF {"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.064 182717 DEBUG nova.network.os_vif_util [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.065 182717 DEBUG nova.virt.libvirt.migration [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 23:47:29 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:8f:4f:85"/>
Jan 21 23:47:29 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:47:29 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:47:29 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:47:29 compute-1 nova_compute[182713]:   <target dev="tapdf9aa099-aa"/>
Jan 21 23:47:29 compute-1 nova_compute[182713]: </interface>
Jan 21 23:47:29 compute-1 nova_compute[182713]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.067 182717 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.208 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.545 182717 DEBUG nova.virt.libvirt.migration [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.546 182717 INFO nova.virt.libvirt.migration [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 23:47:29 compute-1 nova_compute[182713]: 2026-01-21 23:47:29.643 182717 INFO nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.145 182717 DEBUG nova.virt.libvirt.migration [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.146 182717 DEBUG nova.virt.libvirt.migration [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.209 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.651 182717 DEBUG nova.virt.libvirt.migration [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.651 182717 DEBUG nova.virt.libvirt.migration [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.817 182717 DEBUG nova.compute.manager [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.818 182717 DEBUG oslo_concurrency.lockutils [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.818 182717 DEBUG oslo_concurrency.lockutils [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.818 182717 DEBUG oslo_concurrency.lockutils [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.818 182717 DEBUG nova.compute.manager [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.818 182717 WARNING nova.compute.manager [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received unexpected event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with vm_state active and task_state migrating.
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.819 182717 DEBUG nova.compute.manager [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-changed-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.819 182717 DEBUG nova.compute.manager [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Refreshing instance network info cache due to event network-changed-df9aa099-aa41-4111-b46c-c8a593762a53. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.819 182717 DEBUG oslo_concurrency.lockutils [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.819 182717 DEBUG oslo_concurrency.lockutils [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:30 compute-1 nova_compute[182713]: 2026-01-21 23:47:30.819 182717 DEBUG nova.network.neutron [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Refreshing network info cache for port df9aa099-aa41-4111-b46c-c8a593762a53 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:47:31 compute-1 ovn_controller[94841]: 2026-01-21T23:47:31Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8f:4f:85 10.100.0.6
Jan 21 23:47:31 compute-1 ovn_controller[94841]: 2026-01-21T23:47:31Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8f:4f:85 10.100.0.6
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.156 182717 DEBUG nova.virt.libvirt.migration [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.156 182717 DEBUG nova.virt.libvirt.migration [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.322 182717 DEBUG oslo_concurrency.lockutils [None req-bd211526-85b7-4b15-8f6a-871ccb0a6f25 db0e4ef44e9a4f2b81bd8440c196814e ecc9e65e64b74549bd550940ca6a5b75 - - default default] Acquiring lock "refresh_cache-044c71d9-3aaf-4e1c-af95-5c0636cf4000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.323 182717 DEBUG oslo_concurrency.lockutils [None req-bd211526-85b7-4b15-8f6a-871ccb0a6f25 db0e4ef44e9a4f2b81bd8440c196814e ecc9e65e64b74549bd550940ca6a5b75 - - default default] Acquired lock "refresh_cache-044c71d9-3aaf-4e1c-af95-5c0636cf4000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.323 182717 DEBUG nova.network.neutron [None req-bd211526-85b7-4b15-8f6a-871ccb0a6f25 db0e4ef44e9a4f2b81bd8440c196814e ecc9e65e64b74549bd550940ca6a5b75 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.659 182717 DEBUG nova.virt.libvirt.migration [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.660 182717 DEBUG nova.virt.libvirt.migration [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.775 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039251.7751398, 5bdecf5d-9113-4584-ac23-44d59770eade => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.776 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Paused (Lifecycle Event)
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.806 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.811 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.833 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 23:47:31 compute-1 kernel: tapdf9aa099-aa (unregistering): left promiscuous mode
Jan 21 23:47:31 compute-1 NetworkManager[54952]: <info>  [1769039251.9656] device (tapdf9aa099-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.978 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:31 compute-1 ovn_controller[94841]: 2026-01-21T23:47:31Z|00072|binding|INFO|Releasing lport df9aa099-aa41-4111-b46c-c8a593762a53 from this chassis (sb_readonly=0)
Jan 21 23:47:31 compute-1 ovn_controller[94841]: 2026-01-21T23:47:31Z|00073|binding|INFO|Setting lport df9aa099-aa41-4111-b46c-c8a593762a53 down in Southbound
Jan 21 23:47:31 compute-1 ovn_controller[94841]: 2026-01-21T23:47:31Z|00074|binding|INFO|Removing iface tapdf9aa099-aa ovn-installed in OVS
Jan 21 23:47:31 compute-1 nova_compute[182713]: 2026-01-21 23:47:31.981 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:31.987 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:4f:85 10.100.0.6'], port_security=['fa:16:3e:8f:4f:85 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '7f404a2f-20ba-4b9b-88d6-fa3588630efa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=df9aa099-aa41-4111-b46c-c8a593762a53) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:31.988 104184 INFO neutron.agent.ovn.metadata.agent [-] Port df9aa099-aa41-4111-b46c-c8a593762a53 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee unbound from our chassis
Jan 21 23:47:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:31.993 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2df233d-b255-4dda-925c-3ccab3a032ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:47:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:31.995 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb5e806-75e6-42a7-8483-1bcf2b6c2e1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:31.996 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee namespace which is not needed anymore
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.007 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:32 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 21 23:47:32 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000015.scope: Consumed 14.478s CPU time.
Jan 21 23:47:32 compute-1 systemd-machined[153970]: Machine qemu-10-instance-00000015 terminated.
Jan 21 23:47:32 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[213717]: [NOTICE]   (213721) : haproxy version is 2.8.14-c23fe91
Jan 21 23:47:32 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[213717]: [NOTICE]   (213721) : path to executable is /usr/sbin/haproxy
Jan 21 23:47:32 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[213717]: [WARNING]  (213721) : Exiting Master process...
Jan 21 23:47:32 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[213717]: [WARNING]  (213721) : Exiting Master process...
Jan 21 23:47:32 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[213717]: [ALERT]    (213721) : Current worker (213723) exited with code 143 (Terminated)
Jan 21 23:47:32 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[213717]: [WARNING]  (213721) : All workers exited. Exiting... (0)
Jan 21 23:47:32 compute-1 systemd[1]: libpod-83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c.scope: Deactivated successfully.
Jan 21 23:47:32 compute-1 podman[214002]: 2026-01-21 23:47:32.135306577 +0000 UTC m=+0.047429874 container died 83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:47:32 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c-userdata-shm.mount: Deactivated successfully.
Jan 21 23:47:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-45ac7128809ed154e539801e2168e3ae2574a03472072790150e25a8dd0a0de7-merged.mount: Deactivated successfully.
Jan 21 23:47:32 compute-1 podman[214002]: 2026-01-21 23:47:32.199184705 +0000 UTC m=+0.111307972 container cleanup 83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 23:47:32 compute-1 systemd[1]: libpod-conmon-83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c.scope: Deactivated successfully.
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.239 182717 DEBUG nova.virt.libvirt.guest [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.239 182717 INFO nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migration operation has completed
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.239 182717 INFO nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] _post_live_migration() is started..
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.242 182717 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.242 182717 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.242 182717 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 23:47:32 compute-1 podman[214042]: 2026-01-21 23:47:32.267549053 +0000 UTC m=+0.040872649 container remove 83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:32.272 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[13e267a7-2365-4574-a224-08c73bb58b85]: (4, ('Wed Jan 21 11:47:32 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee (83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c)\n83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c\nWed Jan 21 11:47:32 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee (83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c)\n83b14bc70ebe131abc2be17be17a6149c0aa5c5e6ab78360e501cae4db2cf14c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:32.273 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0347699c-8b8b-4c1b-8233-172f941c79d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:32.274 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2df233d-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.276 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:32 compute-1 kernel: tapb2df233d-b0: left promiscuous mode
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.295 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:32.298 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[322ae0d6-6716-4016-8ceb-d8124ad3d09a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:32.315 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c7f73947-4ff0-4827-bd54-6d9b329e5287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:32.316 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[10835476-5a6d-4e0c-9d83-001cfa756890]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.322 182717 DEBUG nova.compute.manager [req-c5eaa069-d111-4ce0-85c2-f514a0b0a766 req-2d992b67-13ae-40d5-955d-1b27f7bdf707 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.322 182717 DEBUG oslo_concurrency.lockutils [req-c5eaa069-d111-4ce0-85c2-f514a0b0a766 req-2d992b67-13ae-40d5-955d-1b27f7bdf707 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.323 182717 DEBUG oslo_concurrency.lockutils [req-c5eaa069-d111-4ce0-85c2-f514a0b0a766 req-2d992b67-13ae-40d5-955d-1b27f7bdf707 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.323 182717 DEBUG oslo_concurrency.lockutils [req-c5eaa069-d111-4ce0-85c2-f514a0b0a766 req-2d992b67-13ae-40d5-955d-1b27f7bdf707 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.323 182717 DEBUG nova.compute.manager [req-c5eaa069-d111-4ce0-85c2-f514a0b0a766 req-2d992b67-13ae-40d5-955d-1b27f7bdf707 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.323 182717 DEBUG nova.compute.manager [req-c5eaa069-d111-4ce0-85c2-f514a0b0a766 req-2d992b67-13ae-40d5-955d-1b27f7bdf707 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:32.329 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e40425aa-32d1-44ac-944e-c57fd9c6726e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380949, 'reachable_time': 35267, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214065, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:32 compute-1 systemd[1]: run-netns-ovnmeta\x2db2df233d\x2db255\x2d4dda\x2d925c\x2d3ccab3a032ee.mount: Deactivated successfully.
Jan 21 23:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:32.333 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:32.333 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[be624d37-772d-4502-8349-511010ee7795]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.875 182717 DEBUG nova.network.neutron [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updated VIF entry in instance network info cache for port df9aa099-aa41-4111-b46c-c8a593762a53. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.876 182717 DEBUG nova.network.neutron [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating instance_info_cache with network_info: [{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:32 compute-1 nova_compute[182713]: 2026-01-21 23:47:32.914 182717 DEBUG oslo_concurrency.lockutils [req-6264341e-601f-489d-b21b-84f6f420d093 req-e6b9b8f8-1211-4b85-80a6-d272e3606c9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.139 182717 DEBUG nova.compute.manager [req-32c10d26-960a-4ab1-b5d9-3bb5599d3af6 req-4fd309ac-b75d-4c78-942f-8f156af11768 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.140 182717 DEBUG oslo_concurrency.lockutils [req-32c10d26-960a-4ab1-b5d9-3bb5599d3af6 req-4fd309ac-b75d-4c78-942f-8f156af11768 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.141 182717 DEBUG oslo_concurrency.lockutils [req-32c10d26-960a-4ab1-b5d9-3bb5599d3af6 req-4fd309ac-b75d-4c78-942f-8f156af11768 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.142 182717 DEBUG oslo_concurrency.lockutils [req-32c10d26-960a-4ab1-b5d9-3bb5599d3af6 req-4fd309ac-b75d-4c78-942f-8f156af11768 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.142 182717 DEBUG nova.compute.manager [req-32c10d26-960a-4ab1-b5d9-3bb5599d3af6 req-4fd309ac-b75d-4c78-942f-8f156af11768 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.143 182717 DEBUG nova.compute.manager [req-32c10d26-960a-4ab1-b5d9-3bb5599d3af6 req-4fd309ac-b75d-4c78-942f-8f156af11768 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.442 182717 DEBUG nova.network.neutron [None req-bd211526-85b7-4b15-8f6a-871ccb0a6f25 db0e4ef44e9a4f2b81bd8440c196814e ecc9e65e64b74549bd550940ca6a5b75 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Updating instance_info_cache with network_info: [{"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.471 182717 DEBUG oslo_concurrency.lockutils [None req-bd211526-85b7-4b15-8f6a-871ccb0a6f25 db0e4ef44e9a4f2b81bd8440c196814e ecc9e65e64b74549bd550940ca6a5b75 - - default default] Releasing lock "refresh_cache-044c71d9-3aaf-4e1c-af95-5c0636cf4000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.472 182717 DEBUG nova.compute.manager [None req-bd211526-85b7-4b15-8f6a-871ccb0a6f25 db0e4ef44e9a4f2b81bd8440c196814e ecc9e65e64b74549bd550940ca6a5b75 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.473 182717 DEBUG nova.compute.manager [None req-bd211526-85b7-4b15-8f6a-871ccb0a6f25 db0e4ef44e9a4f2b81bd8440c196814e ecc9e65e64b74549bd550940ca6a5b75 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] network_info to inject: |[{"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.495 182717 DEBUG nova.network.neutron [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Activated binding for port df9aa099-aa41-4111-b46c-c8a593762a53 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.495 182717 DEBUG nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.497 182717 DEBUG nova.virt.libvirt.vif [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-821021372',display_name='tempest-LiveMigrationTest-server-821021372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-821021372',id=21,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-xa5gd7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:47:20Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=5bdecf5d-9113-4584-ac23-44d59770eade,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.497 182717 DEBUG nova.network.os_vif_util [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converting VIF {"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.498 182717 DEBUG nova.network.os_vif_util [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.499 182717 DEBUG os_vif [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.501 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.502 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf9aa099-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.504 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.506 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.510 182717 INFO os_vif [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa')
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.511 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.511 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.512 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.512 182717 DEBUG nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.513 182717 INFO nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Deleting instance files /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade_del
Jan 21 23:47:33 compute-1 nova_compute[182713]: 2026-01-21 23:47:33.514 182717 INFO nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Deletion of /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade_del complete
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.435 182717 DEBUG nova.compute.manager [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.435 182717 DEBUG oslo_concurrency.lockutils [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.436 182717 DEBUG oslo_concurrency.lockutils [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.436 182717 DEBUG oslo_concurrency.lockutils [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.437 182717 DEBUG nova.compute.manager [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.437 182717 WARNING nova.compute.manager [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received unexpected event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with vm_state active and task_state migrating.
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.437 182717 DEBUG nova.compute.manager [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.438 182717 DEBUG oslo_concurrency.lockutils [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.438 182717 DEBUG oslo_concurrency.lockutils [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.438 182717 DEBUG oslo_concurrency.lockutils [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.439 182717 DEBUG nova.compute.manager [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.439 182717 WARNING nova.compute.manager [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received unexpected event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with vm_state active and task_state migrating.
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.439 182717 DEBUG nova.compute.manager [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.440 182717 DEBUG oslo_concurrency.lockutils [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.440 182717 DEBUG oslo_concurrency.lockutils [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.441 182717 DEBUG oslo_concurrency.lockutils [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.441 182717 DEBUG nova.compute.manager [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:34 compute-1 nova_compute[182713]: 2026-01-21 23:47:34.441 182717 WARNING nova.compute.manager [req-56d66fff-f597-48d5-a075-a73cfe4d01c6 req-88aea097-0379-46f2-acbb-e6a6071b8ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received unexpected event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with vm_state active and task_state migrating.
Jan 21 23:47:35 compute-1 nova_compute[182713]: 2026-01-21 23:47:35.211 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:35 compute-1 nova_compute[182713]: 2026-01-21 23:47:35.266 182717 DEBUG nova.compute.manager [req-fff81dec-a3ac-4757-8ff3-9abad2f6772f req-d5b31572-c5c6-4983-9764-943e433450d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:35 compute-1 nova_compute[182713]: 2026-01-21 23:47:35.267 182717 DEBUG oslo_concurrency.lockutils [req-fff81dec-a3ac-4757-8ff3-9abad2f6772f req-d5b31572-c5c6-4983-9764-943e433450d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:35 compute-1 nova_compute[182713]: 2026-01-21 23:47:35.267 182717 DEBUG oslo_concurrency.lockutils [req-fff81dec-a3ac-4757-8ff3-9abad2f6772f req-d5b31572-c5c6-4983-9764-943e433450d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:35 compute-1 nova_compute[182713]: 2026-01-21 23:47:35.268 182717 DEBUG oslo_concurrency.lockutils [req-fff81dec-a3ac-4757-8ff3-9abad2f6772f req-d5b31572-c5c6-4983-9764-943e433450d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:35 compute-1 nova_compute[182713]: 2026-01-21 23:47:35.268 182717 DEBUG nova.compute.manager [req-fff81dec-a3ac-4757-8ff3-9abad2f6772f req-d5b31572-c5c6-4983-9764-943e433450d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:35 compute-1 nova_compute[182713]: 2026-01-21 23:47:35.268 182717 WARNING nova.compute.manager [req-fff81dec-a3ac-4757-8ff3-9abad2f6772f req-d5b31572-c5c6-4983-9764-943e433450d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received unexpected event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with vm_state active and task_state migrating.
Jan 21 23:47:35 compute-1 podman[214066]: 2026-01-21 23:47:35.575920888 +0000 UTC m=+0.065743248 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:47:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:36.174 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:36 compute-1 nova_compute[182713]: 2026-01-21 23:47:36.176 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:36.175 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:47:37 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 23:47:37 compute-1 systemd[213939]: Activating special unit Exit the Session...
Jan 21 23:47:37 compute-1 systemd[213939]: Stopped target Main User Target.
Jan 21 23:47:37 compute-1 systemd[213939]: Stopped target Basic System.
Jan 21 23:47:37 compute-1 systemd[213939]: Stopped target Paths.
Jan 21 23:47:37 compute-1 systemd[213939]: Stopped target Sockets.
Jan 21 23:47:37 compute-1 systemd[213939]: Stopped target Timers.
Jan 21 23:47:37 compute-1 systemd[213939]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:47:37 compute-1 systemd[213939]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:47:37 compute-1 systemd[213939]: Closed D-Bus User Message Bus Socket.
Jan 21 23:47:37 compute-1 systemd[213939]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:47:37 compute-1 systemd[213939]: Removed slice User Application Slice.
Jan 21 23:47:37 compute-1 systemd[213939]: Reached target Shutdown.
Jan 21 23:47:37 compute-1 systemd[213939]: Finished Exit the Session.
Jan 21 23:47:37 compute-1 systemd[213939]: Reached target Exit the Session.
Jan 21 23:47:38 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 23:47:38 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 23:47:38 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 23:47:38 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 23:47:38 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 23:47:38 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 23:47:38 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 23:47:38 compute-1 nova_compute[182713]: 2026-01-21 23:47:38.504 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:39 compute-1 podman[214087]: 2026-01-21 23:47:39.611755038 +0000 UTC m=+0.088303453 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter)
Jan 21 23:47:39 compute-1 nova_compute[182713]: 2026-01-21 23:47:39.805 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:39 compute-1 nova_compute[182713]: 2026-01-21 23:47:39.806 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:39 compute-1 nova_compute[182713]: 2026-01-21 23:47:39.806 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:39 compute-1 nova_compute[182713]: 2026-01-21 23:47:39.862 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:39 compute-1 nova_compute[182713]: 2026-01-21 23:47:39.863 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:39 compute-1 nova_compute[182713]: 2026-01-21 23:47:39.864 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:39 compute-1 nova_compute[182713]: 2026-01-21 23:47:39.865 182717 DEBUG nova.compute.resource_tracker [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:47:39 compute-1 nova_compute[182713]: 2026-01-21 23:47:39.958 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.050 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.052 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.121 182717 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.213 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.289 182717 WARNING nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.291 182717 DEBUG nova.compute.resource_tracker [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5492MB free_disk=73.35296630859375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.291 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.292 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.369 182717 DEBUG nova.compute.resource_tracker [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Migration for instance 5bdecf5d-9113-4584-ac23-44d59770eade refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.390 182717 DEBUG nova.compute.resource_tracker [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.434 182717 DEBUG nova.compute.resource_tracker [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Instance 044c71d9-3aaf-4e1c-af95-5c0636cf4000 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.435 182717 DEBUG nova.compute.resource_tracker [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Migration b99e07dc-0046-41fe-8ff1-33067bd0bd67 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.435 182717 DEBUG nova.compute.resource_tracker [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.436 182717 DEBUG nova.compute.resource_tracker [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.570 182717 DEBUG nova.compute.provider_tree [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.587 182717 DEBUG nova.scheduler.client.report [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.614 182717 DEBUG nova.compute.resource_tracker [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.615 182717 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.635 182717 INFO nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.823 182717 INFO nova.scheduler.client.report [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Deleted allocation for migration b99e07dc-0046-41fe-8ff1-33067bd0bd67
Jan 21 23:47:40 compute-1 nova_compute[182713]: 2026-01-21 23:47:40.824 182717 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 23:47:41 compute-1 ovn_controller[94841]: 2026-01-21T23:47:41Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5a:f2:4d 10.100.0.12
Jan 21 23:47:41 compute-1 ovn_controller[94841]: 2026-01-21T23:47:41Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5a:f2:4d 10.100.0.12
Jan 21 23:47:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:43.177 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:43 compute-1 nova_compute[182713]: 2026-01-21 23:47:43.508 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:43 compute-1 nova_compute[182713]: 2026-01-21 23:47:43.865 182717 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Creating tmpfile /var/lib/nova/instances/tmp34j3k16n to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 21 23:47:44 compute-1 nova_compute[182713]: 2026-01-21 23:47:44.128 182717 DEBUG nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp34j3k16n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 21 23:47:45 compute-1 nova_compute[182713]: 2026-01-21 23:47:45.216 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:45 compute-1 nova_compute[182713]: 2026-01-21 23:47:45.737 182717 DEBUG nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp34j3k16n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 21 23:47:45 compute-1 nova_compute[182713]: 2026-01-21 23:47:45.807 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:45 compute-1 nova_compute[182713]: 2026-01-21 23:47:45.808 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquired lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:45 compute-1 nova_compute[182713]: 2026-01-21 23:47:45.808 182717 DEBUG nova.network.neutron [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:47:46 compute-1 podman[214124]: 2026-01-21 23:47:46.613145522 +0000 UTC m=+0.086522327 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:47:46 compute-1 podman[214123]: 2026-01-21 23:47:46.663450215 +0000 UTC m=+0.143445427 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.240 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039252.238056, 5bdecf5d-9113-4584-ac23-44d59770eade => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.240 182717 INFO nova.compute.manager [-] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Stopped (Lifecycle Event)
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.288 182717 DEBUG nova.compute.manager [None req-9ef753eb-e965-4bf2-82be-f035215f3a5d - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.351 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "2977f489-9f9d-43f7-a617-7556b7df5171" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.351 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "2977f489-9f9d-43f7-a617-7556b7df5171" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.373 182717 DEBUG nova.compute.manager [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.499 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.499 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.508 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.509 182717 INFO nova.compute.claims [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.698 182717 DEBUG nova.compute.provider_tree [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.716 182717 DEBUG nova.scheduler.client.report [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.757 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.759 182717 DEBUG nova.compute.manager [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.794 182717 DEBUG nova.network.neutron [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating instance_info_cache with network_info: [{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.839 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Releasing lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.857 182717 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp34j3k16n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.858 182717 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Creating instance directory: /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.859 182717 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Creating disk.info with the contents: {'/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk': 'qcow2', '/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.860 182717 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.861 182717 DEBUG nova.objects.instance [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5bdecf5d-9113-4584-ac23-44d59770eade obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.881 182717 DEBUG nova.compute.manager [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.882 182717 DEBUG nova.network.neutron [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.915 182717 INFO nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.919 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:47 compute-1 nova_compute[182713]: 2026-01-21 23:47:47.948 182717 DEBUG nova.compute.manager [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.023 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.024 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.025 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.036 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.065 182717 DEBUG nova.compute.manager [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.067 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.068 182717 INFO nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Creating image(s)
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.069 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.069 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.070 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.085 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.110 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.111 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.154 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.155 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.155 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.180 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.181 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.182 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.203 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.237 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.239 182717 DEBUG nova.virt.disk.api [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Checking if we can resize image /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.239 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.282 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.284 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.315 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.316 182717 DEBUG nova.virt.disk.api [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Cannot resize image /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.317 182717 DEBUG nova.objects.instance [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lazy-loading 'migration_context' on Instance uuid 5bdecf5d-9113-4584-ac23-44d59770eade obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.334 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.335 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.336 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.365 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.392 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config 485376" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.394 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config to /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.394 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.419 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.419 182717 DEBUG nova.virt.disk.api [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Checking if we can resize image /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.420 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.476 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.478 182717 DEBUG nova.virt.disk.api [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Cannot resize image /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.479 182717 DEBUG nova.objects.instance [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.506 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.506 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Ensure instance console log exists: /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.507 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.508 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.509 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.560 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.668 182717 DEBUG nova.network.neutron [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.669 182717 DEBUG nova.compute.manager [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.671 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.678 182717 WARNING nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.685 182717 DEBUG nova.virt.libvirt.host [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.686 182717 DEBUG nova.virt.libvirt.host [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.690 182717 DEBUG nova.virt.libvirt.host [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.691 182717 DEBUG nova.virt.libvirt.host [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.692 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.693 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.693 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.694 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.694 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.694 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.695 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.695 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.695 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.696 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.696 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.696 182717 DEBUG nova.virt.hardware [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.701 182717 DEBUG nova.objects.instance [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.718 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:47:48 compute-1 nova_compute[182713]:   <uuid>2977f489-9f9d-43f7-a617-7556b7df5171</uuid>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   <name>instance-00000017</name>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <nova:name>tempest-MigrationsAdminTest-server-529809703</nova:name>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:47:48</nova:creationTime>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:47:48 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:47:48 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:47:48 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:47:48 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:47:48 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:47:48 compute-1 nova_compute[182713]:         <nova:user uuid="36d71830ce70436e97fbc17b6da8d3c6">tempest-MigrationsAdminTest-1559502816-project-member</nova:user>
Jan 21 23:47:48 compute-1 nova_compute[182713]:         <nova:project uuid="95574103d0094883861c58d01690e5a3">tempest-MigrationsAdminTest-1559502816</nova:project>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <system>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <entry name="serial">2977f489-9f9d-43f7-a617-7556b7df5171</entry>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <entry name="uuid">2977f489-9f9d-43f7-a617-7556b7df5171</entry>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     </system>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   <os>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   </os>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   <features>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   </features>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.config"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/console.log" append="off"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <video>
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     </video>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:47:48 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:47:48 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:47:48 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:47:48 compute-1 nova_compute[182713]: </domain>
Jan 21 23:47:48 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.773 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.774 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.774 182717 INFO nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Using config drive
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.930 182717 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade" returned: 0 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.931 182717 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.934 182717 DEBUG nova.virt.libvirt.vif [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-821021372',display_name='tempest-LiveMigrationTest-server-821021372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-821021372',id=21,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-xa5gd7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:38Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=5bdecf5d-9113-4584-ac23-44d59770eade,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.934 182717 DEBUG nova.network.os_vif_util [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converting VIF {"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.936 182717 DEBUG nova.network.os_vif_util [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.937 182717 DEBUG os_vif [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.938 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.938 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.939 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.945 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.946 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf9aa099-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.946 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf9aa099-aa, col_values=(('external_ids', {'iface-id': 'df9aa099-aa41-4111-b46c-c8a593762a53', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:4f:85', 'vm-uuid': '5bdecf5d-9113-4584-ac23-44d59770eade'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.948 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:48 compute-1 NetworkManager[54952]: <info>  [1769039268.9501] manager: (tapdf9aa099-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.952 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.959 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.960 182717 INFO os_vif [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa')
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.960 182717 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 21 23:47:48 compute-1 nova_compute[182713]: 2026-01-21 23:47:48.961 182717 DEBUG nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp34j3k16n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.021 182717 INFO nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Creating config drive at /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.config
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.031 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp87mayghp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.178 182717 DEBUG oslo_concurrency.processutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp87mayghp" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:49 compute-1 systemd-machined[153970]: New machine qemu-12-instance-00000017.
Jan 21 23:47:49 compute-1 systemd[1]: Started Virtual Machine qemu-12-instance-00000017.
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.598 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039269.5972512, 2977f489-9f9d-43f7-a617-7556b7df5171 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.599 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] VM Resumed (Lifecycle Event)
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.605 182717 DEBUG nova.compute.manager [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.605 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.611 182717 INFO nova.virt.libvirt.driver [-] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance spawned successfully.
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.612 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.635 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.642 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.643 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.643 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.644 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.645 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.646 182717 DEBUG nova.virt.libvirt.driver [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.652 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.693 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.694 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039269.6019626, 2977f489-9f9d-43f7-a617-7556b7df5171 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.695 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] VM Started (Lifecycle Event)
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.728 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.733 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.771 182717 INFO nova.compute.manager [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Took 1.70 seconds to spawn the instance on the hypervisor.
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.771 182717 DEBUG nova.compute.manager [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.847 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.919 182717 INFO nova.compute.manager [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Took 2.46 seconds to build instance.
Jan 21 23:47:49 compute-1 nova_compute[182713]: 2026-01-21 23:47:49.939 182717 DEBUG oslo_concurrency.lockutils [None req-730b2ef1-cc43-46d2-921d-429f9de50ffa 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "2977f489-9f9d-43f7-a617-7556b7df5171" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:50 compute-1 nova_compute[182713]: 2026-01-21 23:47:50.218 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:51 compute-1 nova_compute[182713]: 2026-01-21 23:47:51.166 182717 DEBUG nova.network.neutron [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Port df9aa099-aa41-4111-b46c-c8a593762a53 updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 21 23:47:51 compute-1 nova_compute[182713]: 2026-01-21 23:47:51.188 182717 DEBUG nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp34j3k16n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 21 23:47:51 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 21 23:47:51 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 21 23:47:51 compute-1 kernel: tapdf9aa099-aa: entered promiscuous mode
Jan 21 23:47:51 compute-1 systemd-udevd[214232]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:51 compute-1 nova_compute[182713]: 2026-01-21 23:47:51.582 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:51 compute-1 ovn_controller[94841]: 2026-01-21T23:47:51Z|00075|binding|INFO|Claiming lport df9aa099-aa41-4111-b46c-c8a593762a53 for this additional chassis.
Jan 21 23:47:51 compute-1 ovn_controller[94841]: 2026-01-21T23:47:51Z|00076|binding|INFO|df9aa099-aa41-4111-b46c-c8a593762a53: Claiming fa:16:3e:8f:4f:85 10.100.0.6
Jan 21 23:47:51 compute-1 NetworkManager[54952]: <info>  [1769039271.5850] manager: (tapdf9aa099-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Jan 21 23:47:51 compute-1 nova_compute[182713]: 2026-01-21 23:47:51.600 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:51 compute-1 ovn_controller[94841]: 2026-01-21T23:47:51Z|00077|binding|INFO|Setting lport df9aa099-aa41-4111-b46c-c8a593762a53 ovn-installed in OVS
Jan 21 23:47:51 compute-1 nova_compute[182713]: 2026-01-21 23:47:51.602 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:51 compute-1 nova_compute[182713]: 2026-01-21 23:47:51.606 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:51 compute-1 NetworkManager[54952]: <info>  [1769039271.6081] device (tapdf9aa099-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:47:51 compute-1 NetworkManager[54952]: <info>  [1769039271.6104] device (tapdf9aa099-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:47:51 compute-1 systemd-machined[153970]: New machine qemu-13-instance-00000015.
Jan 21 23:47:51 compute-1 systemd[1]: Started Virtual Machine qemu-13-instance-00000015.
Jan 21 23:47:52 compute-1 nova_compute[182713]: 2026-01-21 23:47:52.343 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039272.3425508, 5bdecf5d-9113-4584-ac23-44d59770eade => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:52 compute-1 nova_compute[182713]: 2026-01-21 23:47:52.343 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Started (Lifecycle Event)
Jan 21 23:47:52 compute-1 nova_compute[182713]: 2026-01-21 23:47:52.387 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:53 compute-1 nova_compute[182713]: 2026-01-21 23:47:53.150 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039273.1492913, 5bdecf5d-9113-4584-ac23-44d59770eade => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:53 compute-1 nova_compute[182713]: 2026-01-21 23:47:53.151 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Resumed (Lifecycle Event)
Jan 21 23:47:53 compute-1 nova_compute[182713]: 2026-01-21 23:47:53.186 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:53 compute-1 nova_compute[182713]: 2026-01-21 23:47:53.190 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:53 compute-1 nova_compute[182713]: 2026-01-21 23:47:53.214 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Jan 21 23:47:53 compute-1 nova_compute[182713]: 2026-01-21 23:47:53.950 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:54 compute-1 nova_compute[182713]: 2026-01-21 23:47:54.846 182717 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:54 compute-1 nova_compute[182713]: 2026-01-21 23:47:54.847 182717 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquired lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:54 compute-1 nova_compute[182713]: 2026-01-21 23:47:54.847 182717 DEBUG nova.network.neutron [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:47:54 compute-1 ovn_controller[94841]: 2026-01-21T23:47:54Z|00078|binding|INFO|Claiming lport df9aa099-aa41-4111-b46c-c8a593762a53 for this chassis.
Jan 21 23:47:54 compute-1 ovn_controller[94841]: 2026-01-21T23:47:54Z|00079|binding|INFO|df9aa099-aa41-4111-b46c-c8a593762a53: Claiming fa:16:3e:8f:4f:85 10.100.0.6
Jan 21 23:47:54 compute-1 ovn_controller[94841]: 2026-01-21T23:47:54Z|00080|binding|INFO|Setting lport df9aa099-aa41-4111-b46c-c8a593762a53 up in Southbound
Jan 21 23:47:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:54.905 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:4f:85 10.100.0.6'], port_security=['fa:16:3e:8f:4f:85 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '20', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=df9aa099-aa41-4111-b46c-c8a593762a53) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:54.907 104184 INFO neutron.agent.ovn.metadata.agent [-] Port df9aa099-aa41-4111-b46c-c8a593762a53 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee bound to our chassis
Jan 21 23:47:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:54.911 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:47:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:54.928 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b70111b7-25cf-413a-8817-928aa4e3942f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:54.930 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2df233d-b1 in ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:47:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:54.935 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2df233d-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:47:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:54.935 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[20f2c851-4231-4504-84df-b450145e9e7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:54.937 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4be1ac86-7608-4c72-b1f6-f352217f9402]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:54.956 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[f84aec61-e43b-43ca-92c2-dedab8810256]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:54.986 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4e16c848-4698-4229-8971-4849943754f4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.010 182717 DEBUG nova.network.neutron [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.044 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[1b65218a-e0d3-47aa-bc8f-1f47cb16d4de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 NetworkManager[54952]: <info>  [1769039275.0534] manager: (tapb2df233d-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.051 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d986725e-6e7a-4718-b6fb-e09efe20bece]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 podman[214300]: 2026-01-21 23:47:55.093198939 +0000 UTC m=+0.099166343 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.104 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c31fe6b1-7c77-4e33-b168-b18ea7983408]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 podman[214302]: 2026-01-21 23:47:55.108048225 +0000 UTC m=+0.106494868 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:47:55 compute-1 systemd-udevd[214344]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.109 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0c746c71-6b2d-4c6e-bd2e-272e905318c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 NetworkManager[54952]: <info>  [1769039275.1412] device (tapb2df233d-b0): carrier: link connected
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.145 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[1a40fd8d-4e64-4c46-bc72-3b6eea7d561d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.164 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e36927f4-a9ff-458d-8557-661e148cda31]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2df233d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:e6:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384779, 'reachable_time': 23610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214362, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.180 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[40c3af1a-1571-40c0-a3b5-04bf286226eb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:e636'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384779, 'tstamp': 384779}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214364, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.198 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7e92ff93-c93b-44bb-8e01-27fef5355a06]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2df233d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:e6:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384779, 'reachable_time': 23610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214365, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.222 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.227 182717 INFO nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Post operation of migration started
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.236 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8b6d8a8d-247c-488c-9867-70944cd61441]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.291 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea587d2-6da4-4f85-be01-cf72d35d8e1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.292 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2df233d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.292 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.293 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2df233d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.294 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:55 compute-1 NetworkManager[54952]: <info>  [1769039275.2956] manager: (tapb2df233d-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 21 23:47:55 compute-1 kernel: tapb2df233d-b0: entered promiscuous mode
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.297 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.298 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2df233d-b0, col_values=(('external_ids', {'iface-id': '75454af0-da31-4238-b248-a6678c575f51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.298 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:55 compute-1 ovn_controller[94841]: 2026-01-21T23:47:55Z|00081|binding|INFO|Releasing lport 75454af0-da31-4238-b248-a6678c575f51 from this chassis (sb_readonly=0)
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.299 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.303 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.304 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd1d431-f676-4434-8d67-16719f40e554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.305 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:47:55.305 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'env', 'PROCESS_TAG=haproxy-b2df233d-b255-4dda-925c-3ccab3a032ee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2df233d-b255-4dda-925c-3ccab3a032ee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.309 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.417 182717 DEBUG nova.network.neutron [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.434 182717 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Releasing lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.616 182717 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.616 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Creating file /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/5c554038b4fa4750985b5fc1975c26d7.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.617 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/5c554038b4fa4750985b5fc1975c26d7.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.736 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.736 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquired lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.737 182717 DEBUG nova.network.neutron [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:47:55 compute-1 podman[214399]: 2026-01-21 23:47:55.761545093 +0000 UTC m=+0.078918243 container create 316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 23:47:55 compute-1 podman[214399]: 2026-01-21 23:47:55.718778381 +0000 UTC m=+0.036151621 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:47:55 compute-1 systemd[1]: Started libpod-conmon-316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0.scope.
Jan 21 23:47:55 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:47:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f05451fdf54bc1deb33a3fd98d2d0b6532413481645a3aaa0f51d8d7ee428231/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:47:55 compute-1 podman[214399]: 2026-01-21 23:47:55.88548951 +0000 UTC m=+0.202862700 container init 316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.896 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/5c554038b4fa4750985b5fc1975c26d7.tmp" returned: 1 in 0.279s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.897 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/5c554038b4fa4750985b5fc1975c26d7.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 23:47:55 compute-1 podman[214399]: 2026-01-21 23:47:55.89763356 +0000 UTC m=+0.215006740 container start 316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.898 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Creating directory /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 21 23:47:55 compute-1 nova_compute[182713]: 2026-01-21 23:47:55.899 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:55 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214415]: [NOTICE]   (214419) : New worker (214422) forked
Jan 21 23:47:55 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214415]: [NOTICE]   (214419) : Loading success.
Jan 21 23:47:56 compute-1 nova_compute[182713]: 2026-01-21 23:47:56.153 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171" returned: 0 in 0.254s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:56 compute-1 nova_compute[182713]: 2026-01-21 23:47:56.161 182717 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:47:57 compute-1 nova_compute[182713]: 2026-01-21 23:47:57.714 182717 DEBUG nova.network.neutron [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating instance_info_cache with network_info: [{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:57 compute-1 nova_compute[182713]: 2026-01-21 23:47:57.768 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Releasing lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:57 compute-1 nova_compute[182713]: 2026-01-21 23:47:57.806 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:57 compute-1 nova_compute[182713]: 2026-01-21 23:47:57.807 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:57 compute-1 nova_compute[182713]: 2026-01-21 23:47:57.808 182717 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:57 compute-1 nova_compute[182713]: 2026-01-21 23:47:57.817 182717 INFO nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 21 23:47:57 compute-1 virtqemud[182235]: Domain id=13 name='instance-00000015' uuid=5bdecf5d-9113-4584-ac23-44d59770eade is tainted: custom-monitor
Jan 21 23:47:58 compute-1 nova_compute[182713]: 2026-01-21 23:47:58.829 182717 INFO nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 21 23:47:58 compute-1 nova_compute[182713]: 2026-01-21 23:47:58.953 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:59 compute-1 nova_compute[182713]: 2026-01-21 23:47:59.856 182717 INFO nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 21 23:47:59 compute-1 nova_compute[182713]: 2026-01-21 23:47:59.863 182717 DEBUG nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:59 compute-1 nova_compute[182713]: 2026-01-21 23:47:59.885 182717 DEBUG nova.objects.instance [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:48:00 compute-1 nova_compute[182713]: 2026-01-21 23:48:00.251 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:02.995 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:02.996 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:02.997 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:03 compute-1 nova_compute[182713]: 2026-01-21 23:48:03.978 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.233 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.257 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Triggering sync for uuid 044c71d9-3aaf-4e1c-af95-5c0636cf4000 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.258 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Triggering sync for uuid 2977f489-9f9d-43f7-a617-7556b7df5171 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.258 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Triggering sync for uuid 5bdecf5d-9113-4584-ac23-44d59770eade _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.258 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.258 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.259 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "2977f489-9f9d-43f7-a617-7556b7df5171" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.259 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "2977f489-9f9d-43f7-a617-7556b7df5171" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.259 182717 INFO nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] During sync_power_state the instance has a pending task (resize_migrating). Skip.
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.260 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "2977f489-9f9d-43f7-a617-7556b7df5171" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.260 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.260 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "5bdecf5d-9113-4584-ac23-44d59770eade" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.295 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "5bdecf5d-9113-4584-ac23-44d59770eade" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:04 compute-1 nova_compute[182713]: 2026-01-21 23:48:04.296 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:05 compute-1 nova_compute[182713]: 2026-01-21 23:48:05.298 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:06 compute-1 nova_compute[182713]: 2026-01-21 23:48:06.213 182717 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 23:48:06 compute-1 podman[214444]: 2026-01-21 23:48:06.598960859 +0000 UTC m=+0.083357206 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Jan 21 23:48:08 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000017.scope: Deactivated successfully.
Jan 21 23:48:08 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000017.scope: Consumed 12.649s CPU time.
Jan 21 23:48:08 compute-1 systemd-machined[153970]: Machine qemu-12-instance-00000017 terminated.
Jan 21 23:48:08 compute-1 nova_compute[182713]: 2026-01-21 23:48:08.982 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:09 compute-1 nova_compute[182713]: 2026-01-21 23:48:09.231 182717 INFO nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance shutdown successfully after 13 seconds.
Jan 21 23:48:09 compute-1 nova_compute[182713]: 2026-01-21 23:48:09.239 182717 INFO nova.virt.libvirt.driver [-] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance destroyed successfully.
Jan 21 23:48:09 compute-1 nova_compute[182713]: 2026-01-21 23:48:09.246 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:09 compute-1 nova_compute[182713]: 2026-01-21 23:48:09.347 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:09 compute-1 nova_compute[182713]: 2026-01-21 23:48:09.349 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:09 compute-1 nova_compute[182713]: 2026-01-21 23:48:09.428 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:09 compute-1 nova_compute[182713]: 2026-01-21 23:48:09.432 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Copying file /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171_resize/disk to 192.168.122.100:/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:48:09 compute-1 nova_compute[182713]: 2026-01-21 23:48:09.433 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171_resize/disk 192.168.122.100:/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.178 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "scp -r /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171_resize/disk 192.168.122.100:/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk" returned: 0 in 0.745s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.180 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Copying file /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.180 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171_resize/disk.config 192.168.122.100:/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.300 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.397 182717 DEBUG oslo_concurrency.lockutils [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.398 182717 DEBUG oslo_concurrency.lockutils [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.399 182717 DEBUG oslo_concurrency.lockutils [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.399 182717 DEBUG oslo_concurrency.lockutils [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.399 182717 DEBUG oslo_concurrency.lockutils [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.412 182717 INFO nova.compute.manager [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Terminating instance
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.424 182717 DEBUG nova.compute.manager [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.427 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "scp -C -r /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171_resize/disk.config 192.168.122.100:/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.config" returned: 0 in 0.246s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.427 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Copying file /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.427 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171_resize/disk.info 192.168.122.100:/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:10 compute-1 kernel: tap417548ae-45 (unregistering): left promiscuous mode
Jan 21 23:48:10 compute-1 NetworkManager[54952]: <info>  [1769039290.4779] device (tap417548ae-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:48:10 compute-1 ovn_controller[94841]: 2026-01-21T23:48:10Z|00082|binding|INFO|Releasing lport 417548ae-4551-4ae2-8160-bafd0974768d from this chassis (sb_readonly=0)
Jan 21 23:48:10 compute-1 ovn_controller[94841]: 2026-01-21T23:48:10Z|00083|binding|INFO|Setting lport 417548ae-4551-4ae2-8160-bafd0974768d down in Southbound
Jan 21 23:48:10 compute-1 ovn_controller[94841]: 2026-01-21T23:48:10Z|00084|binding|INFO|Removing iface tap417548ae-45 ovn-installed in OVS
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.491 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.507 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:f2:4d 10.100.0.12'], port_security=['fa:16:3e:5a:f2:4d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '044c71d9-3aaf-4e1c-af95-5c0636cf4000', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=417548ae-4551-4ae2-8160-bafd0974768d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.508 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 417548ae-4551-4ae2-8160-bafd0974768d in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 unbound from our chassis
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.511 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1530a22a-f758-407d-b1aa-fd922904fe07, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.512 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[28c179dc-731e-42b0-b7bc-91203d9dc827]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.513 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 namespace which is not needed anymore
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.518 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:10 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 21 23:48:10 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Consumed 14.520s CPU time.
Jan 21 23:48:10 compute-1 systemd-machined[153970]: Machine qemu-11-instance-00000016 terminated.
Jan 21 23:48:10 compute-1 podman[214486]: 2026-01-21 23:48:10.581042897 +0000 UTC m=+0.077569420 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.652 182717 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "scp -C -r /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171_resize/disk.info 192.168.122.100:/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.info" returned: 0 in 0.224s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:10 compute-1 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213920]: [NOTICE]   (213924) : haproxy version is 2.8.14-c23fe91
Jan 21 23:48:10 compute-1 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213920]: [NOTICE]   (213924) : path to executable is /usr/sbin/haproxy
Jan 21 23:48:10 compute-1 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213920]: [WARNING]  (213924) : Exiting Master process...
Jan 21 23:48:10 compute-1 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213920]: [ALERT]    (213924) : Current worker (213926) exited with code 143 (Terminated)
Jan 21 23:48:10 compute-1 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213920]: [WARNING]  (213924) : All workers exited. Exiting... (0)
Jan 21 23:48:10 compute-1 systemd[1]: libpod-20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4.scope: Deactivated successfully.
Jan 21 23:48:10 compute-1 podman[214528]: 2026-01-21 23:48:10.690685615 +0000 UTC m=+0.055843763 container died 20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.717 182717 INFO nova.virt.libvirt.driver [-] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Instance destroyed successfully.
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.718 182717 DEBUG nova.objects.instance [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'resources' on Instance uuid 044c71d9-3aaf-4e1c-af95-5c0636cf4000 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:10 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4-userdata-shm.mount: Deactivated successfully.
Jan 21 23:48:10 compute-1 systemd[1]: var-lib-containers-storage-overlay-c9e80373ead48ae8fa08519da60a9521471c6d96e5e16d7711709a60f715b2ac-merged.mount: Deactivated successfully.
Jan 21 23:48:10 compute-1 podman[214528]: 2026-01-21 23:48:10.735921867 +0000 UTC m=+0.101079995 container cleanup 20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.741 182717 DEBUG nova.virt.libvirt.vif [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:47:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-451134642',display_name='tempest-ServersAdminTestJSON-server-451134642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-451134642',id=22,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-3f908x09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:47:28Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=044c71d9-3aaf-4e1c-af95-5c0636cf4000,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.742 182717 DEBUG nova.network.os_vif_util [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "417548ae-4551-4ae2-8160-bafd0974768d", "address": "fa:16:3e:5a:f2:4d", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap417548ae-45", "ovs_interfaceid": "417548ae-4551-4ae2-8160-bafd0974768d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.743 182717 DEBUG nova.network.os_vif_util [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5a:f2:4d,bridge_name='br-int',has_traffic_filtering=True,id=417548ae-4551-4ae2-8160-bafd0974768d,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap417548ae-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.743 182717 DEBUG os_vif [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:f2:4d,bridge_name='br-int',has_traffic_filtering=True,id=417548ae-4551-4ae2-8160-bafd0974768d,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap417548ae-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.747 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:10 compute-1 systemd[1]: libpod-conmon-20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4.scope: Deactivated successfully.
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.747 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap417548ae-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.749 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.751 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.754 182717 INFO os_vif [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:f2:4d,bridge_name='br-int',has_traffic_filtering=True,id=417548ae-4551-4ae2-8160-bafd0974768d,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap417548ae-45')
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.755 182717 INFO nova.virt.libvirt.driver [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Deleting instance files /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000_del
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.755 182717 INFO nova.virt.libvirt.driver [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Deletion of /var/lib/nova/instances/044c71d9-3aaf-4e1c-af95-5c0636cf4000_del complete
Jan 21 23:48:10 compute-1 podman[214576]: 2026-01-21 23:48:10.830094018 +0000 UTC m=+0.054573161 container remove 20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.838 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ebe3b2-b851-45fa-ab71-e6f8c56338e1]: (4, ('Wed Jan 21 11:48:10 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 (20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4)\n20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4\nWed Jan 21 11:48:10 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 (20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4)\n20d1473d4d64c3a5685d2e3b64af39a2dab7cded56a64c7ede4fb35fc2c6c7f4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.839 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[78e66b91-b9ba-48af-8ce0-ee13047195c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.841 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.843 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:10 compute-1 kernel: tap1530a22a-f0: left promiscuous mode
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.858 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.861 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[95a89427-4605-4f87-a1f5-56fa9d0c4b2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.878 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[21ff46a7-7f9b-49a4-8ac9-733b5d55f71c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.880 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4660670c-c088-4a3d-a822-569d40f684bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.886 182717 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "2977f489-9f9d-43f7-a617-7556b7df5171-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.886 182717 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "2977f489-9f9d-43f7-a617-7556b7df5171-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.887 182717 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "2977f489-9f9d-43f7-a617-7556b7df5171-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.899 182717 INFO nova.compute.manager [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.900 182717 DEBUG oslo.service.loopingcall [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.900 182717 DEBUG nova.compute.manager [-] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:48:10 compute-1 nova_compute[182713]: 2026-01-21 23:48:10.901 182717 DEBUG nova.network.neutron [-] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.904 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[88b421e5-0670-4b2f-9780-eee53d20534a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381798, 'reachable_time': 29669, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214591, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.907 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:48:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:10.908 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[45d7cac0-7197-4af4-970f-b43b05b3320c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:10 compute-1 systemd[1]: run-netns-ovnmeta\x2d1530a22a\x2df758\x2d407d\x2db1aa\x2dfd922904fe07.mount: Deactivated successfully.
Jan 21 23:48:11 compute-1 nova_compute[182713]: 2026-01-21 23:48:11.033 182717 DEBUG nova.compute.manager [req-7282f1ff-363a-42b7-90be-be99b6e8e82e req-a0fba033-df53-4c04-b408-814e72c12521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Received event network-vif-unplugged-417548ae-4551-4ae2-8160-bafd0974768d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:11 compute-1 nova_compute[182713]: 2026-01-21 23:48:11.034 182717 DEBUG oslo_concurrency.lockutils [req-7282f1ff-363a-42b7-90be-be99b6e8e82e req-a0fba033-df53-4c04-b408-814e72c12521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:11 compute-1 nova_compute[182713]: 2026-01-21 23:48:11.035 182717 DEBUG oslo_concurrency.lockutils [req-7282f1ff-363a-42b7-90be-be99b6e8e82e req-a0fba033-df53-4c04-b408-814e72c12521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:11 compute-1 nova_compute[182713]: 2026-01-21 23:48:11.036 182717 DEBUG oslo_concurrency.lockutils [req-7282f1ff-363a-42b7-90be-be99b6e8e82e req-a0fba033-df53-4c04-b408-814e72c12521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:11 compute-1 nova_compute[182713]: 2026-01-21 23:48:11.036 182717 DEBUG nova.compute.manager [req-7282f1ff-363a-42b7-90be-be99b6e8e82e req-a0fba033-df53-4c04-b408-814e72c12521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] No waiting events found dispatching network-vif-unplugged-417548ae-4551-4ae2-8160-bafd0974768d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:11 compute-1 nova_compute[182713]: 2026-01-21 23:48:11.037 182717 DEBUG nova.compute.manager [req-7282f1ff-363a-42b7-90be-be99b6e8e82e req-a0fba033-df53-4c04-b408-814e72c12521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Received event network-vif-unplugged-417548ae-4551-4ae2-8160-bafd0974768d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:48:12 compute-1 nova_compute[182713]: 2026-01-21 23:48:12.934 182717 DEBUG nova.network.neutron [-] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:12 compute-1 nova_compute[182713]: 2026-01-21 23:48:12.971 182717 INFO nova.compute.manager [-] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Took 2.07 seconds to deallocate network for instance.
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.104 182717 DEBUG oslo_concurrency.lockutils [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.105 182717 DEBUG oslo_concurrency.lockutils [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.135 182717 DEBUG nova.compute.manager [req-ab16dc31-e783-4555-8046-d8da50fa2b5b req-7cd703c9-d326-467f-8156-631119340875 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Received event network-vif-plugged-417548ae-4551-4ae2-8160-bafd0974768d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.136 182717 DEBUG oslo_concurrency.lockutils [req-ab16dc31-e783-4555-8046-d8da50fa2b5b req-7cd703c9-d326-467f-8156-631119340875 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.136 182717 DEBUG oslo_concurrency.lockutils [req-ab16dc31-e783-4555-8046-d8da50fa2b5b req-7cd703c9-d326-467f-8156-631119340875 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.137 182717 DEBUG oslo_concurrency.lockutils [req-ab16dc31-e783-4555-8046-d8da50fa2b5b req-7cd703c9-d326-467f-8156-631119340875 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.137 182717 DEBUG nova.compute.manager [req-ab16dc31-e783-4555-8046-d8da50fa2b5b req-7cd703c9-d326-467f-8156-631119340875 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] No waiting events found dispatching network-vif-plugged-417548ae-4551-4ae2-8160-bafd0974768d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.137 182717 WARNING nova.compute.manager [req-ab16dc31-e783-4555-8046-d8da50fa2b5b req-7cd703c9-d326-467f-8156-631119340875 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Received unexpected event network-vif-plugged-417548ae-4551-4ae2-8160-bafd0974768d for instance with vm_state deleted and task_state None.
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.138 182717 DEBUG nova.compute.manager [req-ab16dc31-e783-4555-8046-d8da50fa2b5b req-7cd703c9-d326-467f-8156-631119340875 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Received event network-vif-deleted-417548ae-4551-4ae2-8160-bafd0974768d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.201 182717 DEBUG nova.compute.provider_tree [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.219 182717 DEBUG nova.scheduler.client.report [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.255 182717 DEBUG oslo_concurrency.lockutils [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.312 182717 INFO nova.scheduler.client.report [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Deleted allocations for instance 044c71d9-3aaf-4e1c-af95-5c0636cf4000
Jan 21 23:48:13 compute-1 nova_compute[182713]: 2026-01-21 23:48:13.421 182717 DEBUG oslo_concurrency.lockutils [None req-aaabcaf5-6448-48b0-b776-23ae8e0b8de5 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "044c71d9-3aaf-4e1c-af95-5c0636cf4000" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.407 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.408 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.430 182717 DEBUG nova.compute.manager [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.555 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.555 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.562 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.562 182717 INFO nova.compute.claims [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.750 182717 DEBUG nova.compute.provider_tree [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.791 182717 DEBUG nova.scheduler.client.report [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.829 182717 DEBUG oslo_concurrency.lockutils [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "2977f489-9f9d-43f7-a617-7556b7df5171" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.829 182717 DEBUG oslo_concurrency.lockutils [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "2977f489-9f9d-43f7-a617-7556b7df5171" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.830 182717 DEBUG nova.compute.manager [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Going to confirm migration 6 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.833 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.834 182717 DEBUG nova.compute.manager [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.905 182717 DEBUG nova.objects.instance [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'info_cache' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.976 182717 DEBUG nova.compute.manager [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:48:14 compute-1 nova_compute[182713]: 2026-01-21 23:48:14.977 182717 DEBUG nova.network.neutron [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.001 182717 INFO nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.029 182717 DEBUG nova.compute.manager [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.170 182717 DEBUG nova.compute.manager [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.172 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.172 182717 INFO nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Creating image(s)
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.174 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "/var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.174 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "/var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.175 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "/var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.202 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.285 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.287 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.288 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.298 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.324 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.362 182717 DEBUG oslo_concurrency.lockutils [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.363 182717 DEBUG oslo_concurrency.lockutils [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.363 182717 DEBUG nova.network.neutron [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.388 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.389 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.427 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.429 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.429 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.494 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.497 182717 DEBUG nova.virt.disk.api [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Checking if we can resize image /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.498 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.593 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.594 182717 DEBUG nova.virt.disk.api [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Cannot resize image /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.595 182717 DEBUG nova.objects.instance [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lazy-loading 'migration_context' on Instance uuid 2038fd11-9c07-48d0-8092-d973d69d8eb9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.619 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.620 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Ensure instance console log exists: /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.620 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.620 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.621 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.694 182717 DEBUG nova.network.neutron [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:48:15 compute-1 nova_compute[182713]: 2026-01-21 23:48:15.750 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:16 compute-1 nova_compute[182713]: 2026-01-21 23:48:16.177 182717 DEBUG nova.policy [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:48:16 compute-1 nova_compute[182713]: 2026-01-21 23:48:16.332 182717 DEBUG nova.network.neutron [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:16 compute-1 nova_compute[182713]: 2026-01-21 23:48:16.366 182717 DEBUG oslo_concurrency.lockutils [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:48:16 compute-1 nova_compute[182713]: 2026-01-21 23:48:16.367 182717 DEBUG nova.objects.instance [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:16 compute-1 nova_compute[182713]: 2026-01-21 23:48:16.401 182717 DEBUG oslo_concurrency.lockutils [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:16 compute-1 nova_compute[182713]: 2026-01-21 23:48:16.402 182717 DEBUG oslo_concurrency.lockutils [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:16 compute-1 nova_compute[182713]: 2026-01-21 23:48:16.550 182717 DEBUG nova.compute.provider_tree [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:16 compute-1 nova_compute[182713]: 2026-01-21 23:48:16.571 182717 DEBUG nova.scheduler.client.report [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:16 compute-1 nova_compute[182713]: 2026-01-21 23:48:16.649 182717 DEBUG oslo_concurrency.lockutils [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:16 compute-1 nova_compute[182713]: 2026-01-21 23:48:16.842 182717 INFO nova.scheduler.client.report [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Deleted allocation for migration 78e371bb-27b6-4b83-90af-e79567818d7b
Jan 21 23:48:16 compute-1 nova_compute[182713]: 2026-01-21 23:48:16.941 182717 DEBUG oslo_concurrency.lockutils [None req-2d6470a6-8e32-4148-b043-6f328c0c0650 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "2977f489-9f9d-43f7-a617-7556b7df5171" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:17 compute-1 podman[214609]: 2026-01-21 23:48:17.570083768 +0000 UTC m=+0.059853672 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:48:17 compute-1 podman[214608]: 2026-01-21 23:48:17.610418371 +0000 UTC m=+0.101073794 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 21 23:48:17 compute-1 nova_compute[182713]: 2026-01-21 23:48:17.643 182717 DEBUG nova.network.neutron [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Successfully updated port: bbc46799-0727-41d9-9ae1-017037df9492 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:48:17 compute-1 nova_compute[182713]: 2026-01-21 23:48:17.663 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:48:17 compute-1 nova_compute[182713]: 2026-01-21 23:48:17.664 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquired lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:48:17 compute-1 nova_compute[182713]: 2026-01-21 23:48:17.664 182717 DEBUG nova.network.neutron [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:48:17 compute-1 nova_compute[182713]: 2026-01-21 23:48:17.882 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:17 compute-1 nova_compute[182713]: 2026-01-21 23:48:17.889 182717 DEBUG nova.network.neutron [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.027 182717 DEBUG nova.compute.manager [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-changed-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.028 182717 DEBUG nova.compute.manager [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Refreshing instance network info cache due to event network-changed-bbc46799-0727-41d9-9ae1-017037df9492. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.028 182717 DEBUG oslo_concurrency.lockutils [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.877 182717 DEBUG nova.network.neutron [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Updating instance_info_cache with network_info: [{"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.916 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Releasing lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.917 182717 DEBUG nova.compute.manager [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Instance network_info: |[{"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.917 182717 DEBUG oslo_concurrency.lockutils [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.918 182717 DEBUG nova.network.neutron [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Refreshing network info cache for port bbc46799-0727-41d9-9ae1-017037df9492 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.921 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Start _get_guest_xml network_info=[{"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.926 182717 WARNING nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.933 182717 DEBUG nova.virt.libvirt.host [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.933 182717 DEBUG nova.virt.libvirt.host [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.937 182717 DEBUG nova.virt.libvirt.host [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.938 182717 DEBUG nova.virt.libvirt.host [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.939 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.939 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.940 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.940 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.940 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.941 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.941 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.941 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.942 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.942 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.942 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.942 182717 DEBUG nova.virt.hardware [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.947 182717 DEBUG nova.virt.libvirt.vif [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:48:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1127248027',display_name='tempest-LiveMigrationTest-server-1127248027',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1127248027',id=25,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-6nt020yh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:48:15Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=2038fd11-9c07-48d0-8092-d973d69d8eb9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.947 182717 DEBUG nova.network.os_vif_util [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converting VIF {"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.948 182717 DEBUG nova.network.os_vif_util [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.949 182717 DEBUG nova.objects.instance [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2038fd11-9c07-48d0-8092-d973d69d8eb9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.965 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:48:18 compute-1 nova_compute[182713]:   <uuid>2038fd11-9c07-48d0-8092-d973d69d8eb9</uuid>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   <name>instance-00000019</name>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <nova:name>tempest-LiveMigrationTest-server-1127248027</nova:name>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:48:18</nova:creationTime>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:48:18 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:48:18 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:48:18 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:48:18 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:48:18 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:48:18 compute-1 nova_compute[182713]:         <nova:user uuid="d4ff24d8abf8416db9d64c645436c5f1">tempest-LiveMigrationTest-430976321-project-member</nova:user>
Jan 21 23:48:18 compute-1 nova_compute[182713]:         <nova:project uuid="cdcb2f57183e484cace5d5f78dd635a1">tempest-LiveMigrationTest-430976321</nova:project>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:48:18 compute-1 nova_compute[182713]:         <nova:port uuid="bbc46799-0727-41d9-9ae1-017037df9492">
Jan 21 23:48:18 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <system>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <entry name="serial">2038fd11-9c07-48d0-8092-d973d69d8eb9</entry>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <entry name="uuid">2038fd11-9c07-48d0-8092-d973d69d8eb9</entry>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     </system>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   <os>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   </os>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   <features>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   </features>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.config"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:14:7f:86"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <target dev="tapbbc46799-07"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/console.log" append="off"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <video>
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     </video>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:48:18 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:48:18 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:48:18 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:48:18 compute-1 nova_compute[182713]: </domain>
Jan 21 23:48:18 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.966 182717 DEBUG nova.compute.manager [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Preparing to wait for external event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.966 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.967 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.967 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.968 182717 DEBUG nova.virt.libvirt.vif [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:48:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1127248027',display_name='tempest-LiveMigrationTest-server-1127248027',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1127248027',id=25,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-6nt020yh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:48:15Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=2038fd11-9c07-48d0-8092-d973d69d8eb9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.968 182717 DEBUG nova.network.os_vif_util [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converting VIF {"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.969 182717 DEBUG nova.network.os_vif_util [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.972 182717 DEBUG os_vif [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.973 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.973 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.974 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.978 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.978 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbbc46799-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.979 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbbc46799-07, col_values=(('external_ids', {'iface-id': 'bbc46799-0727-41d9-9ae1-017037df9492', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:14:7f:86', 'vm-uuid': '2038fd11-9c07-48d0-8092-d973d69d8eb9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.981 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:18 compute-1 NetworkManager[54952]: <info>  [1769039298.9825] manager: (tapbbc46799-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.984 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.988 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:18 compute-1 nova_compute[182713]: 2026-01-21 23:48:18.989 182717 INFO os_vif [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07')
Jan 21 23:48:19 compute-1 nova_compute[182713]: 2026-01-21 23:48:19.060 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:19 compute-1 nova_compute[182713]: 2026-01-21 23:48:19.060 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:19 compute-1 nova_compute[182713]: 2026-01-21 23:48:19.061 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] No VIF found with MAC fa:16:3e:14:7f:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:48:19 compute-1 nova_compute[182713]: 2026-01-21 23:48:19.061 182717 INFO nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Using config drive
Jan 21 23:48:19 compute-1 nova_compute[182713]: 2026-01-21 23:48:19.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:19 compute-1 nova_compute[182713]: 2026-01-21 23:48:19.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.068 182717 INFO nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Creating config drive at /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.config
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.077 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqaita43e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.217 182717 DEBUG oslo_concurrency.processutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqaita43e" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:20 compute-1 kernel: tapbbc46799-07: entered promiscuous mode
Jan 21 23:48:20 compute-1 NetworkManager[54952]: <info>  [1769039300.3032] manager: (tapbbc46799-07): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Jan 21 23:48:20 compute-1 ovn_controller[94841]: 2026-01-21T23:48:20Z|00085|binding|INFO|Claiming lport bbc46799-0727-41d9-9ae1-017037df9492 for this chassis.
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.349 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-1 ovn_controller[94841]: 2026-01-21T23:48:20Z|00086|binding|INFO|bbc46799-0727-41d9-9ae1-017037df9492: Claiming fa:16:3e:14:7f:86 10.100.0.13
Jan 21 23:48:20 compute-1 ovn_controller[94841]: 2026-01-21T23:48:20Z|00087|binding|INFO|Claiming lport 56571b22-2d90-46ed-b4c3-681729d375d9 for this chassis.
Jan 21 23:48:20 compute-1 ovn_controller[94841]: 2026-01-21T23:48:20Z|00088|binding|INFO|56571b22-2d90-46ed-b4c3-681729d375d9: Claiming fa:16:3e:cf:be:25 19.80.0.150
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.354 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.366 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:7f:86 10.100.0.13'], port_security=['fa:16:3e:14:7f:86 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1972857521', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1972857521', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=bbc46799-0727-41d9-9ae1-017037df9492) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.368 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:be:25 19.80.0.150'], port_security=['fa:16:3e:cf:be:25 19.80.0.150'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['bbc46799-0727-41d9-9ae1-017037df9492'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1175960055', 'neutron:cidrs': '19.80.0.150/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b0b760c-cbd0-4413-9603-713296c75717', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1175960055', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=05a945c7-2b1a-4093-88a2-493079ba8709, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=56571b22-2d90-46ed-b4c3-681729d375d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.369 104184 INFO neutron.agent.ovn.metadata.agent [-] Port bbc46799-0727-41d9-9ae1-017037df9492 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee bound to our chassis
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.372 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:48:20 compute-1 systemd-machined[153970]: New machine qemu-14-instance-00000019.
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.397 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a58eb261-083c-4f6c-bf82-39d915026f7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 systemd[1]: Started Virtual Machine qemu-14-instance-00000019.
Jan 21 23:48:20 compute-1 ovn_controller[94841]: 2026-01-21T23:48:20Z|00089|binding|INFO|Setting lport bbc46799-0727-41d9-9ae1-017037df9492 ovn-installed in OVS
Jan 21 23:48:20 compute-1 ovn_controller[94841]: 2026-01-21T23:48:20Z|00090|binding|INFO|Setting lport bbc46799-0727-41d9-9ae1-017037df9492 up in Southbound
Jan 21 23:48:20 compute-1 ovn_controller[94841]: 2026-01-21T23:48:20Z|00091|binding|INFO|Setting lport 56571b22-2d90-46ed-b4c3-681729d375d9 up in Southbound
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.417 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-1 systemd-udevd[214682]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:48:20 compute-1 NetworkManager[54952]: <info>  [1769039300.4399] device (tapbbc46799-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:48:20 compute-1 NetworkManager[54952]: <info>  [1769039300.4411] device (tapbbc46799-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.448 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f132dbde-119d-4881-b2f9-9f4f9455bfd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.455 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f587f1af-8b7d-44fa-ae96-325149f2d408]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.495 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[20798841-6770-414f-bfd4-447dbc3b7416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.518 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3eab20f1-f069-449f-9f24-5118bbd7c806]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2df233d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:e6:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 5, 'rx_bytes': 1162, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 5, 'rx_bytes': 1162, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384779, 'reachable_time': 23610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214692, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.536 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a48be6-a125-41ba-8974-fca0050d057e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb2df233d-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384791, 'tstamp': 384791}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214695, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb2df233d-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384793, 'tstamp': 384793}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214695, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.538 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2df233d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.540 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.543 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2df233d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.541 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.544 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.544 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2df233d-b0, col_values=(('external_ids', {'iface-id': '75454af0-da31-4238-b248-a6678c575f51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.545 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.547 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 56571b22-2d90-46ed-b4c3-681729d375d9 in datapath 1b0b760c-cbd0-4413-9603-713296c75717 unbound from our chassis
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.551 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1b0b760c-cbd0-4413-9603-713296c75717
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.568 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1867ceb6-4e42-45f9-970d-0ddd713dde13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.570 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1b0b760c-c1 in ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.573 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1b0b760c-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.573 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ac749910-b2bc-473a-bb35-d102da5782f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.575 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4cac2538-3945-46dc-b289-e2b50fb67995]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.587 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[99dc31e2-bcc2-4213-8d29-d7cce4d74b46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.604 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1d437f31-650e-49ba-ab77-eed7df804426]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.647 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[73792b60-54cf-4bbe-82d5-223774cf96f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.657 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[35d75857-95e5-4d03-b8af-e4c28c1ca5f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 NetworkManager[54952]: <info>  [1769039300.6586] manager: (tap1b0b760c-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.702 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b25a2a01-8bbf-4808-aec9-6a0fc695c51e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.706 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[5253bca1-0720-4237-8761-25fbb2cf9255]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 NetworkManager[54952]: <info>  [1769039300.7351] device (tap1b0b760c-c0): carrier: link connected
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.741 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[964185d8-ce02-4f95-919e-17e5c29a681c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.778 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e473bbdf-0008-41de-a23f-76852be909bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b0b760c-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:4e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387338, 'reachable_time': 42714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214720, 'error': None, 'target': 'ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.797 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[595e97ac-119e-4cc5-9f3c-536bf48cecff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6d:4e23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387338, 'tstamp': 387338}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214721, 'error': None, 'target': 'ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.825 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d3927aa3-7698-49eb-a0ce-1ca8d97948c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b0b760c-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:4e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387338, 'reachable_time': 42714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214722, 'error': None, 'target': 'ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.873 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4484dab7-15eb-46b8-83b9-be4ed6c7207f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.924 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.924 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.925 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.925 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.968 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7f643177-cfca-495f-a950-f772ebc6eacb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.970 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b0b760c-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.970 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.970 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b0b760c-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:20 compute-1 kernel: tap1b0b760c-c0: entered promiscuous mode
Jan 21 23:48:20 compute-1 NetworkManager[54952]: <info>  [1769039300.9739] manager: (tap1b0b760c-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.973 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.975 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.982 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1b0b760c-c0, col_values=(('external_ids', {'iface-id': '81fbdf60-46d3-442f-bc69-6a381397338b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.983 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-1 ovn_controller[94841]: 2026-01-21T23:48:20Z|00092|binding|INFO|Releasing lport 81fbdf60-46d3-442f-bc69-6a381397338b from this chassis (sb_readonly=0)
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.984 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.994 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1b0b760c-cbd0-4413-9603-713296c75717.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1b0b760c-cbd0-4413-9603-713296c75717.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.995 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0a9051dc-67d3-4d78-86b3-5010f945071e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.996 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-1b0b760c-cbd0-4413-9603-713296c75717
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/1b0b760c-cbd0-4413-9603-713296c75717.pid.haproxy
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:48:20 compute-1 nova_compute[182713]: 2026-01-21 23:48:20.996 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 1b0b760c-cbd0-4413-9603-713296c75717
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:48:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:20.997 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717', 'env', 'PROCESS_TAG=haproxy-1b0b760c-cbd0-4413-9603-713296c75717', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1b0b760c-cbd0-4413-9603-713296c75717.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.031 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.132 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.133 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.172 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039301.1718216, 2038fd11-9c07-48d0-8092-d973d69d8eb9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.173 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] VM Started (Lifecycle Event)
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.198 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.204 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.222 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.226 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039301.1719708, 2038fd11-9c07-48d0-8092-d973d69d8eb9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.226 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] VM Paused (Lifecycle Event)
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.246 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.249 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.261 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.262 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.283 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.318 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.386 182717 DEBUG nova.compute.manager [req-844f20b2-b815-42dc-b461-ce84ab5f0dc2 req-36f803c2-4e44-447f-858f-a6d03326a5fe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.387 182717 DEBUG oslo_concurrency.lockutils [req-844f20b2-b815-42dc-b461-ce84ab5f0dc2 req-36f803c2-4e44-447f-858f-a6d03326a5fe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.387 182717 DEBUG oslo_concurrency.lockutils [req-844f20b2-b815-42dc-b461-ce84ab5f0dc2 req-36f803c2-4e44-447f-858f-a6d03326a5fe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.387 182717 DEBUG oslo_concurrency.lockutils [req-844f20b2-b815-42dc-b461-ce84ab5f0dc2 req-36f803c2-4e44-447f-858f-a6d03326a5fe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.388 182717 DEBUG nova.compute.manager [req-844f20b2-b815-42dc-b461-ce84ab5f0dc2 req-36f803c2-4e44-447f-858f-a6d03326a5fe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Processing event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.389 182717 DEBUG nova.compute.manager [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.394 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039301.3936136, 2038fd11-9c07-48d0-8092-d973d69d8eb9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.394 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] VM Resumed (Lifecycle Event)
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.397 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.402 182717 INFO nova.virt.libvirt.driver [-] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Instance spawned successfully.
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.403 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.426 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.433 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.440 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.441 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.441 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.442 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.443 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.443 182717 DEBUG nova.virt.libvirt.driver [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.457 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:48:21 compute-1 podman[214774]: 2026-01-21 23:48:21.460202785 +0000 UTC m=+0.065111291 container create 49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.484 182717 DEBUG nova.network.neutron [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Updated VIF entry in instance network info cache for port bbc46799-0727-41d9-9ae1-017037df9492. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.485 182717 DEBUG nova.network.neutron [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Updating instance_info_cache with network_info: [{"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:21 compute-1 podman[214774]: 2026-01-21 23:48:21.426021728 +0000 UTC m=+0.030930214 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:48:21 compute-1 systemd[1]: Started libpod-conmon-49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43.scope.
Jan 21 23:48:21 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:48:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2556d407cb583d45463be68c1827de5f67e221f7b99d16afca1796f85253375/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:48:21 compute-1 podman[214774]: 2026-01-21 23:48:21.586148076 +0000 UTC m=+0.191057022 container init 49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 23:48:21 compute-1 podman[214774]: 2026-01-21 23:48:21.598806261 +0000 UTC m=+0.203714747 container start 49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 21 23:48:21 compute-1 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[214789]: [NOTICE]   (214793) : New worker (214795) forked
Jan 21 23:48:21 compute-1 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[214789]: [NOTICE]   (214793) : Loading success.
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.656 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.658 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5479MB free_disk=73.35110855102539GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.658 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.658 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.768 182717 DEBUG oslo_concurrency.lockutils [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.815 182717 INFO nova.compute.manager [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Took 6.64 seconds to spawn the instance on the hypervisor.
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.815 182717 DEBUG nova.compute.manager [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.834 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 5bdecf5d-9113-4584-ac23-44d59770eade actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.835 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 2038fd11-9c07-48d0-8092-d973d69d8eb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.835 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.835 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.914 182717 INFO nova.compute.manager [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Took 7.41 seconds to build instance.
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.918 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.945 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.950 182717 DEBUG oslo_concurrency.lockutils [None req-a87f5acb-8fa8-487f-8c8d-397faf8c9a7f d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.969 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:48:21 compute-1 nova_compute[182713]: 2026-01-21 23:48:21.969 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:22.870 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}bc9b7ecb2af6ac629fc2448a7d596d0bfe459cc5ba04b0213d7aa35f343c0a3b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 21 23:48:22 compute-1 nova_compute[182713]: 2026-01-21 23:48:22.965 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:22 compute-1 nova_compute[182713]: 2026-01-21 23:48:22.966 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:22 compute-1 nova_compute[182713]: 2026-01-21 23:48:22.966 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:48:22 compute-1 nova_compute[182713]: 2026-01-21 23:48:22.966 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.252 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.252 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.252 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.253 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2038fd11-9c07-48d0-8092-d973d69d8eb9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.395 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Wed, 21 Jan 2026 23:48:22 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-18569ff8-0c22-41ca-b372-dc01a37145ad x-openstack-request-id: req-18569ff8-0c22-41ca-b372-dc01a37145ad _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.395 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "c3389c03-89c4-4ff5-9e03-1a99d41713d4", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}]}, {"id": "ff01ccba-ad51-439f-9037-926190d6dc0f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.395 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-18569ff8-0c22-41ca-b372-dc01a37145ad request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.397 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}bc9b7ecb2af6ac629fc2448a7d596d0bfe459cc5ba04b0213d7aa35f343c0a3b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.484 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Wed, 21 Jan 2026 23:48:23 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-63578f78-5643-4ede-8a4c-f33dea53a2f9 x-openstack-request-id: req-63578f78-5643-4ede-8a4c-f33dea53a2f9 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.485 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "c3389c03-89c4-4ff5-9e03-1a99d41713d4", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.485 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4 used request id req-63578f78-5643-4ede-8a4c-f33dea53a2f9 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.486 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'name': 'tempest-LiveMigrationTest-server-1127248027', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000019', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'hostId': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.489 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'name': 'tempest-LiveMigrationTest-server-821021372', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000015', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'hostId': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.490 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.494 182717 DEBUG nova.compute.manager [req-9bec3392-d332-4939-812e-fdba666739c4 req-12ad34cb-a64c-434b-97a1-68fa79e84935 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.494 182717 DEBUG oslo_concurrency.lockutils [req-9bec3392-d332-4939-812e-fdba666739c4 req-12ad34cb-a64c-434b-97a1-68fa79e84935 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.495 182717 DEBUG oslo_concurrency.lockutils [req-9bec3392-d332-4939-812e-fdba666739c4 req-12ad34cb-a64c-434b-97a1-68fa79e84935 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.495 182717 DEBUG oslo_concurrency.lockutils [req-9bec3392-d332-4939-812e-fdba666739c4 req-12ad34cb-a64c-434b-97a1-68fa79e84935 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.495 182717 DEBUG nova.compute.manager [req-9bec3392-d332-4939-812e-fdba666739c4 req-12ad34cb-a64c-434b-97a1-68fa79e84935 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] No waiting events found dispatching network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.496 182717 WARNING nova.compute.manager [req-9bec3392-d332-4939-812e-fdba666739c4 req-12ad34cb-a64c-434b-97a1-68fa79e84935 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received unexpected event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 for instance with vm_state active and task_state None.
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.521 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.522 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.561 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.write.requests volume: 19 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.561 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '338302cd-8430-46b4-8f63-7a3fbd71431d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-vda', 'timestamp': '2026-01-21T23:48:23.490644', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac4e65f6-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': '532cadbed40a16f29147128fb9a193756979ac7fabd9dc97ffeba4258d9f408b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 
'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-sda', 'timestamp': '2026-01-21T23:48:23.490644', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac4e95b2-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': '4899ac685c6d080d4b5686e57991dd6f92de8cff4fbc8f3c8401d037f3a269ce'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 19, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-vda', 'timestamp': '2026-01-21T23:48:23.490644', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac546c30-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': '1744ec7e8f0580cf207f9b8d78b342f2e11fefb3880f0ade4525a3dc620c1387'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-sda', 'timestamp': '2026-01-21T23:48:23.490644', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac547bee-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': '5cd68da4e1da805c69efbfd9a6f0f413c56fc8e4fb77a3dfdd8683a284f6e300'}]}, 'timestamp': '2026-01-21 23:48:23.562298', '_unique_id': '3ca339c2444945ca9bd68fbc82724914'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.565 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.566 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.574 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2038fd11-9c07-48d0-8092-d973d69d8eb9 / tapbbc46799-07 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.574 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.577 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5bdecf5d-9113-4584-ac23-44d59770eade / tapdf9aa099-aa inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.578 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29327251-f4a7-4415-8989-6bedc50ec3ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000019-2038fd11-9c07-48d0-8092-d973d69d8eb9-tapbbc46799-07', 'timestamp': '2026-01-21T23:48:23.566685', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'tapbbc46799-07', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:7f:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbbc46799-07'}, 'message_id': 'ac5672d2-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.273916503, 'message_signature': 'f506a22e5f46d7e1a5ce3f52776c7c134abc301bd8c548573fda7443cc0db68a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 
'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000015-5bdecf5d-9113-4584-ac23-44d59770eade-tapdf9aa099-aa', 'timestamp': '2026-01-21T23:48:23.566685', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'tapdf9aa099-aa', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:4f:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf9aa099-aa'}, 'message_id': 'ac56f8e2-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.28225046, 'message_signature': 'ba9ec89905c1b55d2148ce9dfb32c0771a4dcc30469379b7aa210c8f28823e9a'}]}, 'timestamp': '2026-01-21 23:48:23.578447', '_unique_id': '449bef550ea64274b40b4718f46d18ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.579 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.580 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.581 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.581 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f4740e1-8346-4e01-b655-968ccdd4f184', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000019-2038fd11-9c07-48d0-8092-d973d69d8eb9-tapbbc46799-07', 'timestamp': '2026-01-21T23:48:23.580985', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'tapbbc46799-07', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:7f:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbbc46799-07'}, 'message_id': 'ac576b4c-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.273916503, 'message_signature': '46d2fa2f63499bd11ad4185fa57f2ab43746ca3ac05d80ff1d72b81837db33f7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000015-5bdecf5d-9113-4584-ac23-44d59770eade-tapdf9aa099-aa', 'timestamp': '2026-01-21T23:48:23.580985', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'tapdf9aa099-aa', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:4f:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf9aa099-aa'}, 'message_id': 'ac577790-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.28225046, 'message_signature': 'd89343e3d267f51cea7b672dda2bded73f3c7eeddd086822620841dd9b8e7574'}]}, 'timestamp': '2026-01-21 23:48:23.581628', '_unique_id': '66cb2821ade64b23bfc12ad9e7dc0a47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.582 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.583 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.583 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.583 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21ce9477-376f-49bd-b093-68767bf1aaf5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000019-2038fd11-9c07-48d0-8092-d973d69d8eb9-tapbbc46799-07', 'timestamp': '2026-01-21T23:48:23.583442', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'tapbbc46799-07', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:7f:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbbc46799-07'}, 'message_id': 'ac57caf6-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.273916503, 'message_signature': '698be5de6a7d60d472d40a8002f5b6151181ef03a411214fc3bc73084c4257a3'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000015-5bdecf5d-9113-4584-ac23-44d59770eade-tapdf9aa099-aa', 'timestamp': '2026-01-21T23:48:23.583442', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'tapdf9aa099-aa', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:4f:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf9aa099-aa'}, 'message_id': 'ac57dadc-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.28225046, 'message_signature': '0716672a08a80263bc30b81a00fe55f26bc9a3c1eb7043c8eeed8b005866826a'}]}, 'timestamp': '2026-01-21 23:48:23.584180', '_unique_id': 'd8ee3eee8fb34e708c131060d5ca7487'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.584 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.585 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.585 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.586 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1127248027>, <NovaLikeServer: tempest-LiveMigrationTest-server-821021372>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1127248027>, <NovaLikeServer: tempest-LiveMigrationTest-server-821021372>]
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.586 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.586 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.586 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/network.outgoing.bytes volume: 5560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97105ed3-4870-4614-b190-169956a7c967', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000019-2038fd11-9c07-48d0-8092-d973d69d8eb9-tapbbc46799-07', 'timestamp': '2026-01-21T23:48:23.586527', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'tapbbc46799-07', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:7f:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbbc46799-07'}, 'message_id': 'ac584382-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.273916503, 'message_signature': '0dd19b6905d8d5f88ffb5e30cd047018b1216ec0cb9f09fb8d458fc7fc77bc12'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5560, 'user_id': 
'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000015-5bdecf5d-9113-4584-ac23-44d59770eade-tapdf9aa099-aa', 'timestamp': '2026-01-21T23:48:23.586527', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'tapdf9aa099-aa', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:4f:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf9aa099-aa'}, 'message_id': 'ac58532c-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.28225046, 'message_signature': 'f45b37b71d72b0ef35b64618f1466886973732a17b6afde2de4f3f7a2a2ff898'}]}, 'timestamp': '2026-01-21 23:48:23.587252', '_unique_id': '8de83a0c748e40e1acc22a15b485ae89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.588 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.589 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.589 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.589 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79d4c7d7-2f31-42be-a70b-e9230af4ca3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000019-2038fd11-9c07-48d0-8092-d973d69d8eb9-tapbbc46799-07', 'timestamp': '2026-01-21T23:48:23.589310', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'tapbbc46799-07', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:7f:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbbc46799-07'}, 'message_id': 'ac58b042-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.273916503, 'message_signature': '5bdd46606fcb5f7b9767fc0f9fb194a1a62935d448519ed07f9be3d1594e303e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000015-5bdecf5d-9113-4584-ac23-44d59770eade-tapdf9aa099-aa', 'timestamp': '2026-01-21T23:48:23.589310', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'tapdf9aa099-aa', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:4f:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf9aa099-aa'}, 'message_id': 'ac58bc22-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.28225046, 'message_signature': 'f597c8686a01d7699fee510574559e194956807ce335d9e11cdbe8ff0446e55d'}]}, 'timestamp': '2026-01-21 23:48:23.590096', '_unique_id': '3405fe89803640dbb441a2f51411aaac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.591 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.592 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.592 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.592 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.593 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.593 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2fea31a-5829-40b6-b891-3100fcec6c87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-vda', 'timestamp': '2026-01-21T23:48:23.592524', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac592e1e-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': '24daf2014328b14cf32c2b789a19c5e21345d957fd0635dd90936122b7fae12f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 
'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-sda', 'timestamp': '2026-01-21T23:48:23.592524', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac593b5c-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': '5e6f026fcd8b3f93ae3568ba102dd57e56f94d27c22d7c299820cebc93cabf9f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-vda', 'timestamp': '2026-01-21T23:48:23.592524', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac59499e-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': 'b2e5c815d4f3380411d9573e6d126305649612d9db73f2a275b041b35b95ec7e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-sda', 'timestamp': '2026-01-21T23:48:23.592524', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac5955f6-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': '5f79ee2dcce2b06b4b97c5706136700751886b8a04b55d0be46d5327f3c5efe4'}]}, 'timestamp': '2026-01-21 23:48:23.593950', '_unique_id': '9029390bcdf84a899fdef75caef1757c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.594 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.596 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.596 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.596 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1127248027>, <NovaLikeServer: tempest-LiveMigrationTest-server-821021372>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1127248027>, <NovaLikeServer: tempest-LiveMigrationTest-server-821021372>]
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.596 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.619 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.619 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 2038fd11-9c07-48d0-8092-d973d69d8eb9: ceilometer.compute.pollsters.NoVolumeException
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.635 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/memory.usage volume: 42.765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7dc0440-787c-43c1-b6c8-1b298851f625', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.765625, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'timestamp': '2026-01-21T23:48:23.597076', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ac5fb6da-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.341989166, 'message_signature': '2c1be27431052a603caa7ba570802093d46fef6aff51fad58d3c91b52bef1aa8'}]}, 'timestamp': '2026-01-21 23:48:23.635777', '_unique_id': '74f39e0c226649798709b109f22ba91e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.637 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.638 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.638 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1127248027>, <NovaLikeServer: tempest-LiveMigrationTest-server-821021372>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1127248027>, <NovaLikeServer: tempest-LiveMigrationTest-server-821021372>]
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.638 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.638 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.638 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.639 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.write.latency volume: 14446235 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.639 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf50464b-3d66-4d5b-abbc-fdc6989c8154', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-vda', 'timestamp': '2026-01-21T23:48:23.638508', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac6032fe-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': '97c0ec4e5f933419a16c4465447fc789cd8447c31d0dcdb780238eb9a75f3429'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 
'2038fd11-9c07-48d0-8092-d973d69d8eb9-sda', 'timestamp': '2026-01-21T23:48:23.638508', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac603dee-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': '6d2592e21c4892a3533d9e2c9cae980487e2e96990cc918256aa806fb7dc082b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14446235, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-vda', 'timestamp': '2026-01-21T23:48:23.638508', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac604ed8-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': '6cb0367fe538d4cc47fb95a7b4592c73a211be7c64b2f60b859db9dddf2b390b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-sda', 'timestamp': '2026-01-21T23:48:23.638508', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac605982-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': '862b7fd206959beae1af9bb6db1a5e07978ee505dceff69ee6de35537a19f21a'}]}, 'timestamp': '2026-01-21 23:48:23.639838', '_unique_id': '28119d75c33541508987f69d0ea7ef0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.640 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.641 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.641 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.641 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1127248027>, <NovaLikeServer: tempest-LiveMigrationTest-server-821021372>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1127248027>, <NovaLikeServer: tempest-LiveMigrationTest-server-821021372>]
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.641 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.642 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.642 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.642 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.642 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '232fb68a-56f0-4819-867b-80228c444fe3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-vda', 'timestamp': '2026-01-21T23:48:23.642031', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac60bb20-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': '25f2459d5705e571febf002536c1e83eef76833c13dac72990b3b968bdbb19d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 
'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-sda', 'timestamp': '2026-01-21T23:48:23.642031', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac60c822-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': '4a124baace3ae8d5ba3cf9718e7c79bc4aaeda56b30dec957a98f4d79e90af16'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-vda', 'timestamp': '2026-01-21T23:48:23.642031', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac60d182-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': 'cc2c1646bd3387ff165010b5bd2f6cade90aa599a523675e66cf36d262d19101'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-sda', 'timestamp': '2026-01-21T23:48:23.642031', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac60dc0e-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': 'af8bdad9532b3dc956a86fefd8a29010f65c35a917f58dfd929ebf5a1c71d724'}]}, 'timestamp': '2026-01-21 23:48:23.643130', '_unique_id': '2e47f89b1cef489ebca4bcf5750a67c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.643 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.644 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.644 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.644 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c1be7fc-d0df-4c49-aefd-71ce3365b9ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000019-2038fd11-9c07-48d0-8092-d973d69d8eb9-tapbbc46799-07', 'timestamp': '2026-01-21T23:48:23.644431', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'tapbbc46799-07', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:7f:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbbc46799-07'}, 'message_id': 'ac6117b4-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.273916503, 'message_signature': '07eac1758720a0855a45b81114d0340d67d6c84bb449db13808e8d2daa5f5053'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000015-5bdecf5d-9113-4584-ac23-44d59770eade-tapdf9aa099-aa', 'timestamp': '2026-01-21T23:48:23.644431', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'tapdf9aa099-aa', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:4f:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf9aa099-aa'}, 'message_id': 'ac612056-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.28225046, 'message_signature': 'd159614244314fb17899966af3d1286f4127bfc1b5bb91cb20077781084a6cca'}]}, 'timestamp': '2026-01-21 23:48:23.644904', '_unique_id': '523581c7adea42f8a9fcb814f3038e86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.646 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.646 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4de2ede-a824-4fbc-a949-6f089c709d6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000019-2038fd11-9c07-48d0-8092-d973d69d8eb9-tapbbc46799-07', 'timestamp': '2026-01-21T23:48:23.646039', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'tapbbc46799-07', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:7f:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbbc46799-07'}, 'message_id': 'ac615616-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.273916503, 'message_signature': '5e174ced4a77d594bfa41bf62e81b54e683c1e34b1b3e2dcb0a1fb8866b700db'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000015-5bdecf5d-9113-4584-ac23-44d59770eade-tapdf9aa099-aa', 'timestamp': '2026-01-21T23:48:23.646039', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'tapdf9aa099-aa', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:4f:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf9aa099-aa'}, 'message_id': 'ac61626e-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.28225046, 'message_signature': '67746ca5c69dc656c7eb5b0dfdf7b563aca94d13e0ebe57119d5f5886e2a8fc7'}]}, 'timestamp': '2026-01-21 23:48:23.646584', '_unique_id': 'be2083d96f4d4b4597087cc2b9493a58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.647 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.659 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.660 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.670 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.671 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8749376-6487-4783-8a0c-a4016b09638d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-vda', 'timestamp': '2026-01-21T23:48:23.647839', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac63885a-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.355100538, 'message_signature': '8b9ec422894299c661a17c74ec55cd65f338d19df500796e344e97139ad99360'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 
'2038fd11-9c07-48d0-8092-d973d69d8eb9-sda', 'timestamp': '2026-01-21T23:48:23.647839', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac639a2a-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.355100538, 'message_signature': '38ced4b2a437efec2b041f3b4b59bb6cf80e9457f169739319cf17bcc5f89946'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-vda', 'timestamp': '2026-01-21T23:48:23.647839', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 
'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac652264-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.368364073, 'message_signature': 'e7f45035f2cfb42c8e2f4bc7809e1ebf4858de9e278f852682682194a3007df7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-sda', 'timestamp': '2026-01-21T23:48:23.647839', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac652ae8-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.368364073, 'message_signature': '223638ca94ab864c4d7025d9ea228ae8239e0fb313827e053ba0842b65e41949'}]}, 'timestamp': '2026-01-21 23:48:23.671386', '_unique_id': '032c158c002d4cd1a4e4a4c72e074584'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.672 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.673 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.673 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11489eba-bed6-434c-b9c4-d3e5ed72f671', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000019-2038fd11-9c07-48d0-8092-d973d69d8eb9-tapbbc46799-07', 'timestamp': '2026-01-21T23:48:23.673465', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'tapbbc46799-07', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:7f:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbbc46799-07'}, 'message_id': 'ac6587fe-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.273916503, 'message_signature': 'bc5adc9ee82da164adbba2001580354345cbb570ad96447b68aabd0ddec2dcbf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000015-5bdecf5d-9113-4584-ac23-44d59770eade-tapdf9aa099-aa', 'timestamp': '2026-01-21T23:48:23.673465', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'tapdf9aa099-aa', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:4f:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf9aa099-aa'}, 'message_id': 'ac6591b8-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.28225046, 'message_signature': '3deb5c705c32144a03d078dea784a8e63f56e898bb5d60fbfef461434d4ac5df'}]}, 'timestamp': '2026-01-21 23:48:23.674004', '_unique_id': 'efbc8cc1adab4e4b932e73c27471a193'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.674 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.675 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.675 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/cpu volume: 2080000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.675 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/cpu volume: 190000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '175095b7-68ca-4268-80af-82e291742288', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2080000000, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'timestamp': '2026-01-21T23:48:23.675165', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ac65cb24-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.326272902, 'message_signature': '6338ed887cfb35f8ea69e1d80a1afd295c5edaa9ef9d9308d2e5c4141e61acae'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 190000000, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 
'5bdecf5d-9113-4584-ac23-44d59770eade', 'timestamp': '2026-01-21T23:48:23.675165', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ac65d4c0-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.341989166, 'message_signature': '9a509a2222bf346932b80ec1cc1775b0205ea7aa4b20470a6d2571ebe7c3195a'}]}, 'timestamp': '2026-01-21 23:48:23.675720', '_unique_id': '975650a868d043f1ac83aa4fa8e52d2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.676 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.677 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.677 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.677 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd70675ce-6416-45d0-82f8-5b5368ebbdbb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-vda', 'timestamp': '2026-01-21T23:48:23.676903', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac660bc0-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.355100538, 'message_signature': 'bfa5cec7ec2ecc9b28b5716888fcc9ca9002aa4653c816d55fd896189b1c9881'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 
'2038fd11-9c07-48d0-8092-d973d69d8eb9-sda', 'timestamp': '2026-01-21T23:48:23.676903', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac66135e-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.355100538, 'message_signature': '152516765e5d2d7403cb3ad22ec6a1d273d2a8b3f62e54ab9aa69435d8f34b5a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-vda', 'timestamp': '2026-01-21T23:48:23.676903', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac661aa2-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.368364073, 'message_signature': '0c6d930e4d4d159458f5cf64d9cfbe1896e318fecf145581cd0edcceb91983a2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-sda', 'timestamp': '2026-01-21T23:48:23.676903', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac6621e6-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.368364073, 'message_signature': '4df0d67fe9b22e46e421ee3c246525b3790c3716400b598d675d1e906331a6fc'}]}, 'timestamp': '2026-01-21 23:48:23.677732', '_unique_id': '40828b92a6b74a3c9cb733873bf868a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.678 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.679 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.679 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.write.bytes volume: 126976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.679 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf59b309-99a7-474e-bde5-d0d65c816177', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-vda', 'timestamp': '2026-01-21T23:48:23.678958', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac665bde-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': '109db6f2237e05c3ef48006948e712b7fa31ef57f6011772e55d8aec3ab001e5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 
'2038fd11-9c07-48d0-8092-d973d69d8eb9-sda', 'timestamp': '2026-01-21T23:48:23.678958', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac666372-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': 'd42c97529d8a5fb7f0d4ea737a1439d1beaebe56f155c5bda5c120a2e366f60e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 126976, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-vda', 'timestamp': '2026-01-21T23:48:23.678958', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac666ade-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': '2932fa4ccd08d3bc8aa32b789cc2990caeb4f9c67d8f226dfa262432463a648f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-sda', 'timestamp': '2026-01-21T23:48:23.678958', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac6671f0-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': '8e6a1b049d862416031bb5395f4700ecf23a96ab032a20f9a37660a40ee5afea'}]}, 'timestamp': '2026-01-21 23:48:23.679777', '_unique_id': '7c55bcd3cdd8461390e9b48438f16018'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.680 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '866ee435-ea11-4ed4-b0d9-4cf8f343c91e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000019-2038fd11-9c07-48d0-8092-d973d69d8eb9-tapbbc46799-07', 'timestamp': '2026-01-21T23:48:23.680994', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'tapbbc46799-07', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:7f:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbbc46799-07'}, 'message_id': 'ac66ab84-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.273916503, 'message_signature': '8a0fe8799c808ea2fddeb6cca3b0299f876afde16df5632b755c2877a2e500c8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': 
'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000015-5bdecf5d-9113-4584-ac23-44d59770eade-tapdf9aa099-aa', 'timestamp': '2026-01-21T23:48:23.680994', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'tapdf9aa099-aa', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:4f:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf9aa099-aa'}, 'message_id': 'ac66b37c-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.28225046, 'message_signature': '53e39396ba8fd2ca8263166600e71ae158b3fc071b7cb97f181cb719b6b4deda'}]}, 'timestamp': '2026-01-21 23:48:23.681419', '_unique_id': '9a5d70836b4b4b7cbd0716184c1ab9de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.681 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.682 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.682 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.682 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec65899a-a4bb-4de6-abb1-e076019a64cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-vda', 'timestamp': '2026-01-21T23:48:23.682541', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac66e7ac-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.355100538, 'message_signature': '95b32fcc4b341949bb416ef82590a3090663b65e9201416240bd6a3dec72a55b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 
'2038fd11-9c07-48d0-8092-d973d69d8eb9-sda', 'timestamp': '2026-01-21T23:48:23.682541', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac66f260-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.355100538, 'message_signature': '7578f35301f123dbed1814e2a4adb3ed2cf72e4efe4797b4abc76e8632062c94'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-vda', 'timestamp': '2026-01-21T23:48:23.682541', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac66f9fe-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.368364073, 'message_signature': '083334a739b5b08dae3e274d319e90146dd230d465f5ca38ea404472ca7aa57f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-sda', 'timestamp': '2026-01-21T23:48:23.682541', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac670138-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.368364073, 'message_signature': '33506fa8f1450b30e5918f40adf691d0e085cc85648bc5ffabc1430153b74bfe'}]}, 'timestamp': '2026-01-21 23:48:23.683400', '_unique_id': '61a7e7bab3904ad29f40d4a6fce5991e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.683 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.684 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.684 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/network.incoming.bytes volume: 706 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc10d8db-61c7-41f5-803c-0b5a96ecf3c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000019-2038fd11-9c07-48d0-8092-d973d69d8eb9-tapbbc46799-07', 'timestamp': '2026-01-21T23:48:23.684534', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'tapbbc46799-07', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:7f:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbbc46799-07'}, 'message_id': 'ac6735b8-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.273916503, 'message_signature': '03da7c3a2241eed6c723b0fe53151f5bca29b332883aa7b1e48da417e2734c07'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 706, 'user_id': 
'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': 'instance-00000015-5bdecf5d-9113-4584-ac23-44d59770eade-tapdf9aa099-aa', 'timestamp': '2026-01-21T23:48:23.684534', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'tapdf9aa099-aa', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:4f:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf9aa099-aa'}, 'message_id': 'ac67417a-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.28225046, 'message_signature': '2f4f47b0c58c241493e324e60b6f3dc0db72d1a857f06793fa7df818ddb1c798'}]}, 'timestamp': '2026-01-21 23:48:23.685060', '_unique_id': 'c8f600b2ad974620bd3fded21e808bb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.685 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.686 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.686 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.read.latency volume: 125349631 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.686 12 DEBUG ceilometer.compute.pollsters [-] 2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.device.read.latency volume: 604230 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.686 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.686 12 DEBUG ceilometer.compute.pollsters [-] 5bdecf5d-9113-4584-ac23-44d59770eade/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '999a17fb-4b6d-45ac-87ec-a9b55c1d1609', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 125349631, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-vda', 'timestamp': '2026-01-21T23:48:23.686156', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac6774f6-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': 'f9e8af351f597887f425bbd9d9bfa1876ff02ebee6fd474b7c2d03420bbac8b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 604230, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 
'resource_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9-sda', 'timestamp': '2026-01-21T23:48:23.686156', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1127248027', 'name': 'instance-00000019', 'instance_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac677f3c-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.197953725, 'message_signature': '530addba669fa1e0b5812f401d13f4e11d1148c712f81bfd73626d53085cfa16'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-vda', 'timestamp': '2026-01-21T23:48:23.686156', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac6786d0-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': 'e23433909bba2b1aac0c607a9053bd3304fd5c53f17a00de7c9713cd77439622'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4ff24d8abf8416db9d64c645436c5f1', 'user_name': None, 'project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'project_name': None, 'resource_id': '5bdecf5d-9113-4584-ac23-44d59770eade-sda', 'timestamp': '2026-01-21T23:48:23.686156', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-821021372', 'name': 'instance-00000015', 'instance_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'instance_type': 'm1.nano', 'host': '3a5bbea88021c2dec9ed32ce2b39bccc332cee8daebee264a0ef0cc2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac6790c6-f723-11f0-a0a4-fa163e934844', 'monotonic_time': 3876.230850781, 'message_signature': 'cb46124b7a602277aa5244b890df5fb0b8896c92d5626b1692eb8b0201a81ec5'}]}, 'timestamp': '2026-01-21 23:48:23.687078', '_unique_id': '11ff7629f06b4bb2b6bed9b4e6d2fb7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:48:23.687 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.693 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039288.6919327, 2977f489-9f9d-43f7-a617-7556b7df5171 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.694 182717 INFO nova.compute.manager [-] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] VM Stopped (Lifecycle Event)
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.715 182717 DEBUG nova.compute.manager [None req-02649fa6-90c9-4fb1-bf32-f4505d140854 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:23 compute-1 nova_compute[182713]: 2026-01-21 23:48:23.983 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.354 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:25 compute-1 podman[214805]: 2026-01-21 23:48:25.577865293 +0000 UTC m=+0.064618585 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.588 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Updating instance_info_cache with network_info: [{"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:25 compute-1 podman[214804]: 2026-01-21 23:48:25.598400612 +0000 UTC m=+0.087233020 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.619 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.620 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.620 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.621 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.622 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.713 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039290.7125337, 044c71d9-3aaf-4e1c-af95-5c0636cf4000 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.713 182717 INFO nova.compute.manager [-] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] VM Stopped (Lifecycle Event)
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.737 182717 DEBUG nova.compute.manager [None req-43aac664-3d5e-408f-9838-08cc479873c0 - - - - - -] [instance: 044c71d9-3aaf-4e1c-af95-5c0636cf4000] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.952 182717 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Check if temp file /var/lib/nova/instances/tmpu3c6dsuj exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 23:48:25 compute-1 nova_compute[182713]: 2026-01-21 23:48:25.960 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:26 compute-1 nova_compute[182713]: 2026-01-21 23:48:26.057 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:26 compute-1 nova_compute[182713]: 2026-01-21 23:48:26.060 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:26 compute-1 nova_compute[182713]: 2026-01-21 23:48:26.131 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:26 compute-1 nova_compute[182713]: 2026-01-21 23:48:26.133 182717 DEBUG nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu3c6dsuj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2038fd11-9c07-48d0-8092-d973d69d8eb9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 23:48:27 compute-1 nova_compute[182713]: 2026-01-21 23:48:27.288 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:27 compute-1 nova_compute[182713]: 2026-01-21 23:48:27.373 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:27 compute-1 nova_compute[182713]: 2026-01-21 23:48:27.375 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:27 compute-1 nova_compute[182713]: 2026-01-21 23:48:27.437 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:28 compute-1 nova_compute[182713]: 2026-01-21 23:48:28.986 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:29 compute-1 sshd-session[214859]: Accepted publickey for nova from 192.168.122.100 port 54694 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:48:29 compute-1 systemd-logind[796]: New session 31 of user nova.
Jan 21 23:48:29 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 23:48:29 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 23:48:29 compute-1 ovn_controller[94841]: 2026-01-21T23:48:29Z|00093|binding|INFO|Releasing lport 75454af0-da31-4238-b248-a6678c575f51 from this chassis (sb_readonly=0)
Jan 21 23:48:29 compute-1 ovn_controller[94841]: 2026-01-21T23:48:29Z|00094|binding|INFO|Releasing lport 81fbdf60-46d3-442f-bc69-6a381397338b from this chassis (sb_readonly=0)
Jan 21 23:48:29 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 23:48:29 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 21 23:48:29 compute-1 nova_compute[182713]: 2026-01-21 23:48:29.914 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:29 compute-1 systemd[214863]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:30 compute-1 systemd[214863]: Queued start job for default target Main User Target.
Jan 21 23:48:30 compute-1 systemd[214863]: Created slice User Application Slice.
Jan 21 23:48:30 compute-1 systemd[214863]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:48:30 compute-1 systemd[214863]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:48:30 compute-1 systemd[214863]: Reached target Paths.
Jan 21 23:48:30 compute-1 systemd[214863]: Reached target Timers.
Jan 21 23:48:30 compute-1 systemd[214863]: Starting D-Bus User Message Bus Socket...
Jan 21 23:48:30 compute-1 systemd[214863]: Starting Create User's Volatile Files and Directories...
Jan 21 23:48:30 compute-1 systemd[214863]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:48:30 compute-1 systemd[214863]: Reached target Sockets.
Jan 21 23:48:30 compute-1 systemd[214863]: Finished Create User's Volatile Files and Directories.
Jan 21 23:48:30 compute-1 systemd[214863]: Reached target Basic System.
Jan 21 23:48:30 compute-1 systemd[214863]: Reached target Main User Target.
Jan 21 23:48:30 compute-1 systemd[214863]: Startup finished in 196ms.
Jan 21 23:48:30 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 21 23:48:30 compute-1 systemd[1]: Started Session 31 of User nova.
Jan 21 23:48:30 compute-1 sshd-session[214859]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:30 compute-1 sshd-session[214879]: Received disconnect from 192.168.122.100 port 54694:11: disconnected by user
Jan 21 23:48:30 compute-1 sshd-session[214879]: Disconnected from user nova 192.168.122.100 port 54694
Jan 21 23:48:30 compute-1 sshd-session[214859]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:48:30 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Jan 21 23:48:30 compute-1 systemd-logind[796]: Session 31 logged out. Waiting for processes to exit.
Jan 21 23:48:30 compute-1 systemd-logind[796]: Removed session 31.
Jan 21 23:48:30 compute-1 nova_compute[182713]: 2026-01-21 23:48:30.357 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:31 compute-1 nova_compute[182713]: 2026-01-21 23:48:31.520 182717 DEBUG nova.compute.manager [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:31 compute-1 nova_compute[182713]: 2026-01-21 23:48:31.521 182717 DEBUG oslo_concurrency.lockutils [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:31 compute-1 nova_compute[182713]: 2026-01-21 23:48:31.522 182717 DEBUG oslo_concurrency.lockutils [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:31 compute-1 nova_compute[182713]: 2026-01-21 23:48:31.522 182717 DEBUG oslo_concurrency.lockutils [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:31 compute-1 nova_compute[182713]: 2026-01-21 23:48:31.522 182717 DEBUG nova.compute.manager [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] No waiting events found dispatching network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:31 compute-1 nova_compute[182713]: 2026-01-21 23:48:31.523 182717 DEBUG nova.compute.manager [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.429 182717 INFO nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Took 4.99 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.430 182717 DEBUG nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.455 182717 DEBUG nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu3c6dsuj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2038fd11-9c07-48d0-8092-d973d69d8eb9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(e622de76-e2a2-47db-95df-19796bedea51),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.487 182717 DEBUG nova.objects.instance [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lazy-loading 'migration_context' on Instance uuid 2038fd11-9c07-48d0-8092-d973d69d8eb9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.489 182717 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.490 182717 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.491 182717 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.522 182717 DEBUG nova.virt.libvirt.vif [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:48:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1127248027',display_name='tempest-LiveMigrationTest-server-1127248027',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1127248027',id=25,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:48:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-6nt020yh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:48:21Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=2038fd11-9c07-48d0-8092-d973d69d8eb9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.523 182717 DEBUG nova.network.os_vif_util [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converting VIF {"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.524 182717 DEBUG nova.network.os_vif_util [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.525 182717 DEBUG nova.virt.libvirt.migration [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 23:48:32 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:14:7f:86"/>
Jan 21 23:48:32 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:48:32 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:48:32 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:48:32 compute-1 nova_compute[182713]:   <target dev="tapbbc46799-07"/>
Jan 21 23:48:32 compute-1 nova_compute[182713]: </interface>
Jan 21 23:48:32 compute-1 nova_compute[182713]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.526 182717 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.994 182717 DEBUG nova.virt.libvirt.migration [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:48:32 compute-1 nova_compute[182713]: 2026-01-21 23:48:32.995 182717 INFO nova.virt.libvirt.migration [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.098 182717 INFO nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 23:48:33 compute-1 ovn_controller[94841]: 2026-01-21T23:48:33Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:14:7f:86 10.100.0.13
Jan 21 23:48:33 compute-1 ovn_controller[94841]: 2026-01-21T23:48:33Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:14:7f:86 10.100.0.13
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.602 182717 DEBUG nova.virt.libvirt.migration [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.603 182717 DEBUG nova.virt.libvirt.migration [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.622 182717 DEBUG nova.compute.manager [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.623 182717 DEBUG oslo_concurrency.lockutils [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.623 182717 DEBUG oslo_concurrency.lockutils [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.624 182717 DEBUG oslo_concurrency.lockutils [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.624 182717 DEBUG nova.compute.manager [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] No waiting events found dispatching network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.625 182717 WARNING nova.compute.manager [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received unexpected event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 for instance with vm_state active and task_state migrating.
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.625 182717 DEBUG nova.compute.manager [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-changed-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.625 182717 DEBUG nova.compute.manager [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Refreshing instance network info cache due to event network-changed-bbc46799-0727-41d9-9ae1-017037df9492. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.626 182717 DEBUG oslo_concurrency.lockutils [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.626 182717 DEBUG oslo_concurrency.lockutils [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.627 182717 DEBUG nova.network.neutron [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Refreshing network info cache for port bbc46799-0727-41d9-9ae1-017037df9492 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:48:33 compute-1 nova_compute[182713]: 2026-01-21 23:48:33.990 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.109 182717 DEBUG nova.virt.libvirt.migration [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.110 182717 DEBUG nova.virt.libvirt.migration [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.298 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039314.296745, 2038fd11-9c07-48d0-8092-d973d69d8eb9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.298 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] VM Paused (Lifecycle Event)
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.329 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.335 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.362 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 23:48:34 compute-1 kernel: tapbbc46799-07 (unregistering): left promiscuous mode
Jan 21 23:48:34 compute-1 NetworkManager[54952]: <info>  [1769039314.5042] device (tapbbc46799-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:48:34 compute-1 ovn_controller[94841]: 2026-01-21T23:48:34Z|00095|binding|INFO|Releasing lport bbc46799-0727-41d9-9ae1-017037df9492 from this chassis (sb_readonly=0)
Jan 21 23:48:34 compute-1 ovn_controller[94841]: 2026-01-21T23:48:34Z|00096|binding|INFO|Setting lport bbc46799-0727-41d9-9ae1-017037df9492 down in Southbound
Jan 21 23:48:34 compute-1 ovn_controller[94841]: 2026-01-21T23:48:34Z|00097|binding|INFO|Releasing lport 56571b22-2d90-46ed-b4c3-681729d375d9 from this chassis (sb_readonly=0)
Jan 21 23:48:34 compute-1 ovn_controller[94841]: 2026-01-21T23:48:34Z|00098|binding|INFO|Setting lport 56571b22-2d90-46ed-b4c3-681729d375d9 down in Southbound
Jan 21 23:48:34 compute-1 ovn_controller[94841]: 2026-01-21T23:48:34Z|00099|binding|INFO|Removing iface tapbbc46799-07 ovn-installed in OVS
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.572 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:34 compute-1 ovn_controller[94841]: 2026-01-21T23:48:34Z|00100|binding|INFO|Releasing lport 75454af0-da31-4238-b248-a6678c575f51 from this chassis (sb_readonly=0)
Jan 21 23:48:34 compute-1 ovn_controller[94841]: 2026-01-21T23:48:34Z|00101|binding|INFO|Releasing lport 81fbdf60-46d3-442f-bc69-6a381397338b from this chassis (sb_readonly=0)
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.601 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:7f:86 10.100.0.13'], port_security=['fa:16:3e:14:7f:86 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '7f404a2f-20ba-4b9b-88d6-fa3588630efa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1972857521', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1972857521', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=bbc46799-0727-41d9-9ae1-017037df9492) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.605 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:be:25 19.80.0.150'], port_security=['fa:16:3e:cf:be:25 19.80.0.150'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['bbc46799-0727-41d9-9ae1-017037df9492'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1175960055', 'neutron:cidrs': '19.80.0.150/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b0b760c-cbd0-4413-9603-713296c75717', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1175960055', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=05a945c7-2b1a-4093-88a2-493079ba8709, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=56571b22-2d90-46ed-b4c3-681729d375d9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.607 104184 INFO neutron.agent.ovn.metadata.agent [-] Port bbc46799-0727-41d9-9ae1-017037df9492 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee unbound from our chassis
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.611 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.622 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.645 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[cb3c7b7f-e5f1-4d7d-8ab3-16d6511c29d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.661 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:34 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 21 23:48:34 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000019.scope: Consumed 13.681s CPU time.
Jan 21 23:48:34 compute-1 systemd-machined[153970]: Machine qemu-14-instance-00000019 terminated.
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.699 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c505947e-4409-41ee-8680-a8e1931967c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.707 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e69fa82b-7fe7-49fc-9181-24c46d5b234e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.752 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[2576ff17-e362-4748-a8a2-0084a6bb9e7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.755 182717 DEBUG nova.virt.libvirt.guest [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.755 182717 INFO nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Migration operation has completed
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.756 182717 INFO nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] _post_live_migration() is started..
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.758 182717 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.758 182717 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.759 182717 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.783 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[15fae109-ccb0-4f14-9ea6-d47030edc6f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2df233d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:e6:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1162, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1162, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384779, 'reachable_time': 23610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214927, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.807 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4e3cf920-64a0-4af2-b947-8c1b22496c58]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb2df233d-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384791, 'tstamp': 384791}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214928, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb2df233d-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384793, 'tstamp': 384793}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214928, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.809 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2df233d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.812 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:34 compute-1 nova_compute[182713]: 2026-01-21 23:48:34.818 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.819 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2df233d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.820 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.820 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2df233d-b0, col_values=(('external_ids', {'iface-id': '75454af0-da31-4238-b248-a6678c575f51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.821 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.824 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 56571b22-2d90-46ed-b4c3-681729d375d9 in datapath 1b0b760c-cbd0-4413-9603-713296c75717 unbound from our chassis
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.827 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1b0b760c-cbd0-4413-9603-713296c75717, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.828 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[79397dfa-1f48-433f-a7f0-196abcb9f817]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:34.833 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717 namespace which is not needed anymore
Jan 21 23:48:35 compute-1 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[214789]: [NOTICE]   (214793) : haproxy version is 2.8.14-c23fe91
Jan 21 23:48:35 compute-1 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[214789]: [NOTICE]   (214793) : path to executable is /usr/sbin/haproxy
Jan 21 23:48:35 compute-1 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[214789]: [WARNING]  (214793) : Exiting Master process...
Jan 21 23:48:35 compute-1 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[214789]: [WARNING]  (214793) : Exiting Master process...
Jan 21 23:48:35 compute-1 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[214789]: [ALERT]    (214793) : Current worker (214795) exited with code 143 (Terminated)
Jan 21 23:48:35 compute-1 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[214789]: [WARNING]  (214793) : All workers exited. Exiting... (0)
Jan 21 23:48:35 compute-1 systemd[1]: libpod-49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43.scope: Deactivated successfully.
Jan 21 23:48:35 compute-1 podman[214947]: 2026-01-21 23:48:35.046904314 +0000 UTC m=+0.060313477 container died 49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 23:48:35 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43-userdata-shm.mount: Deactivated successfully.
Jan 21 23:48:35 compute-1 systemd[1]: var-lib-containers-storage-overlay-f2556d407cb583d45463be68c1827de5f67e221f7b99d16afca1796f85253375-merged.mount: Deactivated successfully.
Jan 21 23:48:35 compute-1 podman[214947]: 2026-01-21 23:48:35.091927029 +0000 UTC m=+0.105336192 container cleanup 49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:48:35 compute-1 systemd[1]: libpod-conmon-49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43.scope: Deactivated successfully.
Jan 21 23:48:35 compute-1 podman[214977]: 2026-01-21 23:48:35.189972415 +0000 UTC m=+0.059234352 container remove 49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 21 23:48:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:35.198 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ff88c67a-eac9-495d-84d6-0375e212572c]: (4, ('Wed Jan 21 11:48:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717 (49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43)\n49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43\nWed Jan 21 11:48:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717 (49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43)\n49db29b8cafe07b2b7044a503284b450d3b6befcff045abe8af8efd6cd349e43\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:35.201 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9282cc03-5366-4914-9eca-1919978efa5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:35.202 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b0b760c-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:35 compute-1 kernel: tap1b0b760c-c0: left promiscuous mode
Jan 21 23:48:35 compute-1 nova_compute[182713]: 2026-01-21 23:48:35.206 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:35 compute-1 nova_compute[182713]: 2026-01-21 23:48:35.236 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:35.239 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[edf82a86-f149-4760-9119-39f1e5ae4446]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:35.261 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[732bc5a1-ce14-4109-89e5-ac5835f88179]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:35.262 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8218952b-4ca4-432c-a7cf-3069b9464408]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:35.283 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c0de70-55f5-4537-ace6-76f18e20f79c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387329, 'reachable_time': 29951, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214996, 'error': None, 'target': 'ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:35.286 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:48:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:35.286 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[f03bccf9-4a98-4c65-8559-403fc2df1c6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:35 compute-1 systemd[1]: run-netns-ovnmeta\x2d1b0b760c\x2dcbd0\x2d4413\x2d9603\x2d713296c75717.mount: Deactivated successfully.
Jan 21 23:48:35 compute-1 nova_compute[182713]: 2026-01-21 23:48:35.359 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.021 182717 DEBUG nova.compute.manager [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.021 182717 DEBUG oslo_concurrency.lockutils [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.022 182717 DEBUG oslo_concurrency.lockutils [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.022 182717 DEBUG oslo_concurrency.lockutils [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.022 182717 DEBUG nova.compute.manager [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] No waiting events found dispatching network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.023 182717 DEBUG nova.compute.manager [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.023 182717 DEBUG nova.compute.manager [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.023 182717 DEBUG oslo_concurrency.lockutils [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.024 182717 DEBUG oslo_concurrency.lockutils [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.024 182717 DEBUG oslo_concurrency.lockutils [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.024 182717 DEBUG nova.compute.manager [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] No waiting events found dispatching network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.025 182717 WARNING nova.compute.manager [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received unexpected event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 for instance with vm_state active and task_state migrating.
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.025 182717 DEBUG nova.compute.manager [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.025 182717 DEBUG oslo_concurrency.lockutils [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.026 182717 DEBUG oslo_concurrency.lockutils [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.026 182717 DEBUG oslo_concurrency.lockutils [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.026 182717 DEBUG nova.compute.manager [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] No waiting events found dispatching network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.027 182717 WARNING nova.compute.manager [req-fae0cd6a-e57c-466f-861c-61e711924f4c req-16cb3578-915e-4e89-83ef-80bb809d360c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received unexpected event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 for instance with vm_state active and task_state migrating.
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.110 182717 DEBUG nova.network.neutron [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Updated VIF entry in instance network info cache for port bbc46799-0727-41d9-9ae1-017037df9492. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.111 182717 DEBUG nova.network.neutron [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Updating instance_info_cache with network_info: [{"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.144 182717 DEBUG nova.compute.manager [req-c6c44296-df89-42ad-b23e-9d60ffbd3ff7 req-7fb8918d-59ae-4c4d-a3c2-ae217b43a96e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.145 182717 DEBUG oslo_concurrency.lockutils [req-c6c44296-df89-42ad-b23e-9d60ffbd3ff7 req-7fb8918d-59ae-4c4d-a3c2-ae217b43a96e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.146 182717 DEBUG oslo_concurrency.lockutils [req-c6c44296-df89-42ad-b23e-9d60ffbd3ff7 req-7fb8918d-59ae-4c4d-a3c2-ae217b43a96e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.146 182717 DEBUG oslo_concurrency.lockutils [req-c6c44296-df89-42ad-b23e-9d60ffbd3ff7 req-7fb8918d-59ae-4c4d-a3c2-ae217b43a96e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.147 182717 DEBUG nova.compute.manager [req-c6c44296-df89-42ad-b23e-9d60ffbd3ff7 req-7fb8918d-59ae-4c4d-a3c2-ae217b43a96e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] No waiting events found dispatching network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.147 182717 DEBUG nova.compute.manager [req-c6c44296-df89-42ad-b23e-9d60ffbd3ff7 req-7fb8918d-59ae-4c4d-a3c2-ae217b43a96e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.150 182717 DEBUG oslo_concurrency.lockutils [req-e64b4469-baff-4e63-9122-27d738f996ba req-bda022e6-95db-45b3-a980-b98d9bf8fe3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:48:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:36.455 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:36.457 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.457 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.540 182717 DEBUG nova.network.neutron [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Activated binding for port bbc46799-0727-41d9-9ae1-017037df9492 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.541 182717 DEBUG nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.542 182717 DEBUG nova.virt.libvirt.vif [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:48:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1127248027',display_name='tempest-LiveMigrationTest-server-1127248027',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1127248027',id=25,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:48:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-6nt020yh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',imag
e_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:48:25Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=2038fd11-9c07-48d0-8092-d973d69d8eb9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.542 182717 DEBUG nova.network.os_vif_util [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converting VIF {"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.543 182717 DEBUG nova.network.os_vif_util [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.543 182717 DEBUG os_vif [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.546 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.547 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbbc46799-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.550 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.552 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.562 182717 INFO os_vif [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07')
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.563 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.564 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.565 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.566 182717 DEBUG nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.567 182717 INFO nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Deleting instance files /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9_del
Jan 21 23:48:36 compute-1 nova_compute[182713]: 2026-01-21 23:48:36.569 182717 INFO nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Deletion of /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9_del complete
Jan 21 23:48:37 compute-1 podman[214997]: 2026-01-21 23:48:37.611331985 +0000 UTC m=+0.094238245 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.236 182717 DEBUG nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.236 182717 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.237 182717 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.237 182717 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.237 182717 DEBUG nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] No waiting events found dispatching network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.237 182717 WARNING nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received unexpected event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 for instance with vm_state active and task_state migrating.
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.237 182717 DEBUG nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.238 182717 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.238 182717 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.238 182717 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.238 182717 DEBUG nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] No waiting events found dispatching network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:38 compute-1 nova_compute[182713]: 2026-01-21 23:48:38.238 182717 WARNING nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received unexpected event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 for instance with vm_state active and task_state migrating.
Jan 21 23:48:40 compute-1 nova_compute[182713]: 2026-01-21 23:48:40.362 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:40 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 23:48:40 compute-1 systemd[214863]: Activating special unit Exit the Session...
Jan 21 23:48:40 compute-1 systemd[214863]: Stopped target Main User Target.
Jan 21 23:48:40 compute-1 systemd[214863]: Stopped target Basic System.
Jan 21 23:48:40 compute-1 systemd[214863]: Stopped target Paths.
Jan 21 23:48:40 compute-1 systemd[214863]: Stopped target Sockets.
Jan 21 23:48:40 compute-1 systemd[214863]: Stopped target Timers.
Jan 21 23:48:40 compute-1 systemd[214863]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:48:40 compute-1 systemd[214863]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:48:40 compute-1 systemd[214863]: Closed D-Bus User Message Bus Socket.
Jan 21 23:48:40 compute-1 systemd[214863]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:48:40 compute-1 systemd[214863]: Removed slice User Application Slice.
Jan 21 23:48:40 compute-1 systemd[214863]: Reached target Shutdown.
Jan 21 23:48:40 compute-1 systemd[214863]: Finished Exit the Session.
Jan 21 23:48:40 compute-1 systemd[214863]: Reached target Exit the Session.
Jan 21 23:48:40 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 23:48:40 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 23:48:40 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 23:48:40 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 23:48:40 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 23:48:40 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 23:48:40 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 23:48:41 compute-1 nova_compute[182713]: 2026-01-21 23:48:41.550 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:41 compute-1 podman[215019]: 2026-01-21 23:48:41.604402584 +0000 UTC m=+0.084321037 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, release=1755695350, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 21 23:48:41 compute-1 nova_compute[182713]: 2026-01-21 23:48:41.917 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:41 compute-1 nova_compute[182713]: 2026-01-21 23:48:41.918 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:41 compute-1 nova_compute[182713]: 2026-01-21 23:48:41.918 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:41 compute-1 nova_compute[182713]: 2026-01-21 23:48:41.954 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:41 compute-1 nova_compute[182713]: 2026-01-21 23:48:41.955 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:41 compute-1 nova_compute[182713]: 2026-01-21 23:48:41.955 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:41 compute-1 nova_compute[182713]: 2026-01-21 23:48:41.955 182717 DEBUG nova.compute.resource_tracker [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.035 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.124 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.125 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.194 182717 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.460 182717 WARNING nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.462 182717 DEBUG nova.compute.resource_tracker [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5489MB free_disk=73.35187149047852GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.463 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.464 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.518 182717 DEBUG nova.compute.resource_tracker [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Migration for instance 2038fd11-9c07-48d0-8092-d973d69d8eb9 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.539 182717 DEBUG nova.compute.resource_tracker [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.587 182717 DEBUG nova.compute.resource_tracker [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Instance 5bdecf5d-9113-4584-ac23-44d59770eade actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.587 182717 DEBUG nova.compute.resource_tracker [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Migration e622de76-e2a2-47db-95df-19796bedea51 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.588 182717 DEBUG nova.compute.resource_tracker [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.588 182717 DEBUG nova.compute.resource_tracker [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.658 182717 DEBUG nova.compute.provider_tree [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.678 182717 DEBUG nova.scheduler.client.report [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.706 182717 DEBUG nova.compute.resource_tracker [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.707 182717 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.732 182717 INFO nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.845 182717 INFO nova.scheduler.client.report [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Deleted allocation for migration e622de76-e2a2-47db-95df-19796bedea51
Jan 21 23:48:42 compute-1 nova_compute[182713]: 2026-01-21 23:48:42.846 182717 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 23:48:43 compute-1 nova_compute[182713]: 2026-01-21 23:48:43.784 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:43 compute-1 nova_compute[182713]: 2026-01-21 23:48:43.785 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:43 compute-1 nova_compute[182713]: 2026-01-21 23:48:43.816 182717 DEBUG nova.compute.manager [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:48:43 compute-1 nova_compute[182713]: 2026-01-21 23:48:43.954 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:43 compute-1 nova_compute[182713]: 2026-01-21 23:48:43.954 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:43 compute-1 nova_compute[182713]: 2026-01-21 23:48:43.963 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:48:43 compute-1 nova_compute[182713]: 2026-01-21 23:48:43.963 182717 INFO nova.compute.claims [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.154 182717 DEBUG nova.compute.provider_tree [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.169 182717 DEBUG nova.scheduler.client.report [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.196 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.197 182717 DEBUG nova.compute.manager [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.251 182717 DEBUG nova.compute.manager [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.266 182717 INFO nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.292 182717 DEBUG nova.compute.manager [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.408 182717 DEBUG nova.compute.manager [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.410 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.411 182717 INFO nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Creating image(s)
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.412 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.413 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.414 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.443 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.532 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.533 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.534 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.548 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.621 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.623 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.676 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.677 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.678 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.768 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.770 182717 DEBUG nova.virt.disk.api [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Checking if we can resize image /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.771 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.840 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.841 182717 DEBUG nova.virt.disk.api [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Cannot resize image /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.841 182717 DEBUG nova.objects.instance [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lazy-loading 'migration_context' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.862 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.863 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Ensure instance console log exists: /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.863 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.864 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.864 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.865 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.872 182717 WARNING nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.876 182717 DEBUG nova.virt.libvirt.host [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.877 182717 DEBUG nova.virt.libvirt.host [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.879 182717 DEBUG nova.virt.libvirt.host [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.880 182717 DEBUG nova.virt.libvirt.host [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.881 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.882 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.882 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.882 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.882 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.882 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.883 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.883 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.883 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.883 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.883 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.884 182717 DEBUG nova.virt.hardware [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.888 182717 DEBUG nova.objects.instance [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lazy-loading 'pci_devices' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.903 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:48:44 compute-1 nova_compute[182713]:   <uuid>fb3b64cb-7a89-4d0b-b821-db928d77b940</uuid>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   <name>instance-0000001d</name>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1721280846</nova:name>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:48:44</nova:creationTime>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:48:44 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:48:44 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:48:44 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:48:44 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:48:44 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:48:44 compute-1 nova_compute[182713]:         <nova:user uuid="98cbe317d3494846bdfe48215cfbc5c0">tempest-UnshelveToHostMultiNodesTest-32641639-project-member</nova:user>
Jan 21 23:48:44 compute-1 nova_compute[182713]:         <nova:project uuid="af45596abab74cc9aca5cbb551899c80">tempest-UnshelveToHostMultiNodesTest-32641639</nova:project>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <system>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <entry name="serial">fb3b64cb-7a89-4d0b-b821-db928d77b940</entry>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <entry name="uuid">fb3b64cb-7a89-4d0b-b821-db928d77b940</entry>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     </system>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   <os>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   </os>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   <features>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   </features>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/console.log" append="off"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <video>
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     </video>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:48:44 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:48:44 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:48:44 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:48:44 compute-1 nova_compute[182713]: </domain>
Jan 21 23:48:44 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.957 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.958 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:44 compute-1 nova_compute[182713]: 2026-01-21 23:48:44.958 182717 INFO nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Using config drive
Jan 21 23:48:45 compute-1 nova_compute[182713]: 2026-01-21 23:48:45.196 182717 INFO nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Creating config drive at /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config
Jan 21 23:48:45 compute-1 nova_compute[182713]: 2026-01-21 23:48:45.205 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpos35gxp6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:45 compute-1 nova_compute[182713]: 2026-01-21 23:48:45.351 182717 DEBUG oslo_concurrency.processutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpos35gxp6" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:45 compute-1 nova_compute[182713]: 2026-01-21 23:48:45.364 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:45 compute-1 systemd-machined[153970]: New machine qemu-15-instance-0000001d.
Jan 21 23:48:45 compute-1 systemd[1]: Started Virtual Machine qemu-15-instance-0000001d.
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.313 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039326.3126965, fb3b64cb-7a89-4d0b-b821-db928d77b940 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.315 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] VM Resumed (Lifecycle Event)
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.318 182717 DEBUG nova.compute.manager [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.319 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.324 182717 INFO nova.virt.libvirt.driver [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance spawned successfully.
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.324 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.357 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.358 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.358 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.359 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.360 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.360 182717 DEBUG nova.virt.libvirt.driver [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.366 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.371 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.427 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.428 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039326.3142366, fb3b64cb-7a89-4d0b-b821-db928d77b940 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.428 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] VM Started (Lifecycle Event)
Jan 21 23:48:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:46.461 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.485 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.490 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.536 182717 INFO nova.compute.manager [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Took 2.13 seconds to spawn the instance on the hypervisor.
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.537 182717 DEBUG nova.compute.manager [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.553 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.556 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.681 182717 INFO nova.compute.manager [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Took 2.77 seconds to build instance.
Jan 21 23:48:46 compute-1 nova_compute[182713]: 2026-01-21 23:48:46.712 182717 DEBUG oslo_concurrency.lockutils [None req-21d7e6f4-17ed-40a9-9a73-5da6e2fd2809 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:48 compute-1 podman[215091]: 2026-01-21 23:48:48.585683226 +0000 UTC m=+0.065733231 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:48:48 compute-1 podman[215090]: 2026-01-21 23:48:48.656690783 +0000 UTC m=+0.143940839 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:48:49 compute-1 nova_compute[182713]: 2026-01-21 23:48:49.487 182717 DEBUG oslo_concurrency.lockutils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:49 compute-1 nova_compute[182713]: 2026-01-21 23:48:49.488 182717 DEBUG oslo_concurrency.lockutils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:49 compute-1 nova_compute[182713]: 2026-01-21 23:48:49.488 182717 INFO nova.compute.manager [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Shelving
Jan 21 23:48:49 compute-1 nova_compute[182713]: 2026-01-21 23:48:49.554 182717 DEBUG nova.virt.libvirt.driver [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:48:49 compute-1 nova_compute[182713]: 2026-01-21 23:48:49.755 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039314.7547302, 2038fd11-9c07-48d0-8092-d973d69d8eb9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:49 compute-1 nova_compute[182713]: 2026-01-21 23:48:49.756 182717 INFO nova.compute.manager [-] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] VM Stopped (Lifecycle Event)
Jan 21 23:48:49 compute-1 nova_compute[182713]: 2026-01-21 23:48:49.823 182717 DEBUG nova.compute.manager [None req-97cfb9cc-7afd-4623-8551-15777ca89b11 - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:50 compute-1 nova_compute[182713]: 2026-01-21 23:48:50.366 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:51 compute-1 nova_compute[182713]: 2026-01-21 23:48:51.556 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.085 182717 DEBUG oslo_concurrency.lockutils [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.086 182717 DEBUG oslo_concurrency.lockutils [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.086 182717 DEBUG oslo_concurrency.lockutils [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.087 182717 DEBUG oslo_concurrency.lockutils [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.087 182717 DEBUG oslo_concurrency.lockutils [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.101 182717 INFO nova.compute.manager [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Terminating instance
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.116 182717 DEBUG nova.compute.manager [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:48:52 compute-1 kernel: tapdf9aa099-aa (unregistering): left promiscuous mode
Jan 21 23:48:52 compute-1 NetworkManager[54952]: <info>  [1769039332.1471] device (tapdf9aa099-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:48:52 compute-1 ovn_controller[94841]: 2026-01-21T23:48:52Z|00102|binding|INFO|Releasing lport df9aa099-aa41-4111-b46c-c8a593762a53 from this chassis (sb_readonly=0)
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.151 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:52 compute-1 ovn_controller[94841]: 2026-01-21T23:48:52Z|00103|binding|INFO|Setting lport df9aa099-aa41-4111-b46c-c8a593762a53 down in Southbound
Jan 21 23:48:52 compute-1 ovn_controller[94841]: 2026-01-21T23:48:52Z|00104|binding|INFO|Removing iface tapdf9aa099-aa ovn-installed in OVS
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.160 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:4f:85 10.100.0.6'], port_security=['fa:16:3e:8f:4f:85 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '22', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=df9aa099-aa41-4111-b46c-c8a593762a53) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.163 104184 INFO neutron.agent.ovn.metadata.agent [-] Port df9aa099-aa41-4111-b46c-c8a593762a53 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee unbound from our chassis
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.167 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2df233d-b255-4dda-925c-3ccab3a032ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.167 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.169 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[806358af-d9a7-424f-b4df-aaa46188c202]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.170 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee namespace which is not needed anymore
Jan 21 23:48:52 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 21 23:48:52 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000015.scope: Consumed 4.198s CPU time.
Jan 21 23:48:52 compute-1 systemd-machined[153970]: Machine qemu-13-instance-00000015 terminated.
Jan 21 23:48:52 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214415]: [NOTICE]   (214419) : haproxy version is 2.8.14-c23fe91
Jan 21 23:48:52 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214415]: [NOTICE]   (214419) : path to executable is /usr/sbin/haproxy
Jan 21 23:48:52 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214415]: [WARNING]  (214419) : Exiting Master process...
Jan 21 23:48:52 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214415]: [ALERT]    (214419) : Current worker (214422) exited with code 143 (Terminated)
Jan 21 23:48:52 compute-1 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214415]: [WARNING]  (214419) : All workers exited. Exiting... (0)
Jan 21 23:48:52 compute-1 systemd[1]: libpod-316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0.scope: Deactivated successfully.
Jan 21 23:48:52 compute-1 podman[215165]: 2026-01-21 23:48:52.320353385 +0000 UTC m=+0.051184123 container died 316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:48:52 compute-1 kernel: tapdf9aa099-aa: entered promiscuous mode
Jan 21 23:48:52 compute-1 NetworkManager[54952]: <info>  [1769039332.3465] manager: (tapdf9aa099-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Jan 21 23:48:52 compute-1 systemd-udevd[215145]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.347 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:52 compute-1 ovn_controller[94841]: 2026-01-21T23:48:52Z|00105|binding|INFO|Claiming lport df9aa099-aa41-4111-b46c-c8a593762a53 for this chassis.
Jan 21 23:48:52 compute-1 kernel: tapdf9aa099-aa (unregistering): left promiscuous mode
Jan 21 23:48:52 compute-1 ovn_controller[94841]: 2026-01-21T23:48:52Z|00106|binding|INFO|df9aa099-aa41-4111-b46c-c8a593762a53: Claiming fa:16:3e:8f:4f:85 10.100.0.6
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.357 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:4f:85 10.100.0.6'], port_security=['fa:16:3e:8f:4f:85 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '22', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=df9aa099-aa41-4111-b46c-c8a593762a53) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.376 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:52 compute-1 ovn_controller[94841]: 2026-01-21T23:48:52Z|00107|binding|INFO|Releasing lport df9aa099-aa41-4111-b46c-c8a593762a53 from this chassis (sb_readonly=0)
Jan 21 23:48:52 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0-userdata-shm.mount: Deactivated successfully.
Jan 21 23:48:52 compute-1 systemd[1]: var-lib-containers-storage-overlay-f05451fdf54bc1deb33a3fd98d2d0b6532413481645a3aaa0f51d8d7ee428231-merged.mount: Deactivated successfully.
Jan 21 23:48:52 compute-1 podman[215165]: 2026-01-21 23:48:52.398514873 +0000 UTC m=+0.129345581 container cleanup 316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.399 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:4f:85 10.100.0.6'], port_security=['fa:16:3e:8f:4f:85 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '22', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=df9aa099-aa41-4111-b46c-c8a593762a53) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:52 compute-1 systemd[1]: libpod-conmon-316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0.scope: Deactivated successfully.
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.418 182717 INFO nova.virt.libvirt.driver [-] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Instance destroyed successfully.
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.419 182717 DEBUG nova.objects.instance [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lazy-loading 'resources' on Instance uuid 5bdecf5d-9113-4584-ac23-44d59770eade obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.439 182717 DEBUG nova.virt.libvirt.vif [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-821021372',display_name='tempest-LiveMigrationTest-server-821021372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-821021372',id=21,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-xa5gd7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:47:59Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=5bdecf5d-9113-4584-ac23-44d59770eade,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.440 182717 DEBUG nova.network.os_vif_util [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converting VIF {"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.441 182717 DEBUG nova.network.os_vif_util [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.441 182717 DEBUG os_vif [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.443 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.443 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf9aa099-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.445 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.459 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.467 182717 INFO os_vif [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa')
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.468 182717 INFO nova.virt.libvirt.driver [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Deleting instance files /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade_del
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.469 182717 INFO nova.virt.libvirt.driver [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Deletion of /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade_del complete
Jan 21 23:48:52 compute-1 podman[215201]: 2026-01-21 23:48:52.488888983 +0000 UTC m=+0.062647591 container remove 316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.497 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c81867b6-ad33-48c7-978a-0ed9c1ab3092]: (4, ('Wed Jan 21 11:48:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee (316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0)\n316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0\nWed Jan 21 11:48:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee (316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0)\n316e019b833162e58ed16590672189c281335e038baae1dcdc19cb48a0d6ebc0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.499 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[23c65a13-ab08-4aff-9a40-8d5d6b98a8be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.500 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2df233d-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.502 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:52 compute-1 kernel: tapb2df233d-b0: left promiscuous mode
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.507 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.529 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[cad3488d-9a12-4a38-8b6d-9177c89956b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.536 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.551 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[88637ece-0f07-4ea3-9085-ed9078db7ba7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.554 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5fc91b14-16fa-4210-bdde-807eb0d4da16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.553 182717 INFO nova.compute.manager [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Took 0.44 seconds to destroy the instance on the hypervisor.
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.554 182717 DEBUG oslo.service.loopingcall [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.554 182717 DEBUG nova.compute.manager [-] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:48:52 compute-1 nova_compute[182713]: 2026-01-21 23:48:52.554 182717 DEBUG nova.network.neutron [-] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.585 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2486a251-8512-4223-bfed-d8c92ee7e389]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384768, 'reachable_time': 17154, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215214, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:52 compute-1 systemd[1]: run-netns-ovnmeta\x2db2df233d\x2db255\x2d4dda\x2d925c\x2d3ccab3a032ee.mount: Deactivated successfully.
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.589 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.589 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[6039083f-f82b-4f6b-9d64-f1979c7891c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.593 104184 INFO neutron.agent.ovn.metadata.agent [-] Port df9aa099-aa41-4111-b46c-c8a593762a53 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee unbound from our chassis
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.595 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2df233d-b255-4dda-925c-3ccab3a032ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.597 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[70803e97-2eaf-4b15-b5ad-b378041aa46f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.597 104184 INFO neutron.agent.ovn.metadata.agent [-] Port df9aa099-aa41-4111-b46c-c8a593762a53 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee unbound from our chassis
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.599 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2df233d-b255-4dda-925c-3ccab3a032ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:48:52 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:48:52.600 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ccdeecd7-c0e9-4f9d-a633-e942fc29dcd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.204 182717 DEBUG nova.network.neutron [-] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.230 182717 INFO nova.compute.manager [-] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Took 0.68 seconds to deallocate network for instance.
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.310 182717 DEBUG nova.compute.manager [req-70e2e133-7558-4b48-a10d-01ae8c2ff55a req-a2ca9c3f-1b40-41ed-940f-f60f4f9a27e1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-deleted-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.335 182717 DEBUG nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.336 182717 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.336 182717 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.336 182717 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.336 182717 DEBUG nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.337 182717 DEBUG nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.337 182717 DEBUG nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.337 182717 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.337 182717 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.337 182717 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.338 182717 DEBUG nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.338 182717 WARNING nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received unexpected event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with vm_state active and task_state deleting.
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.364 182717 DEBUG oslo_concurrency.lockutils [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.365 182717 DEBUG oslo_concurrency.lockutils [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.401 182717 DEBUG nova.scheduler.client.report [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.429 182717 DEBUG nova.scheduler.client.report [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.430 182717 DEBUG nova.compute.provider_tree [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.447 182717 DEBUG nova.scheduler.client.report [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.478 182717 DEBUG nova.scheduler.client.report [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.549 182717 DEBUG nova.compute.provider_tree [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.569 182717 DEBUG nova.scheduler.client.report [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.592 182717 DEBUG oslo_concurrency.lockutils [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.613 182717 INFO nova.scheduler.client.report [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Deleted allocations for instance 5bdecf5d-9113-4584-ac23-44d59770eade
Jan 21 23:48:53 compute-1 nova_compute[182713]: 2026-01-21 23:48:53.691 182717 DEBUG oslo_concurrency.lockutils [None req-429a6556-48cd-4e21-a144-c548cb9e66b7 d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:55 compute-1 nova_compute[182713]: 2026-01-21 23:48:55.368 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:56 compute-1 nova_compute[182713]: 2026-01-21 23:48:56.638 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:56 compute-1 podman[215216]: 2026-01-21 23:48:56.648151116 +0000 UTC m=+0.130907331 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:48:56 compute-1 podman[215215]: 2026-01-21 23:48:56.656023358 +0000 UTC m=+0.150584102 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 21 23:48:57 compute-1 nova_compute[182713]: 2026-01-21 23:48:57.446 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:59 compute-1 nova_compute[182713]: 2026-01-21 23:48:59.621 182717 DEBUG nova.virt.libvirt.driver [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 23:49:00 compute-1 nova_compute[182713]: 2026-01-21 23:49:00.369 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:01 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 21 23:49:01 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001d.scope: Consumed 13.116s CPU time.
Jan 21 23:49:01 compute-1 systemd-machined[153970]: Machine qemu-15-instance-0000001d terminated.
Jan 21 23:49:02 compute-1 nova_compute[182713]: 2026-01-21 23:49:02.451 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:02 compute-1 nova_compute[182713]: 2026-01-21 23:49:02.642 182717 INFO nova.virt.libvirt.driver [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance shutdown successfully after 13 seconds.
Jan 21 23:49:02 compute-1 nova_compute[182713]: 2026-01-21 23:49:02.650 182717 INFO nova.virt.libvirt.driver [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance destroyed successfully.
Jan 21 23:49:02 compute-1 nova_compute[182713]: 2026-01-21 23:49:02.650 182717 DEBUG nova.objects.instance [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lazy-loading 'numa_topology' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:49:02.996 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:49:02.997 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:49:02.997 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:03 compute-1 nova_compute[182713]: 2026-01-21 23:49:03.313 182717 INFO nova.virt.libvirt.driver [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Beginning cold snapshot process
Jan 21 23:49:03 compute-1 nova_compute[182713]: 2026-01-21 23:49:03.595 182717 DEBUG nova.privsep.utils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:49:03 compute-1 nova_compute[182713]: 2026-01-21 23:49:03.596 182717 DEBUG oslo_concurrency.processutils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk /var/lib/nova/instances/snapshots/tmpomicpg7m/28c0823d3e4149c889d9951abbe98037 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:03 compute-1 nova_compute[182713]: 2026-01-21 23:49:03.971 182717 DEBUG oslo_concurrency.processutils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk /var/lib/nova/instances/snapshots/tmpomicpg7m/28c0823d3e4149c889d9951abbe98037" returned: 0 in 0.376s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:03 compute-1 nova_compute[182713]: 2026-01-21 23:49:03.973 182717 INFO nova.virt.libvirt.driver [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Snapshot extracted, beginning image upload
Jan 21 23:49:05 compute-1 nova_compute[182713]: 2026-01-21 23:49:05.371 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:06 compute-1 nova_compute[182713]: 2026-01-21 23:49:06.769 182717 INFO nova.virt.libvirt.driver [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Snapshot image upload complete
Jan 21 23:49:06 compute-1 nova_compute[182713]: 2026-01-21 23:49:06.772 182717 DEBUG nova.compute.manager [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:06 compute-1 nova_compute[182713]: 2026-01-21 23:49:06.871 182717 INFO nova.compute.manager [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Shelve offloading
Jan 21 23:49:06 compute-1 nova_compute[182713]: 2026-01-21 23:49:06.896 182717 INFO nova.virt.libvirt.driver [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance destroyed successfully.
Jan 21 23:49:06 compute-1 nova_compute[182713]: 2026-01-21 23:49:06.897 182717 DEBUG nova.compute.manager [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:06 compute-1 nova_compute[182713]: 2026-01-21 23:49:06.901 182717 DEBUG oslo_concurrency.lockutils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:06 compute-1 nova_compute[182713]: 2026-01-21 23:49:06.902 182717 DEBUG oslo_concurrency.lockutils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquired lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:06 compute-1 nova_compute[182713]: 2026-01-21 23:49:06.903 182717 DEBUG nova.network.neutron [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.236 182717 DEBUG nova.network.neutron [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.413 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039332.411393, 5bdecf5d-9113-4584-ac23-44d59770eade => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.413 182717 INFO nova.compute.manager [-] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Stopped (Lifecycle Event)
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.435 182717 DEBUG nova.compute.manager [None req-fcf8f8a4-bc84-439f-876c-0811422ad4db - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.454 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.740 182717 DEBUG nova.network.neutron [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.761 182717 DEBUG oslo_concurrency.lockutils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Releasing lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.772 182717 INFO nova.virt.libvirt.driver [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance destroyed successfully.
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.773 182717 DEBUG nova.objects.instance [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lazy-loading 'resources' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.792 182717 INFO nova.virt.libvirt.driver [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Deleting instance files /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940_del
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.804 182717 INFO nova.virt.libvirt.driver [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Deletion of /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940_del complete
Jan 21 23:49:07 compute-1 nova_compute[182713]: 2026-01-21 23:49:07.942 182717 INFO nova.scheduler.client.report [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Deleted allocations for instance fb3b64cb-7a89-4d0b-b821-db928d77b940
Jan 21 23:49:08 compute-1 nova_compute[182713]: 2026-01-21 23:49:08.015 182717 DEBUG oslo_concurrency.lockutils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:08 compute-1 nova_compute[182713]: 2026-01-21 23:49:08.015 182717 DEBUG oslo_concurrency.lockutils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:08 compute-1 nova_compute[182713]: 2026-01-21 23:49:08.058 182717 DEBUG nova.compute.provider_tree [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:08 compute-1 nova_compute[182713]: 2026-01-21 23:49:08.075 182717 DEBUG nova.scheduler.client.report [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:08 compute-1 nova_compute[182713]: 2026-01-21 23:49:08.097 182717 DEBUG oslo_concurrency.lockutils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:08 compute-1 nova_compute[182713]: 2026-01-21 23:49:08.173 182717 DEBUG oslo_concurrency.lockutils [None req-e859a494-976a-4a37-9084-6e45948e127c 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 18.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:08 compute-1 podman[215288]: 2026-01-21 23:49:08.592028164 +0000 UTC m=+0.084938247 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute)
Jan 21 23:49:10 compute-1 nova_compute[182713]: 2026-01-21 23:49:10.373 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:10 compute-1 nova_compute[182713]: 2026-01-21 23:49:10.736 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:10 compute-1 nova_compute[182713]: 2026-01-21 23:49:10.737 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:10 compute-1 nova_compute[182713]: 2026-01-21 23:49:10.737 182717 INFO nova.compute.manager [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Unshelving
Jan 21 23:49:10 compute-1 nova_compute[182713]: 2026-01-21 23:49:10.874 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:10 compute-1 nova_compute[182713]: 2026-01-21 23:49:10.875 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:10 compute-1 nova_compute[182713]: 2026-01-21 23:49:10.881 182717 DEBUG nova.objects.instance [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'pci_requests' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:10 compute-1 nova_compute[182713]: 2026-01-21 23:49:10.905 182717 DEBUG nova.objects.instance [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'numa_topology' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:10 compute-1 nova_compute[182713]: 2026-01-21 23:49:10.928 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:49:10 compute-1 nova_compute[182713]: 2026-01-21 23:49:10.929 182717 INFO nova.compute.claims [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.108 182717 DEBUG nova.compute.provider_tree [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.131 182717 DEBUG nova.scheduler.client.report [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.171 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.359 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.360 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquired lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.360 182717 DEBUG nova.network.neutron [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.361 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "c5403455-52cf-4717-b07a-49f01c2ed814" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.361 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "c5403455-52cf-4717-b07a-49f01c2ed814" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.384 182717 DEBUG nova.compute.manager [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.522 182717 DEBUG nova.network.neutron [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.529 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.529 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.538 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.538 182717 INFO nova.compute.claims [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.906 182717 DEBUG nova.compute.provider_tree [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.928 182717 DEBUG nova.scheduler.client.report [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.954 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:11 compute-1 nova_compute[182713]: 2026-01-21 23:49:11.955 182717 DEBUG nova.compute.manager [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.017 182717 DEBUG nova.compute.manager [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.018 182717 DEBUG nova.network.neutron [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.039 182717 INFO nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.062 182717 DEBUG nova.compute.manager [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.143 182717 DEBUG nova.network.neutron [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.163 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "092ae3ba-e79b-47fc-b3eb-a0fb61865b6f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.165 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "092ae3ba-e79b-47fc-b3eb-a0fb61865b6f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.178 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Releasing lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.182 182717 DEBUG nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.183 182717 INFO nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Creating image(s)
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.184 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.185 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.187 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.188 182717 DEBUG nova.objects.instance [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'trusted_certs' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.194 182717 DEBUG nova.compute.manager [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.202 182717 DEBUG nova.compute.manager [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.204 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.204 182717 INFO nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Creating image(s)
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.205 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "/var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.205 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "/var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.206 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "/var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.229 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.230 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.234 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.311 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.312 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.319 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.320 182717 INFO nova.compute.claims [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.326 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.327 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.328 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.350 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.374 182717 DEBUG nova.network.neutron [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.374 182717 DEBUG nova.compute.manager [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.429 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.430 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.458 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.467 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.467 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.468 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.542 182717 DEBUG nova.compute.provider_tree [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.561 182717 DEBUG nova.scheduler.client.report [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.566 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.567 182717 DEBUG nova.virt.disk.api [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Checking if we can resize image /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.567 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.590 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.592 182717 DEBUG nova.compute.manager [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:49:12 compute-1 podman[215318]: 2026-01-21 23:49:12.591285302 +0000 UTC m=+0.075768202 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6)
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.636 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.636 182717 DEBUG nova.virt.disk.api [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Cannot resize image /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.637 182717 DEBUG nova.objects.instance [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lazy-loading 'migration_context' on Instance uuid c5403455-52cf-4717-b07a-49f01c2ed814 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.656 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.657 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Ensure instance console log exists: /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.657 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.658 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.658 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.660 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.660 182717 DEBUG nova.compute.manager [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.661 182717 DEBUG nova.network.neutron [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.670 182717 WARNING nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.674 182717 DEBUG nova.virt.libvirt.host [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.675 182717 DEBUG nova.virt.libvirt.host [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.678 182717 DEBUG nova.virt.libvirt.host [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.679 182717 DEBUG nova.virt.libvirt.host [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.681 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.682 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.682 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.683 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.683 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.683 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.684 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.684 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.684 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.685 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.685 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.685 182717 DEBUG nova.virt.hardware [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.692 182717 DEBUG nova.objects.instance [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lazy-loading 'pci_devices' on Instance uuid c5403455-52cf-4717-b07a-49f01c2ed814 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.694 182717 INFO nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.717 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:49:12 compute-1 nova_compute[182713]:   <uuid>c5403455-52cf-4717-b07a-49f01c2ed814</uuid>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   <name>instance-0000001f</name>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <nova:name>tempest-LiveMigrationNegativeTest-server-749868200</nova:name>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:49:12</nova:creationTime>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:49:12 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:49:12 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:49:12 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:49:12 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:49:12 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:49:12 compute-1 nova_compute[182713]:         <nova:user uuid="f63fa215646b41c79f42ebb0bdcfcea0">tempest-LiveMigrationNegativeTest-896104195-project-member</nova:user>
Jan 21 23:49:12 compute-1 nova_compute[182713]:         <nova:project uuid="d261e3eff0854b5c86b1fdf0c14f9027">tempest-LiveMigrationNegativeTest-896104195</nova:project>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <system>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <entry name="serial">c5403455-52cf-4717-b07a-49f01c2ed814</entry>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <entry name="uuid">c5403455-52cf-4717-b07a-49f01c2ed814</entry>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     </system>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   <os>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   </os>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   <features>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   </features>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk.config"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/console.log" append="off"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <video>
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     </video>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:49:12 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:49:12 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:49:12 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:49:12 compute-1 nova_compute[182713]: </domain>
Jan 21 23:49:12 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.718 182717 DEBUG nova.compute.manager [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.782 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.783 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.783 182717 INFO nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Using config drive
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.823 182717 DEBUG nova.compute.manager [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.825 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.825 182717 INFO nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Creating image(s)
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.826 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "/var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.826 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "/var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.827 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "/var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.843 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.930 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.931 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.932 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.947 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:12 compute-1 nova_compute[182713]: 2026-01-21 23:49:12.993 182717 INFO nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Creating config drive at /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk.config
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.000 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpelqoq7v8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.028 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.029 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.080 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.082 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.083 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.141 182717 DEBUG oslo_concurrency.processutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpelqoq7v8" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.172 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.172 182717 DEBUG nova.virt.disk.api [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Checking if we can resize image /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.173 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.230 182717 DEBUG nova.network.neutron [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.231 182717 DEBUG nova.compute.manager [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:49:13 compute-1 systemd-machined[153970]: New machine qemu-16-instance-0000001f.
Jan 21 23:49:13 compute-1 systemd[1]: Started Virtual Machine qemu-16-instance-0000001f.
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.282 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json" returned: 0 in 0.109s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.282 182717 DEBUG nova.virt.disk.api [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Cannot resize image /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.283 182717 DEBUG nova.objects.instance [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lazy-loading 'migration_context' on Instance uuid 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.314 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.314 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Ensure instance console log exists: /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.315 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.315 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.315 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.316 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.343 182717 WARNING nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.365 182717 DEBUG nova.virt.libvirt.host [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.366 182717 DEBUG nova.virt.libvirt.host [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.390 182717 DEBUG nova.virt.libvirt.host [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.391 182717 DEBUG nova.virt.libvirt.host [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.392 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.392 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.392 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.392 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.393 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.393 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.393 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.393 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.393 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.393 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.394 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.394 182717 DEBUG nova.virt.hardware [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.397 182717 DEBUG nova.objects.instance [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.751 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:49:13 compute-1 nova_compute[182713]:   <uuid>092ae3ba-e79b-47fc-b3eb-a0fb61865b6f</uuid>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   <name>instance-00000020</name>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <nova:name>tempest-ListImageFiltersTestJSON-server-1307083856</nova:name>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:49:13</nova:creationTime>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:49:13 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:49:13 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:49:13 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:49:13 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:49:13 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:49:13 compute-1 nova_compute[182713]:         <nova:user uuid="abd17ede09d948d58de153b963381f13">tempest-ListImageFiltersTestJSON-2096581596-project-member</nova:user>
Jan 21 23:49:13 compute-1 nova_compute[182713]:         <nova:project uuid="54c1b2890dcc4b4599ff907adcbbbbb0">tempest-ListImageFiltersTestJSON-2096581596</nova:project>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <system>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <entry name="serial">092ae3ba-e79b-47fc-b3eb-a0fb61865b6f</entry>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <entry name="uuid">092ae3ba-e79b-47fc-b3eb-a0fb61865b6f</entry>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     </system>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   <os>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   </os>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   <features>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   </features>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk.config"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/console.log" append="off"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <video>
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     </video>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:49:13 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:49:13 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:49:13 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:49:13 compute-1 nova_compute[182713]: </domain>
Jan 21 23:49:13 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.821 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.821 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:13 compute-1 nova_compute[182713]: 2026-01-21 23:49:13.822 182717 INFO nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Using config drive
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.041 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.075 182717 INFO nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Creating config drive at /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk.config
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.085 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnhkz0y45 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.133 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b.part --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.135 182717 DEBUG nova.virt.images [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] 4e54f6af-d382-4cfa-83d0-5dee4b638922 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.137 182717 DEBUG nova.privsep.utils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.138 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b.part /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.231 182717 DEBUG oslo_concurrency.processutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnhkz0y45" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:14 compute-1 systemd-machined[153970]: New machine qemu-17-instance-00000020.
Jan 21 23:49:14 compute-1 systemd[1]: Started Virtual Machine qemu-17-instance-00000020.
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.453 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b.part /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b.converted" returned: 0 in 0.315s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.472 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.556 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039354.5561306, c5403455-52cf-4717-b07a-49f01c2ed814 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.557 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] VM Resumed (Lifecycle Event)
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.562 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b.converted --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.564 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.591 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.593 182717 DEBUG nova.compute.manager [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.593 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.594 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.634 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.639 182717 INFO nova.virt.libvirt.driver [-] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Instance spawned successfully.
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.640 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.661 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.662 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039354.5738406, c5403455-52cf-4717-b07a-49f01c2ed814 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.662 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] VM Started (Lifecycle Event)
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.673 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.673 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.674 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.675 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.676 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.677 182717 DEBUG nova.virt.libvirt.driver [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.682 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.683 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.684 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.703 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.725 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.730 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.747 182717 INFO nova.compute.manager [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Took 2.54 seconds to spawn the instance on the hypervisor.
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.747 182717 DEBUG nova.compute.manager [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.754 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.758 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.759 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b,backing_fmt=raw /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.789 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b,backing_fmt=raw /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.790 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.791 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.840 182717 INFO nova.compute.manager [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Took 3.35 seconds to build instance.
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.867 182717 DEBUG oslo_concurrency.lockutils [None req-3418927a-f057-4804-bc5d-53ddbd5c254b f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "c5403455-52cf-4717-b07a-49f01c2ed814" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.505s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.868 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.869 182717 DEBUG nova.objects.instance [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'migration_context' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.882 182717 INFO nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Rebasing disk image.
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.882 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.936 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:14 compute-1 nova_compute[182713]: 2026-01-21 23:49:14.938 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 -F raw /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.221 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039355.2205718, 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.222 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] VM Resumed (Lifecycle Event)
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.228 182717 DEBUG nova.compute.manager [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.229 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.233 182717 INFO nova.virt.libvirt.driver [-] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Instance spawned successfully.
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.233 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.257 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.261 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.278 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.278 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.279 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.279 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.280 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.280 182717 DEBUG nova.virt.libvirt.driver [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.308 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.308 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039355.2274404, 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.309 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] VM Started (Lifecycle Event)
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.334 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.337 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.370 182717 INFO nova.compute.manager [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Took 2.55 seconds to spawn the instance on the hypervisor.
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.370 182717 DEBUG nova.compute.manager [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.372 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.379 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.462 182717 INFO nova.compute.manager [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Took 3.19 seconds to build instance.
Jan 21 23:49:15 compute-1 nova_compute[182713]: 2026-01-21 23:49:15.482 182717 DEBUG oslo_concurrency.lockutils [None req-e222f738-cbc5-445d-a065-dc78de4baf8f abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "092ae3ba-e79b-47fc-b3eb-a0fb61865b6f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.577 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 -F raw /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk" returned: 0 in 1.639s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.578 182717 DEBUG nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.578 182717 DEBUG nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Ensure instance console log exists: /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.579 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.579 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.579 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.581 182717 DEBUG nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='4e82ab5f0447e0739194ced67571c39e',container_format='bare',created_at=2026-01-21T23:48:49Z,direct_url=<?>,disk_format='qcow2',id=4e54f6af-d382-4cfa-83d0-5dee4b638922,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1721280846-shelved',owner='af45596abab74cc9aca5cbb551899c80',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2026-01-21T23:49:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.591 182717 WARNING nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.610 182717 DEBUG nova.virt.libvirt.host [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.611 182717 DEBUG nova.virt.libvirt.host [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.615 182717 DEBUG nova.virt.libvirt.host [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.616 182717 DEBUG nova.virt.libvirt.host [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.618 182717 DEBUG nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.619 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='4e82ab5f0447e0739194ced67571c39e',container_format='bare',created_at=2026-01-21T23:48:49Z,direct_url=<?>,disk_format='qcow2',id=4e54f6af-d382-4cfa-83d0-5dee4b638922,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1721280846-shelved',owner='af45596abab74cc9aca5cbb551899c80',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2026-01-21T23:49:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.620 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.620 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.621 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.621 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.622 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.623 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.623 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.624 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.625 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.626 182717 DEBUG nova.virt.hardware [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.626 182717 DEBUG nova.objects.instance [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'vcpu_model' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.653 182717 DEBUG nova.objects.instance [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'pci_devices' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.667 182717 DEBUG nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:49:16 compute-1 nova_compute[182713]:   <uuid>fb3b64cb-7a89-4d0b-b821-db928d77b940</uuid>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   <name>instance-0000001d</name>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1721280846</nova:name>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:49:16</nova:creationTime>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:49:16 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:49:16 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:49:16 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:49:16 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:49:16 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:49:16 compute-1 nova_compute[182713]:         <nova:user uuid="98cbe317d3494846bdfe48215cfbc5c0">tempest-UnshelveToHostMultiNodesTest-32641639-project-member</nova:user>
Jan 21 23:49:16 compute-1 nova_compute[182713]:         <nova:project uuid="af45596abab74cc9aca5cbb551899c80">tempest-UnshelveToHostMultiNodesTest-32641639</nova:project>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="4e54f6af-d382-4cfa-83d0-5dee4b638922"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <system>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <entry name="serial">fb3b64cb-7a89-4d0b-b821-db928d77b940</entry>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <entry name="uuid">fb3b64cb-7a89-4d0b-b821-db928d77b940</entry>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     </system>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   <os>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   </os>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   <features>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   </features>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/console.log" append="off"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <video>
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     </video>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <input type="keyboard" bus="usb"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:49:16 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:49:16 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:49:16 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:49:16 compute-1 nova_compute[182713]: </domain>
Jan 21 23:49:16 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.754 182717 DEBUG nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.755 182717 DEBUG nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.755 182717 INFO nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Using config drive
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.773 182717 DEBUG nova.objects.instance [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'ec2_ids' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.821 182717 DEBUG nova.objects.instance [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'keypairs' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.961 182717 INFO nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Creating config drive at /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config
Jan 21 23:49:16 compute-1 nova_compute[182713]: 2026-01-21 23:49:16.970 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpflxf0dik execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.100 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039342.098471, fb3b64cb-7a89-4d0b-b821-db928d77b940 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.101 182717 INFO nova.compute.manager [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] VM Stopped (Lifecycle Event)
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.112 182717 DEBUG oslo_concurrency.processutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpflxf0dik" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.126 182717 DEBUG nova.compute.manager [None req-5e058dc0-2795-4bf7-b902-2cc0e5718c9f - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:17 compute-1 systemd-machined[153970]: New machine qemu-18-instance-0000001d.
Jan 21 23:49:17 compute-1 systemd[1]: Started Virtual Machine qemu-18-instance-0000001d.
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.461 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.761 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039357.7605124, fb3b64cb-7a89-4d0b-b821-db928d77b940 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.763 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] VM Resumed (Lifecycle Event)
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.765 182717 DEBUG nova.compute.manager [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.766 182717 DEBUG nova.virt.libvirt.driver [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.770 182717 INFO nova.virt.libvirt.driver [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance spawned successfully.
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.796 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.800 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.828 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.829 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039357.761158, fb3b64cb-7a89-4d0b-b821-db928d77b940 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.830 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] VM Started (Lifecycle Event)
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.858 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.863 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:17 compute-1 nova_compute[182713]: 2026-01-21 23:49:17.890 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:49:18 compute-1 nova_compute[182713]: 2026-01-21 23:49:18.441 182717 DEBUG nova.compute.manager [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:18 compute-1 nova_compute[182713]: 2026-01-21 23:49:18.551 182717 INFO nova.compute.manager [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] instance snapshotting
Jan 21 23:49:18 compute-1 nova_compute[182713]: 2026-01-21 23:49:18.710 182717 DEBUG nova.compute.manager [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:18 compute-1 nova_compute[182713]: 2026-01-21 23:49:18.828 182717 INFO nova.virt.libvirt.driver [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Beginning live snapshot process
Jan 21 23:49:18 compute-1 nova_compute[182713]: 2026-01-21 23:49:18.864 182717 DEBUG oslo_concurrency.lockutils [None req-a2d99467-44fd-4af5-84e4-e0963fdaa227 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 8.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:19 compute-1 virtqemud[182235]: invalid argument: disk vda does not have an active block job
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.079 182717 DEBUG oslo_concurrency.processutils [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.174 182717 DEBUG oslo_concurrency.processutils [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json -f qcow2" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.176 182717 DEBUG oslo_concurrency.processutils [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.261 182717 DEBUG oslo_concurrency.processutils [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json -f qcow2" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.286 182717 DEBUG oslo_concurrency.processutils [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.351 182717 DEBUG oslo_concurrency.processutils [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.354 182717 DEBUG oslo_concurrency.processutils [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmprl_hh4ti/9b03aa237f904e1da4212f2fd348bdae.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.399 182717 DEBUG oslo_concurrency.processutils [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmprl_hh4ti/9b03aa237f904e1da4212f2fd348bdae.delta 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.402 182717 INFO nova.virt.libvirt.driver [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.490 182717 DEBUG nova.virt.libvirt.guest [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.496 182717 INFO nova.virt.libvirt.driver [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 21 23:49:19 compute-1 podman[215489]: 2026-01-21 23:49:19.544163701 +0000 UTC m=+0.084601126 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.563 182717 DEBUG nova.privsep.utils [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.564 182717 DEBUG oslo_concurrency.processutils [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmprl_hh4ti/9b03aa237f904e1da4212f2fd348bdae.delta /var/lib/nova/instances/snapshots/tmprl_hh4ti/9b03aa237f904e1da4212f2fd348bdae execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-1 podman[215487]: 2026-01-21 23:49:19.579551496 +0000 UTC m=+0.131009854 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.697 182717 DEBUG oslo_concurrency.processutils [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmprl_hh4ti/9b03aa237f904e1da4212f2fd348bdae.delta /var/lib/nova/instances/snapshots/tmprl_hh4ti/9b03aa237f904e1da4212f2fd348bdae" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.700 182717 INFO nova.virt.libvirt.driver [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Snapshot extracted, beginning image upload
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:19 compute-1 nova_compute[182713]: 2026-01-21 23:49:19.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:49:20 compute-1 nova_compute[182713]: 2026-01-21 23:49:20.362 182717 DEBUG oslo_concurrency.lockutils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:20 compute-1 nova_compute[182713]: 2026-01-21 23:49:20.362 182717 DEBUG oslo_concurrency.lockutils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:20 compute-1 nova_compute[182713]: 2026-01-21 23:49:20.363 182717 INFO nova.compute.manager [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Shelving
Jan 21 23:49:20 compute-1 nova_compute[182713]: 2026-01-21 23:49:20.382 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:20 compute-1 nova_compute[182713]: 2026-01-21 23:49:20.404 182717 DEBUG nova.virt.libvirt.driver [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:49:21 compute-1 nova_compute[182713]: 2026-01-21 23:49:21.739 182717 INFO nova.virt.libvirt.driver [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Snapshot image upload complete
Jan 21 23:49:21 compute-1 nova_compute[182713]: 2026-01-21 23:49:21.740 182717 INFO nova.compute.manager [None req-78171ed1-7993-4892-a504-8a52a83d8f50 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Took 3.17 seconds to snapshot the instance on the hypervisor.
Jan 21 23:49:21 compute-1 nova_compute[182713]: 2026-01-21 23:49:21.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:21 compute-1 nova_compute[182713]: 2026-01-21 23:49:21.853 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:21 compute-1 nova_compute[182713]: 2026-01-21 23:49:21.888 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:22 compute-1 nova_compute[182713]: 2026-01-21 23:49:22.501 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:22 compute-1 nova_compute[182713]: 2026-01-21 23:49:22.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:22 compute-1 nova_compute[182713]: 2026-01-21 23:49:22.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:49:22 compute-1 nova_compute[182713]: 2026-01-21 23:49:22.887 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:49:22 compute-1 nova_compute[182713]: 2026-01-21 23:49:22.889 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:22 compute-1 nova_compute[182713]: 2026-01-21 23:49:22.889 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:22 compute-1 nova_compute[182713]: 2026-01-21 23:49:22.915 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:22 compute-1 nova_compute[182713]: 2026-01-21 23:49:22.915 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:22 compute-1 nova_compute[182713]: 2026-01-21 23:49:22.915 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:22 compute-1 nova_compute[182713]: 2026-01-21 23:49:22.916 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:49:22 compute-1 nova_compute[182713]: 2026-01-21 23:49:22.991 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.105 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.107 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.198 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.205 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.290 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.291 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.385 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.392 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.482 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.483 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.548 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.712 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.714 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5311MB free_disk=73.28300476074219GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.715 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.715 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.886 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance fb3b64cb-7a89-4d0b-b821-db928d77b940 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.886 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance c5403455-52cf-4717-b07a-49f01c2ed814 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.886 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.886 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.887 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:49:23 compute-1 nova_compute[182713]: 2026-01-21 23:49:23.984 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:24 compute-1 nova_compute[182713]: 2026-01-21 23:49:24.002 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:24 compute-1 nova_compute[182713]: 2026-01-21 23:49:24.031 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:49:24 compute-1 nova_compute[182713]: 2026-01-21 23:49:24.031 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:24 compute-1 nova_compute[182713]: 2026-01-21 23:49:24.999 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:25 compute-1 nova_compute[182713]: 2026-01-21 23:49:24.999 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:25 compute-1 nova_compute[182713]: 2026-01-21 23:49:25.384 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:27 compute-1 nova_compute[182713]: 2026-01-21 23:49:27.507 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:27 compute-1 podman[215589]: 2026-01-21 23:49:27.587130485 +0000 UTC m=+0.076853467 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:49:27 compute-1 podman[215590]: 2026-01-21 23:49:27.611254249 +0000 UTC m=+0.076692042 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:49:27 compute-1 nova_compute[182713]: 2026-01-21 23:49:27.892 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "c5403455-52cf-4717-b07a-49f01c2ed814" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:27 compute-1 nova_compute[182713]: 2026-01-21 23:49:27.893 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "c5403455-52cf-4717-b07a-49f01c2ed814" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:27 compute-1 nova_compute[182713]: 2026-01-21 23:49:27.893 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "c5403455-52cf-4717-b07a-49f01c2ed814-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:27 compute-1 nova_compute[182713]: 2026-01-21 23:49:27.894 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "c5403455-52cf-4717-b07a-49f01c2ed814-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:27 compute-1 nova_compute[182713]: 2026-01-21 23:49:27.894 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "c5403455-52cf-4717-b07a-49f01c2ed814-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:27 compute-1 nova_compute[182713]: 2026-01-21 23:49:27.904 182717 INFO nova.compute.manager [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Terminating instance
Jan 21 23:49:27 compute-1 nova_compute[182713]: 2026-01-21 23:49:27.914 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "refresh_cache-c5403455-52cf-4717-b07a-49f01c2ed814" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:27 compute-1 nova_compute[182713]: 2026-01-21 23:49:27.914 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquired lock "refresh_cache-c5403455-52cf-4717-b07a-49f01c2ed814" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:27 compute-1 nova_compute[182713]: 2026-01-21 23:49:27.914 182717 DEBUG nova.network.neutron [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.087 182717 DEBUG nova.network.neutron [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.386 182717 DEBUG nova.network.neutron [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.403 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Releasing lock "refresh_cache-c5403455-52cf-4717-b07a-49f01c2ed814" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.404 182717 DEBUG nova.compute.manager [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:49:28 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Jan 21 23:49:28 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001f.scope: Consumed 13.473s CPU time.
Jan 21 23:49:28 compute-1 systemd-machined[153970]: Machine qemu-16-instance-0000001f terminated.
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.666 182717 INFO nova.virt.libvirt.driver [-] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Instance destroyed successfully.
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.666 182717 DEBUG nova.objects.instance [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lazy-loading 'resources' on Instance uuid c5403455-52cf-4717-b07a-49f01c2ed814 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.681 182717 INFO nova.virt.libvirt.driver [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Deleting instance files /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814_del
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.681 182717 INFO nova.virt.libvirt.driver [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Deletion of /var/lib/nova/instances/c5403455-52cf-4717-b07a-49f01c2ed814_del complete
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.771 182717 INFO nova.compute.manager [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.772 182717 DEBUG oslo.service.loopingcall [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.773 182717 DEBUG nova.compute.manager [-] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:49:28 compute-1 nova_compute[182713]: 2026-01-21 23:49:28.773 182717 DEBUG nova.network.neutron [-] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:49:29 compute-1 nova_compute[182713]: 2026-01-21 23:49:29.695 182717 DEBUG nova.network.neutron [-] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:29 compute-1 nova_compute[182713]: 2026-01-21 23:49:29.920 182717 DEBUG nova.network.neutron [-] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:29 compute-1 nova_compute[182713]: 2026-01-21 23:49:29.936 182717 INFO nova.compute.manager [-] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Took 1.16 seconds to deallocate network for instance.
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.088 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.089 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.191 182717 DEBUG nova.compute.provider_tree [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.211 182717 DEBUG nova.scheduler.client.report [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.240 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.312 182717 INFO nova.scheduler.client.report [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Deleted allocations for instance c5403455-52cf-4717-b07a-49f01c2ed814
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.332 182717 DEBUG nova.compute.manager [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.386 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.393 182717 DEBUG oslo_concurrency.lockutils [None req-d6ccc504-7d13-450b-b39e-37a1621fd728 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "c5403455-52cf-4717-b07a-49f01c2ed814" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.500s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.396 182717 INFO nova.compute.manager [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] instance snapshotting
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.459 182717 DEBUG nova.virt.libvirt.driver [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.708 182717 INFO nova.virt.libvirt.driver [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Beginning live snapshot process
Jan 21 23:49:30 compute-1 virtqemud[182235]: invalid argument: disk vda does not have an active block job
Jan 21 23:49:30 compute-1 nova_compute[182713]: 2026-01-21 23:49:30.953 182717 DEBUG oslo_concurrency.processutils [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.049 182717 DEBUG oslo_concurrency.processutils [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json -f qcow2" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.051 182717 DEBUG oslo_concurrency.processutils [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.140 182717 DEBUG oslo_concurrency.processutils [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f/disk --force-share --output=json -f qcow2" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.165 182717 DEBUG oslo_concurrency.processutils [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.249 182717 DEBUG oslo_concurrency.processutils [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.251 182717 DEBUG oslo_concurrency.processutils [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmputt7p3r3/9bcd7ae58c5a4397a559880776fb1733.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.289 182717 DEBUG oslo_concurrency.processutils [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmputt7p3r3/9bcd7ae58c5a4397a559880776fb1733.delta 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.291 182717 INFO nova.virt.libvirt.driver [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.359 182717 DEBUG nova.virt.libvirt.guest [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] COPY block job progress, current cursor: 0 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.865 182717 DEBUG nova.virt.libvirt.guest [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] COPY block job progress, current cursor: 75235328 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.870 182717 INFO nova.virt.libvirt.driver [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.927 182717 DEBUG nova.privsep.utils [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:49:31 compute-1 nova_compute[182713]: 2026-01-21 23:49:31.928 182717 DEBUG oslo_concurrency.processutils [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmputt7p3r3/9bcd7ae58c5a4397a559880776fb1733.delta /var/lib/nova/instances/snapshots/tmputt7p3r3/9bcd7ae58c5a4397a559880776fb1733 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:32 compute-1 nova_compute[182713]: 2026-01-21 23:49:32.253 182717 DEBUG oslo_concurrency.processutils [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmputt7p3r3/9bcd7ae58c5a4397a559880776fb1733.delta /var/lib/nova/instances/snapshots/tmputt7p3r3/9bcd7ae58c5a4397a559880776fb1733" returned: 0 in 0.325s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:32 compute-1 nova_compute[182713]: 2026-01-21 23:49:32.260 182717 INFO nova.virt.libvirt.driver [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Snapshot extracted, beginning image upload
Jan 21 23:49:32 compute-1 nova_compute[182713]: 2026-01-21 23:49:32.510 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:32 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 21 23:49:32 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001d.scope: Consumed 13.289s CPU time.
Jan 21 23:49:32 compute-1 systemd-machined[153970]: Machine qemu-18-instance-0000001d terminated.
Jan 21 23:49:32 compute-1 ovn_controller[94841]: 2026-01-21T23:49:32Z|00108|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 21 23:49:33 compute-1 nova_compute[182713]: 2026-01-21 23:49:33.367 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:33 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:49:33.367 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:49:33 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:49:33.369 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:49:33 compute-1 nova_compute[182713]: 2026-01-21 23:49:33.475 182717 INFO nova.virt.libvirt.driver [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance shutdown successfully after 13 seconds.
Jan 21 23:49:33 compute-1 nova_compute[182713]: 2026-01-21 23:49:33.483 182717 INFO nova.virt.libvirt.driver [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance destroyed successfully.
Jan 21 23:49:33 compute-1 nova_compute[182713]: 2026-01-21 23:49:33.484 182717 DEBUG nova.objects.instance [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lazy-loading 'numa_topology' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:33 compute-1 nova_compute[182713]: 2026-01-21 23:49:33.849 182717 INFO nova.virt.libvirt.driver [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Beginning cold snapshot process
Jan 21 23:49:34 compute-1 nova_compute[182713]: 2026-01-21 23:49:34.200 182717 DEBUG nova.privsep.utils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:49:34 compute-1 nova_compute[182713]: 2026-01-21 23:49:34.201 182717 DEBUG oslo_concurrency.processutils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk /var/lib/nova/instances/snapshots/tmpvzxkxtev/b1d3e9f6a43842c68de8fdcb66eae1d1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:34 compute-1 nova_compute[182713]: 2026-01-21 23:49:34.678 182717 DEBUG oslo_concurrency.processutils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk /var/lib/nova/instances/snapshots/tmpvzxkxtev/b1d3e9f6a43842c68de8fdcb66eae1d1" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:34 compute-1 nova_compute[182713]: 2026-01-21 23:49:34.679 182717 INFO nova.virt.libvirt.driver [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Snapshot extracted, beginning image upload
Jan 21 23:49:35 compute-1 nova_compute[182713]: 2026-01-21 23:49:35.202 182717 INFO nova.virt.libvirt.driver [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Snapshot image upload complete
Jan 21 23:49:35 compute-1 nova_compute[182713]: 2026-01-21 23:49:35.203 182717 INFO nova.compute.manager [None req-8abb7035-3c06-42a6-ada0-7b0d86252758 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Took 4.79 seconds to snapshot the instance on the hypervisor.
Jan 21 23:49:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:49:35.372 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:49:35 compute-1 nova_compute[182713]: 2026-01-21 23:49:35.387 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:37 compute-1 nova_compute[182713]: 2026-01-21 23:49:37.513 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.124 182717 INFO nova.virt.libvirt.driver [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Snapshot image upload complete
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.125 182717 DEBUG nova.compute.manager [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.225 182717 INFO nova.compute.manager [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Shelve offloading
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.246 182717 INFO nova.virt.libvirt.driver [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance destroyed successfully.
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.247 182717 DEBUG nova.compute.manager [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.250 182717 DEBUG oslo_concurrency.lockutils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.251 182717 DEBUG oslo_concurrency.lockutils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquired lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.251 182717 DEBUG nova.network.neutron [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.711 182717 DEBUG nova.network.neutron [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.964 182717 DEBUG nova.network.neutron [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.987 182717 DEBUG oslo_concurrency.lockutils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Releasing lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.998 182717 INFO nova.virt.libvirt.driver [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance destroyed successfully.
Jan 21 23:49:38 compute-1 nova_compute[182713]: 2026-01-21 23:49:38.999 182717 DEBUG nova.objects.instance [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lazy-loading 'resources' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:39 compute-1 nova_compute[182713]: 2026-01-21 23:49:39.014 182717 INFO nova.virt.libvirt.driver [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Deleting instance files /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940_del
Jan 21 23:49:39 compute-1 nova_compute[182713]: 2026-01-21 23:49:39.022 182717 INFO nova.virt.libvirt.driver [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Deletion of /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940_del complete
Jan 21 23:49:39 compute-1 nova_compute[182713]: 2026-01-21 23:49:39.138 182717 INFO nova.scheduler.client.report [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Deleted allocations for instance fb3b64cb-7a89-4d0b-b821-db928d77b940
Jan 21 23:49:39 compute-1 nova_compute[182713]: 2026-01-21 23:49:39.223 182717 DEBUG oslo_concurrency.lockutils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:39 compute-1 nova_compute[182713]: 2026-01-21 23:49:39.224 182717 DEBUG oslo_concurrency.lockutils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:39 compute-1 nova_compute[182713]: 2026-01-21 23:49:39.280 182717 DEBUG nova.compute.provider_tree [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:39 compute-1 nova_compute[182713]: 2026-01-21 23:49:39.299 182717 DEBUG nova.scheduler.client.report [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:39 compute-1 nova_compute[182713]: 2026-01-21 23:49:39.327 182717 DEBUG oslo_concurrency.lockutils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:39 compute-1 nova_compute[182713]: 2026-01-21 23:49:39.422 182717 DEBUG oslo_concurrency.lockutils [None req-779748a3-143b-49be-86c1-8a70f23777ad 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 19.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:39 compute-1 podman[215697]: 2026-01-21 23:49:39.591477035 +0000 UTC m=+0.082059024 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.389 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.519 182717 DEBUG nova.compute.manager [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.727 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.728 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.766 182717 DEBUG nova.objects.instance [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.791 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.791 182717 INFO nova.compute.claims [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.792 182717 DEBUG nova.objects.instance [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'resources' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.806 182717 DEBUG nova.objects.instance [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'numa_topology' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.820 182717 DEBUG nova.objects.instance [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.875 182717 INFO nova.compute.resource_tracker [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Updating resource usage from migration d3b9acf4-73a1-472f-99a3-614594e27415
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.876 182717 DEBUG nova.compute.resource_tracker [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Starting to track incoming migration d3b9acf4-73a1-472f-99a3-614594e27415 with flavor c3389c03-89c4-4ff5-9e03-1a99d41713d4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.974 182717 DEBUG nova.compute.provider_tree [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:40 compute-1 nova_compute[182713]: 2026-01-21 23:49:40.989 182717 DEBUG nova.scheduler.client.report [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:41 compute-1 nova_compute[182713]: 2026-01-21 23:49:41.007 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:41 compute-1 nova_compute[182713]: 2026-01-21 23:49:41.007 182717 INFO nova.compute.manager [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Migrating
Jan 21 23:49:42 compute-1 sshd-session[215717]: Accepted publickey for nova from 192.168.122.102 port 43410 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:49:42 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 23:49:42 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 23:49:42 compute-1 systemd-logind[796]: New session 33 of user nova.
Jan 21 23:49:42 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 23:49:42 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 21 23:49:42 compute-1 systemd[215721]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:49:42 compute-1 systemd[215721]: Queued start job for default target Main User Target.
Jan 21 23:49:42 compute-1 systemd[215721]: Created slice User Application Slice.
Jan 21 23:49:42 compute-1 systemd[215721]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:49:42 compute-1 systemd[215721]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:49:42 compute-1 systemd[215721]: Reached target Paths.
Jan 21 23:49:42 compute-1 systemd[215721]: Reached target Timers.
Jan 21 23:49:42 compute-1 systemd[215721]: Starting D-Bus User Message Bus Socket...
Jan 21 23:49:42 compute-1 systemd[215721]: Starting Create User's Volatile Files and Directories...
Jan 21 23:49:42 compute-1 systemd[215721]: Finished Create User's Volatile Files and Directories.
Jan 21 23:49:42 compute-1 systemd[215721]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:49:42 compute-1 systemd[215721]: Reached target Sockets.
Jan 21 23:49:42 compute-1 systemd[215721]: Reached target Basic System.
Jan 21 23:49:42 compute-1 systemd[215721]: Reached target Main User Target.
Jan 21 23:49:42 compute-1 systemd[215721]: Startup finished in 140ms.
Jan 21 23:49:42 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 21 23:49:42 compute-1 systemd[1]: Started Session 33 of User nova.
Jan 21 23:49:42 compute-1 sshd-session[215717]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:49:42 compute-1 sshd-session[215737]: Received disconnect from 192.168.122.102 port 43410:11: disconnected by user
Jan 21 23:49:42 compute-1 sshd-session[215737]: Disconnected from user nova 192.168.122.102 port 43410
Jan 21 23:49:42 compute-1 sshd-session[215717]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:49:42 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Jan 21 23:49:42 compute-1 systemd-logind[796]: Session 33 logged out. Waiting for processes to exit.
Jan 21 23:49:42 compute-1 systemd-logind[796]: Removed session 33.
Jan 21 23:49:42 compute-1 nova_compute[182713]: 2026-01-21 23:49:42.560 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:42 compute-1 sshd-session[215739]: Accepted publickey for nova from 192.168.122.102 port 43412 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:49:42 compute-1 systemd-logind[796]: New session 35 of user nova.
Jan 21 23:49:42 compute-1 systemd[1]: Started Session 35 of User nova.
Jan 21 23:49:42 compute-1 sshd-session[215739]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:49:42 compute-1 sshd-session[215748]: Received disconnect from 192.168.122.102 port 43412:11: disconnected by user
Jan 21 23:49:42 compute-1 sshd-session[215748]: Disconnected from user nova 192.168.122.102 port 43412
Jan 21 23:49:42 compute-1 podman[215742]: 2026-01-21 23:49:42.724630494 +0000 UTC m=+0.066659900 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=openstack_network_exporter, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., release=1755695350)
Jan 21 23:49:42 compute-1 sshd-session[215739]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:49:42 compute-1 systemd[1]: session-35.scope: Deactivated successfully.
Jan 21 23:49:42 compute-1 systemd-logind[796]: Session 35 logged out. Waiting for processes to exit.
Jan 21 23:49:42 compute-1 systemd-logind[796]: Removed session 35.
Jan 21 23:49:43 compute-1 nova_compute[182713]: 2026-01-21 23:49:43.664 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039368.66254, c5403455-52cf-4717-b07a-49f01c2ed814 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:43 compute-1 nova_compute[182713]: 2026-01-21 23:49:43.665 182717 INFO nova.compute.manager [-] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] VM Stopped (Lifecycle Event)
Jan 21 23:49:43 compute-1 nova_compute[182713]: 2026-01-21 23:49:43.901 182717 DEBUG nova.compute.manager [None req-e12af0fc-0f75-444a-a7f6-b827e76b1ff1 - - - - - -] [instance: c5403455-52cf-4717-b07a-49f01c2ed814] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:45 compute-1 nova_compute[182713]: 2026-01-21 23:49:45.391 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:47 compute-1 nova_compute[182713]: 2026-01-21 23:49:47.565 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:47 compute-1 nova_compute[182713]: 2026-01-21 23:49:47.904 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039372.9028773, fb3b64cb-7a89-4d0b-b821-db928d77b940 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:47 compute-1 nova_compute[182713]: 2026-01-21 23:49:47.905 182717 INFO nova.compute.manager [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] VM Stopped (Lifecycle Event)
Jan 21 23:49:47 compute-1 nova_compute[182713]: 2026-01-21 23:49:47.942 182717 DEBUG nova.compute.manager [None req-56de3008-307d-4c4f-910a-8a6b5ab1980f - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:50 compute-1 nova_compute[182713]: 2026-01-21 23:49:50.392 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:50 compute-1 podman[215768]: 2026-01-21 23:49:50.605432236 +0000 UTC m=+0.085475074 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:49:50 compute-1 podman[215767]: 2026-01-21 23:49:50.612284135 +0000 UTC m=+0.105505466 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 21 23:49:51 compute-1 nova_compute[182713]: 2026-01-21 23:49:51.473 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "092ae3ba-e79b-47fc-b3eb-a0fb61865b6f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:51 compute-1 nova_compute[182713]: 2026-01-21 23:49:51.475 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "092ae3ba-e79b-47fc-b3eb-a0fb61865b6f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:51 compute-1 nova_compute[182713]: 2026-01-21 23:49:51.475 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "092ae3ba-e79b-47fc-b3eb-a0fb61865b6f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:51 compute-1 nova_compute[182713]: 2026-01-21 23:49:51.475 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "092ae3ba-e79b-47fc-b3eb-a0fb61865b6f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:51 compute-1 nova_compute[182713]: 2026-01-21 23:49:51.475 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "092ae3ba-e79b-47fc-b3eb-a0fb61865b6f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:51 compute-1 nova_compute[182713]: 2026-01-21 23:49:51.491 182717 INFO nova.compute.manager [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Terminating instance
Jan 21 23:49:51 compute-1 nova_compute[182713]: 2026-01-21 23:49:51.501 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "refresh_cache-092ae3ba-e79b-47fc-b3eb-a0fb61865b6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:51 compute-1 nova_compute[182713]: 2026-01-21 23:49:51.501 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquired lock "refresh_cache-092ae3ba-e79b-47fc-b3eb-a0fb61865b6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:51 compute-1 nova_compute[182713]: 2026-01-21 23:49:51.502 182717 DEBUG nova.network.neutron [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:49:51 compute-1 nova_compute[182713]: 2026-01-21 23:49:51.743 182717 DEBUG nova.network.neutron [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:52 compute-1 nova_compute[182713]: 2026-01-21 23:49:52.569 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:52 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 23:49:52 compute-1 systemd[215721]: Activating special unit Exit the Session...
Jan 21 23:49:52 compute-1 systemd[215721]: Stopped target Main User Target.
Jan 21 23:49:52 compute-1 systemd[215721]: Stopped target Basic System.
Jan 21 23:49:52 compute-1 systemd[215721]: Stopped target Paths.
Jan 21 23:49:52 compute-1 systemd[215721]: Stopped target Sockets.
Jan 21 23:49:52 compute-1 systemd[215721]: Stopped target Timers.
Jan 21 23:49:52 compute-1 systemd[215721]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:49:52 compute-1 systemd[215721]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:49:52 compute-1 systemd[215721]: Closed D-Bus User Message Bus Socket.
Jan 21 23:49:52 compute-1 systemd[215721]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:49:52 compute-1 systemd[215721]: Removed slice User Application Slice.
Jan 21 23:49:52 compute-1 systemd[215721]: Reached target Shutdown.
Jan 21 23:49:52 compute-1 systemd[215721]: Finished Exit the Session.
Jan 21 23:49:52 compute-1 systemd[215721]: Reached target Exit the Session.
Jan 21 23:49:53 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 23:49:53 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 23:49:53 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 23:49:53 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 23:49:53 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 23:49:53 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 23:49:53 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 23:49:53 compute-1 nova_compute[182713]: 2026-01-21 23:49:53.254 182717 DEBUG nova.network.neutron [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:53 compute-1 nova_compute[182713]: 2026-01-21 23:49:53.297 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Releasing lock "refresh_cache-092ae3ba-e79b-47fc-b3eb-a0fb61865b6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:53 compute-1 nova_compute[182713]: 2026-01-21 23:49:53.303 182717 DEBUG nova.compute.manager [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:49:53 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000020.scope: Deactivated successfully.
Jan 21 23:49:53 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000020.scope: Consumed 14.303s CPU time.
Jan 21 23:49:53 compute-1 systemd-machined[153970]: Machine qemu-17-instance-00000020 terminated.
Jan 21 23:49:53 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 21 23:49:53 compute-1 nova_compute[182713]: 2026-01-21 23:49:53.565 182717 INFO nova.virt.libvirt.driver [-] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Instance destroyed successfully.
Jan 21 23:49:53 compute-1 nova_compute[182713]: 2026-01-21 23:49:53.566 182717 DEBUG nova.objects.instance [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lazy-loading 'resources' on Instance uuid 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:53 compute-1 nova_compute[182713]: 2026-01-21 23:49:53.582 182717 INFO nova.virt.libvirt.driver [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Deleting instance files /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f_del
Jan 21 23:49:53 compute-1 nova_compute[182713]: 2026-01-21 23:49:53.583 182717 INFO nova.virt.libvirt.driver [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Deletion of /var/lib/nova/instances/092ae3ba-e79b-47fc-b3eb-a0fb61865b6f_del complete
Jan 21 23:49:53 compute-1 nova_compute[182713]: 2026-01-21 23:49:53.656 182717 INFO nova.compute.manager [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 21 23:49:53 compute-1 nova_compute[182713]: 2026-01-21 23:49:53.657 182717 DEBUG oslo.service.loopingcall [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:49:53 compute-1 nova_compute[182713]: 2026-01-21 23:49:53.657 182717 DEBUG nova.compute.manager [-] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:49:53 compute-1 nova_compute[182713]: 2026-01-21 23:49:53.658 182717 DEBUG nova.network.neutron [-] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:49:54 compute-1 nova_compute[182713]: 2026-01-21 23:49:54.706 182717 DEBUG nova.network.neutron [-] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:54 compute-1 nova_compute[182713]: 2026-01-21 23:49:54.724 182717 DEBUG nova.network.neutron [-] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:54 compute-1 nova_compute[182713]: 2026-01-21 23:49:54.743 182717 INFO nova.compute.manager [-] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Took 1.09 seconds to deallocate network for instance.
Jan 21 23:49:54 compute-1 nova_compute[182713]: 2026-01-21 23:49:54.838 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:54 compute-1 nova_compute[182713]: 2026-01-21 23:49:54.839 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:54 compute-1 nova_compute[182713]: 2026-01-21 23:49:54.951 182717 DEBUG nova.compute.provider_tree [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:54 compute-1 nova_compute[182713]: 2026-01-21 23:49:54.976 182717 DEBUG nova.scheduler.client.report [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:55 compute-1 nova_compute[182713]: 2026-01-21 23:49:55.008 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:55 compute-1 nova_compute[182713]: 2026-01-21 23:49:55.046 182717 INFO nova.scheduler.client.report [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Deleted allocations for instance 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f
Jan 21 23:49:55 compute-1 nova_compute[182713]: 2026-01-21 23:49:55.168 182717 DEBUG oslo_concurrency.lockutils [None req-2d8afcb4-c1d8-475c-8d66-8cda50215cba abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "092ae3ba-e79b-47fc-b3eb-a0fb61865b6f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:55 compute-1 nova_compute[182713]: 2026-01-21 23:49:55.394 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:56 compute-1 sshd-session[215829]: Accepted publickey for nova from 192.168.122.102 port 47564 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:49:56 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 23:49:56 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 23:49:56 compute-1 systemd-logind[796]: New session 36 of user nova.
Jan 21 23:49:56 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 23:49:56 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 21 23:49:56 compute-1 systemd[215833]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:49:56 compute-1 systemd[215833]: Queued start job for default target Main User Target.
Jan 21 23:49:56 compute-1 systemd[215833]: Created slice User Application Slice.
Jan 21 23:49:56 compute-1 systemd[215833]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:49:56 compute-1 systemd[215833]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:49:56 compute-1 systemd[215833]: Reached target Paths.
Jan 21 23:49:56 compute-1 systemd[215833]: Reached target Timers.
Jan 21 23:49:56 compute-1 systemd[215833]: Starting D-Bus User Message Bus Socket...
Jan 21 23:49:56 compute-1 systemd[215833]: Starting Create User's Volatile Files and Directories...
Jan 21 23:49:56 compute-1 systemd[215833]: Finished Create User's Volatile Files and Directories.
Jan 21 23:49:56 compute-1 systemd[215833]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:49:56 compute-1 systemd[215833]: Reached target Sockets.
Jan 21 23:49:56 compute-1 systemd[215833]: Reached target Basic System.
Jan 21 23:49:56 compute-1 systemd[215833]: Reached target Main User Target.
Jan 21 23:49:56 compute-1 systemd[215833]: Startup finished in 150ms.
Jan 21 23:49:56 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 21 23:49:56 compute-1 systemd[1]: Started Session 36 of User nova.
Jan 21 23:49:56 compute-1 sshd-session[215829]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:49:56 compute-1 sshd-session[215848]: Received disconnect from 192.168.122.102 port 47564:11: disconnected by user
Jan 21 23:49:56 compute-1 sshd-session[215848]: Disconnected from user nova 192.168.122.102 port 47564
Jan 21 23:49:56 compute-1 sshd-session[215829]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:49:56 compute-1 systemd[1]: session-36.scope: Deactivated successfully.
Jan 21 23:49:56 compute-1 systemd-logind[796]: Session 36 logged out. Waiting for processes to exit.
Jan 21 23:49:56 compute-1 systemd-logind[796]: Removed session 36.
Jan 21 23:49:57 compute-1 sshd-session[215850]: Accepted publickey for nova from 192.168.122.102 port 47578 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:49:57 compute-1 systemd-logind[796]: New session 38 of user nova.
Jan 21 23:49:57 compute-1 systemd[1]: Started Session 38 of User nova.
Jan 21 23:49:57 compute-1 sshd-session[215850]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:49:57 compute-1 sshd-session[215853]: Received disconnect from 192.168.122.102 port 47578:11: disconnected by user
Jan 21 23:49:57 compute-1 sshd-session[215853]: Disconnected from user nova 192.168.122.102 port 47578
Jan 21 23:49:57 compute-1 sshd-session[215850]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:49:57 compute-1 systemd[1]: session-38.scope: Deactivated successfully.
Jan 21 23:49:57 compute-1 systemd-logind[796]: Session 38 logged out. Waiting for processes to exit.
Jan 21 23:49:57 compute-1 systemd-logind[796]: Removed session 38.
Jan 21 23:49:57 compute-1 sshd-session[215855]: Accepted publickey for nova from 192.168.122.102 port 47594 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:49:57 compute-1 systemd-logind[796]: New session 39 of user nova.
Jan 21 23:49:57 compute-1 systemd[1]: Started Session 39 of User nova.
Jan 21 23:49:57 compute-1 sshd-session[215855]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:49:57 compute-1 sshd-session[215858]: Received disconnect from 192.168.122.102 port 47594:11: disconnected by user
Jan 21 23:49:57 compute-1 sshd-session[215858]: Disconnected from user nova 192.168.122.102 port 47594
Jan 21 23:49:57 compute-1 sshd-session[215855]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:49:57 compute-1 systemd[1]: session-39.scope: Deactivated successfully.
Jan 21 23:49:57 compute-1 systemd-logind[796]: Session 39 logged out. Waiting for processes to exit.
Jan 21 23:49:57 compute-1 systemd-logind[796]: Removed session 39.
Jan 21 23:49:57 compute-1 nova_compute[182713]: 2026-01-21 23:49:57.572 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:57 compute-1 nova_compute[182713]: 2026-01-21 23:49:57.981 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:57 compute-1 nova_compute[182713]: 2026-01-21 23:49:57.982 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquired lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:57 compute-1 nova_compute[182713]: 2026-01-21 23:49:57.982 182717 DEBUG nova.network.neutron [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.140 182717 DEBUG nova.network.neutron [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.418 182717 DEBUG nova.network.neutron [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.436 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Releasing lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.578 182717 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.580 182717 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.581 182717 INFO nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Creating image(s)
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.582 182717 DEBUG nova.objects.instance [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:58 compute-1 podman[215860]: 2026-01-21 23:49:58.610141413 +0000 UTC m=+0.086561988 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.612 182717 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:58 compute-1 podman[215861]: 2026-01-21 23:49:58.624943218 +0000 UTC m=+0.090568457 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.696 182717 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.697 182717 DEBUG nova.virt.disk.api [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Checking if we can resize image /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.697 182717 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.757 182717 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.759 182717 DEBUG nova.virt.disk.api [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Cannot resize image /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.775 182717 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.776 182717 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Ensure instance console log exists: /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.777 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.778 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.778 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.781 182717 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.792 182717 WARNING nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.798 182717 DEBUG nova.virt.libvirt.host [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.799 182717 DEBUG nova.virt.libvirt.host [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.802 182717 DEBUG nova.virt.libvirt.host [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.803 182717 DEBUG nova.virt.libvirt.host [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.805 182717 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.806 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.807 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.807 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.808 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.808 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.809 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.809 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.810 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.810 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.811 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.811 182717 DEBUG nova.virt.hardware [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.812 182717 DEBUG nova.objects.instance [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.834 182717 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.932 182717 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.934 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.934 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.935 182717 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:58 compute-1 nova_compute[182713]: 2026-01-21 23:49:58.938 182717 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:49:58 compute-1 nova_compute[182713]:   <uuid>0f91ac3a-2383-45bf-94b7-631c1737e936</uuid>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   <name>instance-00000023</name>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <nova:name>tempest-MigrationsAdminTest-server-1788036298</nova:name>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:49:58</nova:creationTime>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:49:58 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:49:58 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:49:58 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:49:58 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:49:58 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:49:58 compute-1 nova_compute[182713]:         <nova:user uuid="36d71830ce70436e97fbc17b6da8d3c6">tempest-MigrationsAdminTest-1559502816-project-member</nova:user>
Jan 21 23:49:58 compute-1 nova_compute[182713]:         <nova:project uuid="95574103d0094883861c58d01690e5a3">tempest-MigrationsAdminTest-1559502816</nova:project>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <system>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <entry name="serial">0f91ac3a-2383-45bf-94b7-631c1737e936</entry>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <entry name="uuid">0f91ac3a-2383-45bf-94b7-631c1737e936</entry>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     </system>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   <os>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   </os>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   <features>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   </features>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/console.log" append="off"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <video>
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     </video>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:49:58 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:49:58 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:49:58 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:49:58 compute-1 nova_compute[182713]: </domain>
Jan 21 23:49:58 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.004 182717 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.004 182717 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.005 182717 INFO nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Using config drive
Jan 21 23:49:59 compute-1 systemd-machined[153970]: New machine qemu-19-instance-00000023.
Jan 21 23:49:59 compute-1 systemd[1]: Started Virtual Machine qemu-19-instance-00000023.
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.613 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039399.6126285, 0f91ac3a-2383-45bf-94b7-631c1737e936 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.613 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] VM Resumed (Lifecycle Event)
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.616 182717 DEBUG nova.compute.manager [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.620 182717 INFO nova.virt.libvirt.driver [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance running successfully.
Jan 21 23:49:59 compute-1 virtqemud[182235]: argument unsupported: QEMU guest agent is not configured
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.624 182717 DEBUG nova.virt.libvirt.guest [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.624 182717 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.639 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.642 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.682 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.682 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039399.6136758, 0f91ac3a-2383-45bf-94b7-631c1737e936 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.683 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] VM Started (Lifecycle Event)
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.714 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:59 compute-1 nova_compute[182713]: 2026-01-21 23:49:59.719 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:50:00 compute-1 nova_compute[182713]: 2026-01-21 23:50:00.397 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:01 compute-1 nova_compute[182713]: 2026-01-21 23:50:01.623 182717 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:01 compute-1 nova_compute[182713]: 2026-01-21 23:50:01.623 182717 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:50:01 compute-1 nova_compute[182713]: 2026-01-21 23:50:01.624 182717 DEBUG nova.network.neutron [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:50:01 compute-1 nova_compute[182713]: 2026-01-21 23:50:01.860 182717 DEBUG nova.network.neutron [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:50:02 compute-1 nova_compute[182713]: 2026-01-21 23:50:02.457 182717 DEBUG nova.network.neutron [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:02 compute-1 nova_compute[182713]: 2026-01-21 23:50:02.477 182717 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:50:02 compute-1 nova_compute[182713]: 2026-01-21 23:50:02.495 182717 DEBUG nova.virt.libvirt.driver [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Creating tmpfile /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/tmpyirnid9c to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Jan 21 23:50:02 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000023.scope: Deactivated successfully.
Jan 21 23:50:02 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000023.scope: Consumed 3.462s CPU time.
Jan 21 23:50:02 compute-1 systemd-machined[153970]: Machine qemu-19-instance-00000023 terminated.
Jan 21 23:50:02 compute-1 nova_compute[182713]: 2026-01-21 23:50:02.576 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:02 compute-1 nova_compute[182713]: 2026-01-21 23:50:02.777 182717 INFO nova.virt.libvirt.driver [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance destroyed successfully.
Jan 21 23:50:02 compute-1 nova_compute[182713]: 2026-01-21 23:50:02.778 182717 DEBUG nova.objects.instance [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'resources' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:02 compute-1 nova_compute[182713]: 2026-01-21 23:50:02.796 182717 INFO nova.virt.libvirt.driver [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Deleting instance files /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_del
Jan 21 23:50:02 compute-1 nova_compute[182713]: 2026-01-21 23:50:02.809 182717 INFO nova.virt.libvirt.driver [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Deletion of /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_del complete
Jan 21 23:50:02 compute-1 nova_compute[182713]: 2026-01-21 23:50:02.912 182717 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:02 compute-1 nova_compute[182713]: 2026-01-21 23:50:02.913 182717 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:02 compute-1 nova_compute[182713]: 2026-01-21 23:50:02.946 182717 DEBUG nova.objects.instance [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:02.998 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:02 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:02.999 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:02.999 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:03 compute-1 nova_compute[182713]: 2026-01-21 23:50:03.022 182717 DEBUG nova.compute.provider_tree [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:50:03 compute-1 nova_compute[182713]: 2026-01-21 23:50:03.041 182717 DEBUG nova.scheduler.client.report [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:50:03 compute-1 nova_compute[182713]: 2026-01-21 23:50:03.120 182717 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:05 compute-1 nova_compute[182713]: 2026-01-21 23:50:05.396 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:07 compute-1 nova_compute[182713]: 2026-01-21 23:50:07.579 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:07 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 23:50:07 compute-1 systemd[215833]: Activating special unit Exit the Session...
Jan 21 23:50:07 compute-1 systemd[215833]: Stopped target Main User Target.
Jan 21 23:50:07 compute-1 systemd[215833]: Stopped target Basic System.
Jan 21 23:50:07 compute-1 systemd[215833]: Stopped target Paths.
Jan 21 23:50:07 compute-1 systemd[215833]: Stopped target Sockets.
Jan 21 23:50:07 compute-1 systemd[215833]: Stopped target Timers.
Jan 21 23:50:07 compute-1 systemd[215833]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:50:07 compute-1 systemd[215833]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:50:07 compute-1 systemd[215833]: Closed D-Bus User Message Bus Socket.
Jan 21 23:50:07 compute-1 systemd[215833]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:50:07 compute-1 systemd[215833]: Removed slice User Application Slice.
Jan 21 23:50:07 compute-1 systemd[215833]: Reached target Shutdown.
Jan 21 23:50:07 compute-1 systemd[215833]: Finished Exit the Session.
Jan 21 23:50:07 compute-1 systemd[215833]: Reached target Exit the Session.
Jan 21 23:50:07 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 23:50:07 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 23:50:07 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 23:50:07 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 23:50:07 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 23:50:07 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 23:50:07 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 23:50:08 compute-1 nova_compute[182713]: 2026-01-21 23:50:08.561 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039393.5598166, 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:08 compute-1 nova_compute[182713]: 2026-01-21 23:50:08.562 182717 INFO nova.compute.manager [-] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] VM Stopped (Lifecycle Event)
Jan 21 23:50:08 compute-1 nova_compute[182713]: 2026-01-21 23:50:08.597 182717 DEBUG nova.compute.manager [None req-769fa53b-7d48-46a9-89ec-3b969df3bb06 - - - - - -] [instance: 092ae3ba-e79b-47fc-b3eb-a0fb61865b6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:10 compute-1 nova_compute[182713]: 2026-01-21 23:50:10.399 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:10 compute-1 podman[215947]: 2026-01-21 23:50:10.623552331 +0000 UTC m=+0.103497929 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:50:12 compute-1 nova_compute[182713]: 2026-01-21 23:50:12.583 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:13 compute-1 podman[215967]: 2026-01-21 23:50:13.618434068 +0000 UTC m=+0.101851347 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 23:50:15 compute-1 nova_compute[182713]: 2026-01-21 23:50:15.400 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:17 compute-1 nova_compute[182713]: 2026-01-21 23:50:17.587 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:17 compute-1 nova_compute[182713]: 2026-01-21 23:50:17.777 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039402.7735767, 0f91ac3a-2383-45bf-94b7-631c1737e936 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:17 compute-1 nova_compute[182713]: 2026-01-21 23:50:17.777 182717 INFO nova.compute.manager [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] VM Stopped (Lifecycle Event)
Jan 21 23:50:17 compute-1 nova_compute[182713]: 2026-01-21 23:50:17.802 182717 DEBUG nova.compute.manager [None req-0f00d337-896d-43f7-a199-9034f96e09a9 - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:20 compute-1 nova_compute[182713]: 2026-01-21 23:50:20.402 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:20 compute-1 nova_compute[182713]: 2026-01-21 23:50:20.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:20 compute-1 nova_compute[182713]: 2026-01-21 23:50:20.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:20 compute-1 nova_compute[182713]: 2026-01-21 23:50:20.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:50:21 compute-1 podman[215990]: 2026-01-21 23:50:21.617538411 +0000 UTC m=+0.092211812 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:50:21 compute-1 podman[215989]: 2026-01-21 23:50:21.660176512 +0000 UTC m=+0.141973208 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:50:22 compute-1 nova_compute[182713]: 2026-01-21 23:50:22.588 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:22 compute-1 nova_compute[182713]: 2026-01-21 23:50:22.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:22 compute-1 nova_compute[182713]: 2026-01-21 23:50:22.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.872 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.872 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:50:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-1 nova_compute[182713]: 2026-01-21 23:50:23.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:23 compute-1 nova_compute[182713]: 2026-01-21 23:50:23.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:24 compute-1 nova_compute[182713]: 2026-01-21 23:50:24.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:24 compute-1 nova_compute[182713]: 2026-01-21 23:50:24.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:50:24 compute-1 nova_compute[182713]: 2026-01-21 23:50:24.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:50:24 compute-1 nova_compute[182713]: 2026-01-21 23:50:24.882 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:50:24 compute-1 nova_compute[182713]: 2026-01-21 23:50:24.883 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:24 compute-1 nova_compute[182713]: 2026-01-21 23:50:24.884 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:24 compute-1 nova_compute[182713]: 2026-01-21 23:50:24.911 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:24 compute-1 nova_compute[182713]: 2026-01-21 23:50:24.912 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:24 compute-1 nova_compute[182713]: 2026-01-21 23:50:24.913 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:24 compute-1 nova_compute[182713]: 2026-01-21 23:50:24.913 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:50:25 compute-1 nova_compute[182713]: 2026-01-21 23:50:25.181 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:50:25 compute-1 nova_compute[182713]: 2026-01-21 23:50:25.183 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5703MB free_disk=73.31353759765625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:50:25 compute-1 nova_compute[182713]: 2026-01-21 23:50:25.183 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:25 compute-1 nova_compute[182713]: 2026-01-21 23:50:25.183 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:25 compute-1 nova_compute[182713]: 2026-01-21 23:50:25.275 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:50:25 compute-1 nova_compute[182713]: 2026-01-21 23:50:25.276 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:50:25 compute-1 nova_compute[182713]: 2026-01-21 23:50:25.307 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:50:25 compute-1 nova_compute[182713]: 2026-01-21 23:50:25.327 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:50:25 compute-1 nova_compute[182713]: 2026-01-21 23:50:25.352 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:50:25 compute-1 nova_compute[182713]: 2026-01-21 23:50:25.353 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:25 compute-1 nova_compute[182713]: 2026-01-21 23:50:25.438 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:27 compute-1 nova_compute[182713]: 2026-01-21 23:50:27.641 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:29 compute-1 podman[216042]: 2026-01-21 23:50:29.587418058 +0000 UTC m=+0.070198375 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:50:29 compute-1 podman[216041]: 2026-01-21 23:50:29.614378352 +0000 UTC m=+0.094047530 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 23:50:30 compute-1 nova_compute[182713]: 2026-01-21 23:50:30.440 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:32 compute-1 nova_compute[182713]: 2026-01-21 23:50:32.693 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.443 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.463 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "ce29d6fa-fbbb-4352-b243-5af960b17123" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.463 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.496 182717 DEBUG nova.compute.manager [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.623 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.623 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.629 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.630 182717 INFO nova.compute.claims [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.789 182717 DEBUG nova.compute.provider_tree [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.806 182717 DEBUG nova.scheduler.client.report [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.827 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.828 182717 DEBUG nova.compute.manager [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.909 182717 DEBUG nova.compute.manager [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.910 182717 DEBUG nova.network.neutron [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.949 182717 INFO nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:50:35 compute-1 nova_compute[182713]: 2026-01-21 23:50:35.970 182717 DEBUG nova.compute.manager [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.095 182717 DEBUG nova.compute.manager [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.097 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.098 182717 INFO nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Creating image(s)
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.099 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "/var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.099 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "/var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.101 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "/var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.129 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.220 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.222 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.223 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.250 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.322 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.324 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.366 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.368 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.369 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.423 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.425 182717 DEBUG nova.virt.disk.api [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Checking if we can resize image /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.425 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.473 182717 DEBUG nova.policy [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8da2db8893d4442aaaada7d43ff2500f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bdcd24bf916b4c3aa2e173bea9dd7202', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.479 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.480 182717 DEBUG nova.virt.disk.api [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Cannot resize image /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.480 182717 DEBUG nova.objects.instance [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lazy-loading 'migration_context' on Instance uuid ce29d6fa-fbbb-4352-b243-5af960b17123 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.498 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.499 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Ensure instance console log exists: /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.500 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.500 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:36 compute-1 nova_compute[182713]: 2026-01-21 23:50:36.500 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:37 compute-1 nova_compute[182713]: 2026-01-21 23:50:37.408 182717 DEBUG nova.network.neutron [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Successfully created port: 6a31fa49-995e-4526-9e75-1973bd7f492b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:50:37 compute-1 nova_compute[182713]: 2026-01-21 23:50:37.696 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:38 compute-1 nova_compute[182713]: 2026-01-21 23:50:38.803 182717 DEBUG nova.network.neutron [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Successfully updated port: 6a31fa49-995e-4526-9e75-1973bd7f492b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:50:38 compute-1 nova_compute[182713]: 2026-01-21 23:50:38.820 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:38 compute-1 nova_compute[182713]: 2026-01-21 23:50:38.821 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquired lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:50:38 compute-1 nova_compute[182713]: 2026-01-21 23:50:38.821 182717 DEBUG nova.network.neutron [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:50:38 compute-1 nova_compute[182713]: 2026-01-21 23:50:38.911 182717 DEBUG nova.compute.manager [req-226a5717-f8e6-4fae-a333-e13748c24b39 req-dc17613e-0021-4a45-bdef-848caf65caa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received event network-changed-6a31fa49-995e-4526-9e75-1973bd7f492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:50:38 compute-1 nova_compute[182713]: 2026-01-21 23:50:38.911 182717 DEBUG nova.compute.manager [req-226a5717-f8e6-4fae-a333-e13748c24b39 req-dc17613e-0021-4a45-bdef-848caf65caa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Refreshing instance network info cache due to event network-changed-6a31fa49-995e-4526-9e75-1973bd7f492b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:50:38 compute-1 nova_compute[182713]: 2026-01-21 23:50:38.912 182717 DEBUG oslo_concurrency.lockutils [req-226a5717-f8e6-4fae-a333-e13748c24b39 req-dc17613e-0021-4a45-bdef-848caf65caa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:39 compute-1 nova_compute[182713]: 2026-01-21 23:50:39.013 182717 DEBUG nova.network.neutron [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.322 182717 DEBUG nova.network.neutron [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updating instance_info_cache with network_info: [{"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.347 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Releasing lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.348 182717 DEBUG nova.compute.manager [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Instance network_info: |[{"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.349 182717 DEBUG oslo_concurrency.lockutils [req-226a5717-f8e6-4fae-a333-e13748c24b39 req-dc17613e-0021-4a45-bdef-848caf65caa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.349 182717 DEBUG nova.network.neutron [req-226a5717-f8e6-4fae-a333-e13748c24b39 req-dc17613e-0021-4a45-bdef-848caf65caa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Refreshing network info cache for port 6a31fa49-995e-4526-9e75-1973bd7f492b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.354 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Start _get_guest_xml network_info=[{"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.362 182717 WARNING nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.372 182717 DEBUG nova.virt.libvirt.host [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.373 182717 DEBUG nova.virt.libvirt.host [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.382 182717 DEBUG nova.virt.libvirt.host [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.383 182717 DEBUG nova.virt.libvirt.host [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.385 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.386 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.387 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.387 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.387 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.388 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.388 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.389 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.389 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.389 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.390 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.390 182717 DEBUG nova.virt.hardware [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.398 182717 DEBUG nova.virt.libvirt.vif [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:50:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-72653518',display_name='tempest-FloatingIPsAssociationTestJSON-server-72653518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-72653518',id=38,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bdcd24bf916b4c3aa2e173bea9dd7202',ramdisk_id='',reservation_id='r-m7r7zlfx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1164348821',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1164348821-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:50:36Z,user_data=None,user_id='8da2db8893d4442aaaada7d43ff2500f',uuid=ce29d6fa-fbbb-4352-b243-5af960b17123,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.399 182717 DEBUG nova.network.os_vif_util [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converting VIF {"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.400 182717 DEBUG nova.network.os_vif_util [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:85:15,bridge_name='br-int',has_traffic_filtering=True,id=6a31fa49-995e-4526-9e75-1973bd7f492b,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a31fa49-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.402 182717 DEBUG nova.objects.instance [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lazy-loading 'pci_devices' on Instance uuid ce29d6fa-fbbb-4352-b243-5af960b17123 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.423 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:50:40 compute-1 nova_compute[182713]:   <uuid>ce29d6fa-fbbb-4352-b243-5af960b17123</uuid>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   <name>instance-00000026</name>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <nova:name>tempest-FloatingIPsAssociationTestJSON-server-72653518</nova:name>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:50:40</nova:creationTime>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:50:40 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:50:40 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:50:40 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:50:40 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:50:40 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:50:40 compute-1 nova_compute[182713]:         <nova:user uuid="8da2db8893d4442aaaada7d43ff2500f">tempest-FloatingIPsAssociationTestJSON-1164348821-project-member</nova:user>
Jan 21 23:50:40 compute-1 nova_compute[182713]:         <nova:project uuid="bdcd24bf916b4c3aa2e173bea9dd7202">tempest-FloatingIPsAssociationTestJSON-1164348821</nova:project>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:50:40 compute-1 nova_compute[182713]:         <nova:port uuid="6a31fa49-995e-4526-9e75-1973bd7f492b">
Jan 21 23:50:40 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <system>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <entry name="serial">ce29d6fa-fbbb-4352-b243-5af960b17123</entry>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <entry name="uuid">ce29d6fa-fbbb-4352-b243-5af960b17123</entry>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     </system>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   <os>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   </os>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   <features>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   </features>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk.config"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:46:85:15"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <target dev="tap6a31fa49-99"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/console.log" append="off"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <video>
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     </video>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:50:40 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:50:40 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:50:40 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:50:40 compute-1 nova_compute[182713]: </domain>
Jan 21 23:50:40 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.425 182717 DEBUG nova.compute.manager [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Preparing to wait for external event network-vif-plugged-6a31fa49-995e-4526-9e75-1973bd7f492b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.426 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.426 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.426 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.427 182717 DEBUG nova.virt.libvirt.vif [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:50:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-72653518',display_name='tempest-FloatingIPsAssociationTestJSON-server-72653518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-72653518',id=38,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bdcd24bf916b4c3aa2e173bea9dd7202',ramdisk_id='',reservation_id='r-m7r7zlfx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1164348821',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1164348821-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:50:36Z,user_data=None,user_id='8da2db8893d4442aaaada7d43ff2500f',uuid=ce29d6fa-fbbb-4352-b243-5af960b17123,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.427 182717 DEBUG nova.network.os_vif_util [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converting VIF {"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.428 182717 DEBUG nova.network.os_vif_util [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:85:15,bridge_name='br-int',has_traffic_filtering=True,id=6a31fa49-995e-4526-9e75-1973bd7f492b,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a31fa49-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.429 182717 DEBUG os_vif [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:85:15,bridge_name='br-int',has_traffic_filtering=True,id=6a31fa49-995e-4526-9e75-1973bd7f492b,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a31fa49-99') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.429 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.430 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.431 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.439 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.439 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a31fa49-99, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.440 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6a31fa49-99, col_values=(('external_ids', {'iface-id': '6a31fa49-995e-4526-9e75-1973bd7f492b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:85:15', 'vm-uuid': 'ce29d6fa-fbbb-4352-b243-5af960b17123'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.441 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:40 compute-1 NetworkManager[54952]: <info>  [1769039440.4426] manager: (tap6a31fa49-99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.444 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.451 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.452 182717 INFO os_vif [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:85:15,bridge_name='br-int',has_traffic_filtering=True,id=6a31fa49-995e-4526-9e75-1973bd7f492b,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a31fa49-99')
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.453 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.525 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.525 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.525 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] No VIF found with MAC fa:16:3e:46:85:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:50:40 compute-1 nova_compute[182713]: 2026-01-21 23:50:40.526 182717 INFO nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Using config drive
Jan 21 23:50:41 compute-1 nova_compute[182713]: 2026-01-21 23:50:41.139 182717 INFO nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Creating config drive at /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk.config
Jan 21 23:50:41 compute-1 nova_compute[182713]: 2026-01-21 23:50:41.154 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnytks7mt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:41 compute-1 nova_compute[182713]: 2026-01-21 23:50:41.288 182717 DEBUG oslo_concurrency.processutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnytks7mt" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:41 compute-1 kernel: tap6a31fa49-99: entered promiscuous mode
Jan 21 23:50:41 compute-1 NetworkManager[54952]: <info>  [1769039441.3850] manager: (tap6a31fa49-99): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Jan 21 23:50:41 compute-1 ovn_controller[94841]: 2026-01-21T23:50:41Z|00109|binding|INFO|Claiming lport 6a31fa49-995e-4526-9e75-1973bd7f492b for this chassis.
Jan 21 23:50:41 compute-1 ovn_controller[94841]: 2026-01-21T23:50:41Z|00110|binding|INFO|6a31fa49-995e-4526-9e75-1973bd7f492b: Claiming fa:16:3e:46:85:15 10.100.0.13
Jan 21 23:50:41 compute-1 nova_compute[182713]: 2026-01-21 23:50:41.388 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:41 compute-1 nova_compute[182713]: 2026-01-21 23:50:41.395 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.407 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:85:15 10.100.0.13'], port_security=['fa:16:3e:46:85:15 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ce29d6fa-fbbb-4352-b243-5af960b17123', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdcd24bf916b4c3aa2e173bea9dd7202', 'neutron:revision_number': '2', 'neutron:security_group_ids': '288c5115-ef70-4922-9c68-a1234762984e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e09721f1-c960-4ea4-8636-beb23b3dfb25, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=6a31fa49-995e-4526-9e75-1973bd7f492b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.409 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 6a31fa49-995e-4526-9e75-1973bd7f492b in datapath b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 bound to our chassis
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.413 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b94414b2-c7ed-4d1b-b462-f41cb84cbcd8
Jan 21 23:50:41 compute-1 systemd-udevd[216128]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.435 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ce09d8d1-b185-4c2a-a059-14a80975b260]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.437 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb94414b2-c1 in ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:50:41 compute-1 systemd-machined[153970]: New machine qemu-20-instance-00000026.
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.440 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb94414b2-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.441 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d2566f8e-b899-41ab-a7ca-35e3234abb3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.442 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d9d139-65ac-48de-98be-65dc06c7c33c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 NetworkManager[54952]: <info>  [1769039441.4505] device (tap6a31fa49-99): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:50:41 compute-1 NetworkManager[54952]: <info>  [1769039441.4524] device (tap6a31fa49-99): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.462 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[b87ea7c3-4de1-48bb-9bde-2b5e2d744616]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 systemd[1]: Started Virtual Machine qemu-20-instance-00000026.
Jan 21 23:50:41 compute-1 nova_compute[182713]: 2026-01-21 23:50:41.476 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:41 compute-1 ovn_controller[94841]: 2026-01-21T23:50:41Z|00111|binding|INFO|Setting lport 6a31fa49-995e-4526-9e75-1973bd7f492b ovn-installed in OVS
Jan 21 23:50:41 compute-1 ovn_controller[94841]: 2026-01-21T23:50:41Z|00112|binding|INFO|Setting lport 6a31fa49-995e-4526-9e75-1973bd7f492b up in Southbound
Jan 21 23:50:41 compute-1 nova_compute[182713]: 2026-01-21 23:50:41.483 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.499 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[17c8f359-0726-4471-8c9e-ea63ede9e16b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 podman[216112]: 2026-01-21 23:50:41.502745141 +0000 UTC m=+0.125714843 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.541 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7ef88103-b7f2-404d-abb2-43a2c8fdf58b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 systemd-udevd[216132]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.551 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[973ca2bd-ea3f-45a6-8e3b-00e5f2defadc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 NetworkManager[54952]: <info>  [1769039441.5530] manager: (tapb94414b2-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.591 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[d960ae6d-324f-4367-8ab8-fdd7413ed282]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.595 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[680182e4-2fd0-493f-ac37-60f6bfa4d828]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 NetworkManager[54952]: <info>  [1769039441.6249] device (tapb94414b2-c0): carrier: link connected
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.634 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c0c98b-a921-49d5-aed2-70851534638b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.659 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8deb90cc-b8ea-4f38-9693-7bbedd8a0f0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb94414b2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:82:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401427, 'reachable_time': 34704, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216173, 'error': None, 'target': 'ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.689 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9b2e4f0a-b63e-4695-a4db-d7af51d05ad4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedf:8231'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401427, 'tstamp': 401427}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216174, 'error': None, 'target': 'ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.716 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fcfaff98-8e4a-49d4-a598-8346a2ac2a83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb94414b2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:82:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401427, 'reachable_time': 34704, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216175, 'error': None, 'target': 'ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.772 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b84b2905-90cc-4204-8f56-ecfdd8aed72d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.853 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[da1958b2-bd87-4c62-92db-bddcfc845d8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.855 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb94414b2-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.855 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.856 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb94414b2-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:41 compute-1 NetworkManager[54952]: <info>  [1769039441.8593] manager: (tapb94414b2-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 21 23:50:41 compute-1 nova_compute[182713]: 2026-01-21 23:50:41.858 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:41 compute-1 kernel: tapb94414b2-c0: entered promiscuous mode
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.866 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb94414b2-c0, col_values=(('external_ids', {'iface-id': 'e1559aec-27c5-46a3-81fc-ddeb80ee3759'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:41 compute-1 nova_compute[182713]: 2026-01-21 23:50:41.865 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:41 compute-1 ovn_controller[94841]: 2026-01-21T23:50:41Z|00113|binding|INFO|Releasing lport e1559aec-27c5-46a3-81fc-ddeb80ee3759 from this chassis (sb_readonly=0)
Jan 21 23:50:41 compute-1 nova_compute[182713]: 2026-01-21 23:50:41.869 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:41 compute-1 nova_compute[182713]: 2026-01-21 23:50:41.892 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.894 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b94414b2-c7ed-4d1b-b462-f41cb84cbcd8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b94414b2-c7ed-4d1b-b462-f41cb84cbcd8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.895 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1c254275-1155-4e66-8b60-bd58e1fafaf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.896 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/b94414b2-c7ed-4d1b-b462-f41cb84cbcd8.pid.haproxy
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID b94414b2-c7ed-4d1b-b462-f41cb84cbcd8
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:50:41 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:41.898 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'env', 'PROCESS_TAG=haproxy-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b94414b2-c7ed-4d1b-b462-f41cb84cbcd8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:50:42 compute-1 podman[216209]: 2026-01-21 23:50:42.275359945 +0000 UTC m=+0.052637108 container create 2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 23:50:42 compute-1 systemd[1]: Started libpod-conmon-2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c.scope.
Jan 21 23:50:42 compute-1 nova_compute[182713]: 2026-01-21 23:50:42.316 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039442.3150494, ce29d6fa-fbbb-4352-b243-5af960b17123 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:42 compute-1 nova_compute[182713]: 2026-01-21 23:50:42.318 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] VM Started (Lifecycle Event)
Jan 21 23:50:42 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:50:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97eb3192fdfe7015f0293e6bad55251506d75ac3995db2dda5b35fbbde989a4e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:50:42 compute-1 podman[216209]: 2026-01-21 23:50:42.338181475 +0000 UTC m=+0.115458638 container init 2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 23:50:42 compute-1 podman[216209]: 2026-01-21 23:50:42.246651425 +0000 UTC m=+0.023928628 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:50:42 compute-1 podman[216209]: 2026-01-21 23:50:42.343720841 +0000 UTC m=+0.120998004 container start 2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 23:50:42 compute-1 nova_compute[182713]: 2026-01-21 23:50:42.350 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:42 compute-1 nova_compute[182713]: 2026-01-21 23:50:42.355 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039442.3153763, ce29d6fa-fbbb-4352-b243-5af960b17123 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:42 compute-1 nova_compute[182713]: 2026-01-21 23:50:42.355 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] VM Paused (Lifecycle Event)
Jan 21 23:50:42 compute-1 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[216229]: [NOTICE]   (216233) : New worker (216235) forked
Jan 21 23:50:42 compute-1 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[216229]: [NOTICE]   (216233) : Loading success.
Jan 21 23:50:42 compute-1 nova_compute[182713]: 2026-01-21 23:50:42.386 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:42 compute-1 nova_compute[182713]: 2026-01-21 23:50:42.390 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:50:42 compute-1 nova_compute[182713]: 2026-01-21 23:50:42.431 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.358 182717 DEBUG nova.network.neutron [req-226a5717-f8e6-4fae-a333-e13748c24b39 req-dc17613e-0021-4a45-bdef-848caf65caa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updated VIF entry in instance network info cache for port 6a31fa49-995e-4526-9e75-1973bd7f492b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.359 182717 DEBUG nova.network.neutron [req-226a5717-f8e6-4fae-a333-e13748c24b39 req-dc17613e-0021-4a45-bdef-848caf65caa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updating instance_info_cache with network_info: [{"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.367 182717 DEBUG nova.compute.manager [req-d9b9aa87-8c7d-4d72-a99c-a5afbe3473e8 req-1d6c929b-8e0f-42c5-88f3-946eeb76c518 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received event network-vif-plugged-6a31fa49-995e-4526-9e75-1973bd7f492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.368 182717 DEBUG oslo_concurrency.lockutils [req-d9b9aa87-8c7d-4d72-a99c-a5afbe3473e8 req-1d6c929b-8e0f-42c5-88f3-946eeb76c518 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.369 182717 DEBUG oslo_concurrency.lockutils [req-d9b9aa87-8c7d-4d72-a99c-a5afbe3473e8 req-1d6c929b-8e0f-42c5-88f3-946eeb76c518 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.369 182717 DEBUG oslo_concurrency.lockutils [req-d9b9aa87-8c7d-4d72-a99c-a5afbe3473e8 req-1d6c929b-8e0f-42c5-88f3-946eeb76c518 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.370 182717 DEBUG nova.compute.manager [req-d9b9aa87-8c7d-4d72-a99c-a5afbe3473e8 req-1d6c929b-8e0f-42c5-88f3-946eeb76c518 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Processing event network-vif-plugged-6a31fa49-995e-4526-9e75-1973bd7f492b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.371 182717 DEBUG nova.compute.manager [req-d9b9aa87-8c7d-4d72-a99c-a5afbe3473e8 req-1d6c929b-8e0f-42c5-88f3-946eeb76c518 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received event network-vif-plugged-6a31fa49-995e-4526-9e75-1973bd7f492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.371 182717 DEBUG oslo_concurrency.lockutils [req-d9b9aa87-8c7d-4d72-a99c-a5afbe3473e8 req-1d6c929b-8e0f-42c5-88f3-946eeb76c518 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.372 182717 DEBUG oslo_concurrency.lockutils [req-d9b9aa87-8c7d-4d72-a99c-a5afbe3473e8 req-1d6c929b-8e0f-42c5-88f3-946eeb76c518 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.372 182717 DEBUG oslo_concurrency.lockutils [req-d9b9aa87-8c7d-4d72-a99c-a5afbe3473e8 req-1d6c929b-8e0f-42c5-88f3-946eeb76c518 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.373 182717 DEBUG nova.compute.manager [req-d9b9aa87-8c7d-4d72-a99c-a5afbe3473e8 req-1d6c929b-8e0f-42c5-88f3-946eeb76c518 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] No waiting events found dispatching network-vif-plugged-6a31fa49-995e-4526-9e75-1973bd7f492b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.373 182717 WARNING nova.compute.manager [req-d9b9aa87-8c7d-4d72-a99c-a5afbe3473e8 req-1d6c929b-8e0f-42c5-88f3-946eeb76c518 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received unexpected event network-vif-plugged-6a31fa49-995e-4526-9e75-1973bd7f492b for instance with vm_state building and task_state spawning.
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.375 182717 DEBUG nova.compute.manager [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.377 182717 DEBUG oslo_concurrency.lockutils [req-226a5717-f8e6-4fae-a333-e13748c24b39 req-dc17613e-0021-4a45-bdef-848caf65caa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.380 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039443.3795476, ce29d6fa-fbbb-4352-b243-5af960b17123 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.380 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] VM Resumed (Lifecycle Event)
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.382 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.387 182717 INFO nova.virt.libvirt.driver [-] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Instance spawned successfully.
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.388 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.409 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.417 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.424 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.425 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.426 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.427 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.428 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.428 182717 DEBUG nova.virt.libvirt.driver [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.456 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.515 182717 INFO nova.compute.manager [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Took 7.42 seconds to spawn the instance on the hypervisor.
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.516 182717 DEBUG nova.compute.manager [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.622 182717 INFO nova.compute.manager [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Took 8.05 seconds to build instance.
Jan 21 23:50:43 compute-1 nova_compute[182713]: 2026-01-21 23:50:43.643 182717 DEBUG oslo_concurrency.lockutils [None req-286704a9-bb1f-42c8-b04b-b0a7c93bfa61 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:44 compute-1 podman[216244]: 2026-01-21 23:50:44.619329353 +0000 UTC m=+0.103048585 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Jan 21 23:50:45 compute-1 nova_compute[182713]: 2026-01-21 23:50:45.444 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:45 compute-1 nova_compute[182713]: 2026-01-21 23:50:45.456 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:46.843 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:50:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:46.844 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:50:46 compute-1 nova_compute[182713]: 2026-01-21 23:50:46.847 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:50 compute-1 nova_compute[182713]: 2026-01-21 23:50:50.450 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:50 compute-1 nova_compute[182713]: 2026-01-21 23:50:50.455 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:50 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:50:50.847 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:52 compute-1 podman[216267]: 2026-01-21 23:50:52.611818454 +0000 UTC m=+0.089991242 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:50:52 compute-1 podman[216266]: 2026-01-21 23:50:52.6824077 +0000 UTC m=+0.163030615 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 23:50:55 compute-1 nova_compute[182713]: 2026-01-21 23:50:55.455 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:55 compute-1 nova_compute[182713]: 2026-01-21 23:50:55.457 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:55 compute-1 nova_compute[182713]: 2026-01-21 23:50:55.927 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "c93af124-010e-4edc-8de7-7e16740d73e2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:55 compute-1 nova_compute[182713]: 2026-01-21 23:50:55.928 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c93af124-010e-4edc-8de7-7e16740d73e2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:55 compute-1 nova_compute[182713]: 2026-01-21 23:50:55.973 182717 DEBUG nova.compute.manager [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:50:55 compute-1 ovn_controller[94841]: 2026-01-21T23:50:55Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:46:85:15 10.100.0.13
Jan 21 23:50:55 compute-1 ovn_controller[94841]: 2026-01-21T23:50:55Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:46:85:15 10.100.0.13
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.111 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.112 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.124 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.125 182717 INFO nova.compute.claims [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.395 182717 DEBUG nova.compute.provider_tree [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.438 182717 DEBUG nova.scheduler.client.report [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.487 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.375s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.488 182717 DEBUG nova.compute.manager [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.585 182717 DEBUG nova.compute.manager [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.586 182717 DEBUG nova.network.neutron [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.627 182717 INFO nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.682 182717 DEBUG nova.compute.manager [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.935 182717 DEBUG nova.compute.manager [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.936 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.936 182717 INFO nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Creating image(s)
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.937 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "/var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.937 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "/var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.938 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "/var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:56 compute-1 nova_compute[182713]: 2026-01-21 23:50:56.952 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.007 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.009 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.009 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.025 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.089 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.091 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.129 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.130 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.131 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.209 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.211 182717 DEBUG nova.virt.disk.api [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Checking if we can resize image /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.212 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.273 182717 DEBUG nova.network.neutron [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.273 182717 DEBUG nova.compute.manager [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.289 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.290 182717 DEBUG nova.virt.disk.api [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Cannot resize image /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.290 182717 DEBUG nova.objects.instance [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'migration_context' on Instance uuid c93af124-010e-4edc-8de7-7e16740d73e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.304 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.305 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Ensure instance console log exists: /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.306 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.306 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.306 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.308 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.313 182717 WARNING nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.318 182717 DEBUG nova.virt.libvirt.host [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.318 182717 DEBUG nova.virt.libvirt.host [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.321 182717 DEBUG nova.virt.libvirt.host [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.322 182717 DEBUG nova.virt.libvirt.host [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.323 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.323 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.324 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.324 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.325 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.325 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.325 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.326 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.326 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.326 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.327 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.327 182717 DEBUG nova.virt.hardware [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.333 182717 DEBUG nova.objects.instance [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid c93af124-010e-4edc-8de7-7e16740d73e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.363 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:50:57 compute-1 nova_compute[182713]:   <uuid>c93af124-010e-4edc-8de7-7e16740d73e2</uuid>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   <name>instance-00000028</name>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <nova:name>tempest-ServersOnMultiNodesTest-server-1245436965</nova:name>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:50:57</nova:creationTime>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:50:57 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:50:57 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:50:57 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:50:57 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:50:57 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:50:57 compute-1 nova_compute[182713]:         <nova:user uuid="d0c4727b6f6e46339b56a8168cf80a7b">tempest-ServersOnMultiNodesTest-1927863391-project-member</nova:user>
Jan 21 23:50:57 compute-1 nova_compute[182713]:         <nova:project uuid="c1e85e2b0f934b719d3ad4076dc719f2">tempest-ServersOnMultiNodesTest-1927863391</nova:project>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <system>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <entry name="serial">c93af124-010e-4edc-8de7-7e16740d73e2</entry>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <entry name="uuid">c93af124-010e-4edc-8de7-7e16740d73e2</entry>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     </system>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   <os>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   </os>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   <features>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   </features>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk.config"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/console.log" append="off"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <video>
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     </video>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:50:57 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:50:57 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:50:57 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:50:57 compute-1 nova_compute[182713]: </domain>
Jan 21 23:50:57 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.457 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.457 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.458 182717 INFO nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Using config drive
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.893 182717 INFO nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Creating config drive at /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk.config
Jan 21 23:50:57 compute-1 nova_compute[182713]: 2026-01-21 23:50:57.906 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzvf2ml66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.035 182717 DEBUG oslo_concurrency.processutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzvf2ml66" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:58 compute-1 systemd-machined[153970]: New machine qemu-21-instance-00000028.
Jan 21 23:50:58 compute-1 systemd[1]: Started Virtual Machine qemu-21-instance-00000028.
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.475 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039458.4746232, c93af124-010e-4edc-8de7-7e16740d73e2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.476 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] VM Resumed (Lifecycle Event)
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.478 182717 DEBUG nova.compute.manager [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.479 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.482 182717 INFO nova.virt.libvirt.driver [-] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Instance spawned successfully.
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.482 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.578 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.593 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.598 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.599 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.600 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.601 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.601 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.602 182717 DEBUG nova.virt.libvirt.driver [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.628 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.629 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039458.4753358, c93af124-010e-4edc-8de7-7e16740d73e2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.629 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] VM Started (Lifecycle Event)
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.652 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.655 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.682 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.702 182717 INFO nova.compute.manager [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Took 1.77 seconds to spawn the instance on the hypervisor.
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.702 182717 DEBUG nova.compute.manager [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.806 182717 INFO nova.compute.manager [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Took 2.75 seconds to build instance.
Jan 21 23:50:58 compute-1 nova_compute[182713]: 2026-01-21 23:50:58.832 182717 DEBUG oslo_concurrency.lockutils [None req-f7eca75d-575e-48d3-9c0f-b278d76249cd d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c93af124-010e-4edc-8de7-7e16740d73e2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.904s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:00 compute-1 nova_compute[182713]: 2026-01-21 23:51:00.458 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:51:00 compute-1 nova_compute[182713]: 2026-01-21 23:51:00.460 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:00 compute-1 nova_compute[182713]: 2026-01-21 23:51:00.460 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 21 23:51:00 compute-1 nova_compute[182713]: 2026-01-21 23:51:00.460 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 23:51:00 compute-1 nova_compute[182713]: 2026-01-21 23:51:00.461 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 23:51:00 compute-1 nova_compute[182713]: 2026-01-21 23:51:00.462 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:00 compute-1 podman[216373]: 2026-01-21 23:51:00.609629424 +0000 UTC m=+0.095466605 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 21 23:51:00 compute-1 podman[216374]: 2026-01-21 23:51:00.614879761 +0000 UTC m=+0.096706484 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:51:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:02.998 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:02.999 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:02.999 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:05 compute-1 nova_compute[182713]: 2026-01-21 23:51:05.463 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:05 compute-1 nova_compute[182713]: 2026-01-21 23:51:05.916 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:05 compute-1 NetworkManager[54952]: <info>  [1769039465.9352] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 21 23:51:05 compute-1 NetworkManager[54952]: <info>  [1769039465.9365] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 21 23:51:05 compute-1 nova_compute[182713]: 2026-01-21 23:51:05.987 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:05 compute-1 ovn_controller[94841]: 2026-01-21T23:51:05Z|00114|binding|INFO|Releasing lport e1559aec-27c5-46a3-81fc-ddeb80ee3759 from this chassis (sb_readonly=0)
Jan 21 23:51:06 compute-1 nova_compute[182713]: 2026-01-21 23:51:06.002 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:06 compute-1 nova_compute[182713]: 2026-01-21 23:51:06.442 182717 DEBUG nova.compute.manager [req-ebe7edad-dc00-462b-b3c6-a0b5026adc65 req-c4445637-80cd-46b5-9e6f-55fa3f520313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received event network-changed-6a31fa49-995e-4526-9e75-1973bd7f492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:06 compute-1 nova_compute[182713]: 2026-01-21 23:51:06.442 182717 DEBUG nova.compute.manager [req-ebe7edad-dc00-462b-b3c6-a0b5026adc65 req-c4445637-80cd-46b5-9e6f-55fa3f520313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Refreshing instance network info cache due to event network-changed-6a31fa49-995e-4526-9e75-1973bd7f492b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:51:06 compute-1 nova_compute[182713]: 2026-01-21 23:51:06.443 182717 DEBUG oslo_concurrency.lockutils [req-ebe7edad-dc00-462b-b3c6-a0b5026adc65 req-c4445637-80cd-46b5-9e6f-55fa3f520313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:06 compute-1 nova_compute[182713]: 2026-01-21 23:51:06.444 182717 DEBUG oslo_concurrency.lockutils [req-ebe7edad-dc00-462b-b3c6-a0b5026adc65 req-c4445637-80cd-46b5-9e6f-55fa3f520313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:06 compute-1 nova_compute[182713]: 2026-01-21 23:51:06.444 182717 DEBUG nova.network.neutron [req-ebe7edad-dc00-462b-b3c6-a0b5026adc65 req-c4445637-80cd-46b5-9e6f-55fa3f520313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Refreshing network info cache for port 6a31fa49-995e-4526-9e75-1973bd7f492b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:51:08 compute-1 ovn_controller[94841]: 2026-01-21T23:51:08Z|00115|binding|INFO|Releasing lport e1559aec-27c5-46a3-81fc-ddeb80ee3759 from this chassis (sb_readonly=0)
Jan 21 23:51:08 compute-1 nova_compute[182713]: 2026-01-21 23:51:08.120 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:08 compute-1 nova_compute[182713]: 2026-01-21 23:51:08.492 182717 DEBUG nova.network.neutron [req-ebe7edad-dc00-462b-b3c6-a0b5026adc65 req-c4445637-80cd-46b5-9e6f-55fa3f520313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updated VIF entry in instance network info cache for port 6a31fa49-995e-4526-9e75-1973bd7f492b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:51:08 compute-1 nova_compute[182713]: 2026-01-21 23:51:08.493 182717 DEBUG nova.network.neutron [req-ebe7edad-dc00-462b-b3c6-a0b5026adc65 req-c4445637-80cd-46b5-9e6f-55fa3f520313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updating instance_info_cache with network_info: [{"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:08 compute-1 nova_compute[182713]: 2026-01-21 23:51:08.517 182717 DEBUG oslo_concurrency.lockutils [req-ebe7edad-dc00-462b-b3c6-a0b5026adc65 req-c4445637-80cd-46b5-9e6f-55fa3f520313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:10 compute-1 nova_compute[182713]: 2026-01-21 23:51:10.465 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:10 compute-1 nova_compute[182713]: 2026-01-21 23:51:10.468 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:12 compute-1 podman[216427]: 2026-01-21 23:51:12.634387514 +0000 UTC m=+0.108453856 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 23:51:12 compute-1 nova_compute[182713]: 2026-01-21 23:51:12.723 182717 DEBUG nova.compute.manager [req-d521b72b-5d23-4edc-82d2-43a4c199ae07 req-f175dfd0-b9f7-4036-a13c-9371cb6debb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received event network-changed-6a31fa49-995e-4526-9e75-1973bd7f492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:12 compute-1 nova_compute[182713]: 2026-01-21 23:51:12.723 182717 DEBUG nova.compute.manager [req-d521b72b-5d23-4edc-82d2-43a4c199ae07 req-f175dfd0-b9f7-4036-a13c-9371cb6debb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Refreshing instance network info cache due to event network-changed-6a31fa49-995e-4526-9e75-1973bd7f492b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:51:12 compute-1 nova_compute[182713]: 2026-01-21 23:51:12.724 182717 DEBUG oslo_concurrency.lockutils [req-d521b72b-5d23-4edc-82d2-43a4c199ae07 req-f175dfd0-b9f7-4036-a13c-9371cb6debb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:12 compute-1 nova_compute[182713]: 2026-01-21 23:51:12.724 182717 DEBUG oslo_concurrency.lockutils [req-d521b72b-5d23-4edc-82d2-43a4c199ae07 req-f175dfd0-b9f7-4036-a13c-9371cb6debb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:12 compute-1 nova_compute[182713]: 2026-01-21 23:51:12.724 182717 DEBUG nova.network.neutron [req-d521b72b-5d23-4edc-82d2-43a4c199ae07 req-f175dfd0-b9f7-4036-a13c-9371cb6debb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Refreshing network info cache for port 6a31fa49-995e-4526-9e75-1973bd7f492b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:51:15 compute-1 nova_compute[182713]: 2026-01-21 23:51:15.266 182717 DEBUG nova.network.neutron [req-d521b72b-5d23-4edc-82d2-43a4c199ae07 req-f175dfd0-b9f7-4036-a13c-9371cb6debb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updated VIF entry in instance network info cache for port 6a31fa49-995e-4526-9e75-1973bd7f492b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:51:15 compute-1 nova_compute[182713]: 2026-01-21 23:51:15.267 182717 DEBUG nova.network.neutron [req-d521b72b-5d23-4edc-82d2-43a4c199ae07 req-f175dfd0-b9f7-4036-a13c-9371cb6debb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updating instance_info_cache with network_info: [{"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:15 compute-1 nova_compute[182713]: 2026-01-21 23:51:15.311 182717 DEBUG oslo_concurrency.lockutils [req-d521b72b-5d23-4edc-82d2-43a4c199ae07 req-f175dfd0-b9f7-4036-a13c-9371cb6debb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:15 compute-1 nova_compute[182713]: 2026-01-21 23:51:15.467 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:15 compute-1 podman[216449]: 2026-01-21 23:51:15.613769891 +0000 UTC m=+0.098311426 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, distribution-scope=public)
Jan 21 23:51:16 compute-1 nova_compute[182713]: 2026-01-21 23:51:16.922 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:20 compute-1 nova_compute[182713]: 2026-01-21 23:51:20.331 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:20 compute-1 nova_compute[182713]: 2026-01-21 23:51:20.470 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:22 compute-1 nova_compute[182713]: 2026-01-21 23:51:22.350 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:22 compute-1 nova_compute[182713]: 2026-01-21 23:51:22.507 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:22 compute-1 nova_compute[182713]: 2026-01-21 23:51:22.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:22 compute-1 nova_compute[182713]: 2026-01-21 23:51:22.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:22 compute-1 nova_compute[182713]: 2026-01-21 23:51:22.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:22 compute-1 nova_compute[182713]: 2026-01-21 23:51:22.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:51:23 compute-1 podman[216471]: 2026-01-21 23:51:23.614115381 +0000 UTC m=+0.078638192 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:51:23 compute-1 podman[216470]: 2026-01-21 23:51:23.62294197 +0000 UTC m=+0.104668226 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 21 23:51:23 compute-1 nova_compute[182713]: 2026-01-21 23:51:23.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:23 compute-1 nova_compute[182713]: 2026-01-21 23:51:23.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:24 compute-1 nova_compute[182713]: 2026-01-21 23:51:24.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:24 compute-1 nova_compute[182713]: 2026-01-21 23:51:24.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:25 compute-1 nova_compute[182713]: 2026-01-21 23:51:25.473 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:25 compute-1 nova_compute[182713]: 2026-01-21 23:51:25.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:25 compute-1 nova_compute[182713]: 2026-01-21 23:51:25.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:51:25 compute-1 nova_compute[182713]: 2026-01-21 23:51:25.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:51:26 compute-1 nova_compute[182713]: 2026-01-21 23:51:26.191 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:26 compute-1 nova_compute[182713]: 2026-01-21 23:51:26.192 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:26 compute-1 nova_compute[182713]: 2026-01-21 23:51:26.192 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:51:26 compute-1 nova_compute[182713]: 2026-01-21 23:51:26.192 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid ce29d6fa-fbbb-4352-b243-5af960b17123 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:26 compute-1 nova_compute[182713]: 2026-01-21 23:51:26.518 182717 DEBUG nova.compute.manager [req-95c30fb8-e8fc-44aa-8c83-500e7a9a0b7d req-ced2e0dc-33aa-4d92-b83d-fed3e4f643ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received event network-changed-6a31fa49-995e-4526-9e75-1973bd7f492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:26 compute-1 nova_compute[182713]: 2026-01-21 23:51:26.519 182717 DEBUG nova.compute.manager [req-95c30fb8-e8fc-44aa-8c83-500e7a9a0b7d req-ced2e0dc-33aa-4d92-b83d-fed3e4f643ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Refreshing instance network info cache due to event network-changed-6a31fa49-995e-4526-9e75-1973bd7f492b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:51:26 compute-1 nova_compute[182713]: 2026-01-21 23:51:26.520 182717 DEBUG oslo_concurrency.lockutils [req-95c30fb8-e8fc-44aa-8c83-500e7a9a0b7d req-ced2e0dc-33aa-4d92-b83d-fed3e4f643ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:27 compute-1 nova_compute[182713]: 2026-01-21 23:51:27.114 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.108 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updating instance_info_cache with network_info: [{"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.143 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.144 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.145 182717 DEBUG oslo_concurrency.lockutils [req-95c30fb8-e8fc-44aa-8c83-500e7a9a0b7d req-ced2e0dc-33aa-4d92-b83d-fed3e4f643ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.145 182717 DEBUG nova.network.neutron [req-95c30fb8-e8fc-44aa-8c83-500e7a9a0b7d req-ced2e0dc-33aa-4d92-b83d-fed3e4f643ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Refreshing network info cache for port 6a31fa49-995e-4526-9e75-1973bd7f492b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.147 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.188 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.189 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.189 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.190 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.283 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.374 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.375 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.455 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.464 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.555 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.556 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.622 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "c93af124-010e-4edc-8de7-7e16740d73e2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.623 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c93af124-010e-4edc-8de7-7e16740d73e2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.623 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "c93af124-010e-4edc-8de7-7e16740d73e2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.624 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c93af124-010e-4edc-8de7-7e16740d73e2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.624 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c93af124-010e-4edc-8de7-7e16740d73e2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.632 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.651 182717 INFO nova.compute.manager [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Terminating instance
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.666 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "refresh_cache-c93af124-010e-4edc-8de7-7e16740d73e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.667 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquired lock "refresh_cache-c93af124-010e-4edc-8de7-7e16740d73e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.667 182717 DEBUG nova.network.neutron [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.880 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.881 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5402MB free_disk=73.25151062011719GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:29 compute-1 nova_compute[182713]: 2026-01-21 23:51:29.906 182717 DEBUG nova.network.neutron [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.001 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance ce29d6fa-fbbb-4352-b243-5af960b17123 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.002 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance c93af124-010e-4edc-8de7-7e16740d73e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.002 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.002 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.082 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.104 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.390 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.391 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.475 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.825 182717 DEBUG nova.network.neutron [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.844 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Releasing lock "refresh_cache-c93af124-010e-4edc-8de7-7e16740d73e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:30 compute-1 nova_compute[182713]: 2026-01-21 23:51:30.845 182717 DEBUG nova.compute.manager [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:51:30 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000028.scope: Deactivated successfully.
Jan 21 23:51:30 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000028.scope: Consumed 12.645s CPU time.
Jan 21 23:51:30 compute-1 systemd-machined[153970]: Machine qemu-21-instance-00000028 terminated.
Jan 21 23:51:30 compute-1 podman[216532]: 2026-01-21 23:51:30.960685753 +0000 UTC m=+0.057534684 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:51:30 compute-1 podman[216531]: 2026-01-21 23:51:30.963784461 +0000 UTC m=+0.065809546 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.113 182717 INFO nova.virt.libvirt.driver [-] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Instance destroyed successfully.
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.114 182717 DEBUG nova.objects.instance [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'resources' on Instance uuid c93af124-010e-4edc-8de7-7e16740d73e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.130 182717 INFO nova.virt.libvirt.driver [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Deleting instance files /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2_del
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.132 182717 INFO nova.virt.libvirt.driver [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Deletion of /var/lib/nova/instances/c93af124-010e-4edc-8de7-7e16740d73e2_del complete
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.214 182717 INFO nova.compute.manager [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.215 182717 DEBUG oslo.service.loopingcall [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.217 182717 DEBUG nova.compute.manager [-] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.217 182717 DEBUG nova.network.neutron [-] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.827 182717 DEBUG nova.network.neutron [-] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.846 182717 DEBUG nova.network.neutron [-] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.862 182717 INFO nova.compute.manager [-] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Took 0.65 seconds to deallocate network for instance.
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.981 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:31 compute-1 nova_compute[182713]: 2026-01-21 23:51:31.982 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:32 compute-1 nova_compute[182713]: 2026-01-21 23:51:32.070 182717 DEBUG nova.compute.provider_tree [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:51:32 compute-1 nova_compute[182713]: 2026-01-21 23:51:32.092 182717 DEBUG nova.scheduler.client.report [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:51:32 compute-1 nova_compute[182713]: 2026-01-21 23:51:32.117 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:32 compute-1 nova_compute[182713]: 2026-01-21 23:51:32.141 182717 INFO nova.scheduler.client.report [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Deleted allocations for instance c93af124-010e-4edc-8de7-7e16740d73e2
Jan 21 23:51:32 compute-1 nova_compute[182713]: 2026-01-21 23:51:32.219 182717 DEBUG nova.network.neutron [req-95c30fb8-e8fc-44aa-8c83-500e7a9a0b7d req-ced2e0dc-33aa-4d92-b83d-fed3e4f643ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updated VIF entry in instance network info cache for port 6a31fa49-995e-4526-9e75-1973bd7f492b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:51:32 compute-1 nova_compute[182713]: 2026-01-21 23:51:32.219 182717 DEBUG nova.network.neutron [req-95c30fb8-e8fc-44aa-8c83-500e7a9a0b7d req-ced2e0dc-33aa-4d92-b83d-fed3e4f643ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updating instance_info_cache with network_info: [{"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:32 compute-1 nova_compute[182713]: 2026-01-21 23:51:32.243 182717 DEBUG oslo_concurrency.lockutils [req-95c30fb8-e8fc-44aa-8c83-500e7a9a0b7d req-ced2e0dc-33aa-4d92-b83d-fed3e4f643ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:32 compute-1 nova_compute[182713]: 2026-01-21 23:51:32.245 182717 DEBUG oslo_concurrency.lockutils [None req-acb6e893-f86a-4848-9f5d-48cd8d7c78ec d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c93af124-010e-4edc-8de7-7e16740d73e2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:34 compute-1 nova_compute[182713]: 2026-01-21 23:51:34.570 182717 DEBUG nova.compute.manager [req-8a576c84-58d4-4946-86d2-2073c3779f49 req-d26bed08-a449-494c-99ef-0c5135f73869 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received event network-changed-6a31fa49-995e-4526-9e75-1973bd7f492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:34 compute-1 nova_compute[182713]: 2026-01-21 23:51:34.571 182717 DEBUG nova.compute.manager [req-8a576c84-58d4-4946-86d2-2073c3779f49 req-d26bed08-a449-494c-99ef-0c5135f73869 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Refreshing instance network info cache due to event network-changed-6a31fa49-995e-4526-9e75-1973bd7f492b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:51:34 compute-1 nova_compute[182713]: 2026-01-21 23:51:34.571 182717 DEBUG oslo_concurrency.lockutils [req-8a576c84-58d4-4946-86d2-2073c3779f49 req-d26bed08-a449-494c-99ef-0c5135f73869 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:34 compute-1 nova_compute[182713]: 2026-01-21 23:51:34.572 182717 DEBUG oslo_concurrency.lockutils [req-8a576c84-58d4-4946-86d2-2073c3779f49 req-d26bed08-a449-494c-99ef-0c5135f73869 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:34 compute-1 nova_compute[182713]: 2026-01-21 23:51:34.572 182717 DEBUG nova.network.neutron [req-8a576c84-58d4-4946-86d2-2073c3779f49 req-d26bed08-a449-494c-99ef-0c5135f73869 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Refreshing network info cache for port 6a31fa49-995e-4526-9e75-1973bd7f492b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:51:35 compute-1 nova_compute[182713]: 2026-01-21 23:51:35.477 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:35 compute-1 nova_compute[182713]: 2026-01-21 23:51:35.479 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:36 compute-1 nova_compute[182713]: 2026-01-21 23:51:36.969 182717 DEBUG nova.network.neutron [req-8a576c84-58d4-4946-86d2-2073c3779f49 req-d26bed08-a449-494c-99ef-0c5135f73869 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updated VIF entry in instance network info cache for port 6a31fa49-995e-4526-9e75-1973bd7f492b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:51:36 compute-1 nova_compute[182713]: 2026-01-21 23:51:36.970 182717 DEBUG nova.network.neutron [req-8a576c84-58d4-4946-86d2-2073c3779f49 req-d26bed08-a449-494c-99ef-0c5135f73869 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updating instance_info_cache with network_info: [{"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:36 compute-1 nova_compute[182713]: 2026-01-21 23:51:36.994 182717 DEBUG oslo_concurrency.lockutils [req-8a576c84-58d4-4946-86d2-2073c3779f49 req-d26bed08-a449-494c-99ef-0c5135f73869 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ce29d6fa-fbbb-4352-b243-5af960b17123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.165 182717 DEBUG oslo_concurrency.lockutils [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "ce29d6fa-fbbb-4352-b243-5af960b17123" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.166 182717 DEBUG oslo_concurrency.lockutils [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.166 182717 DEBUG oslo_concurrency.lockutils [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.166 182717 DEBUG oslo_concurrency.lockutils [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.167 182717 DEBUG oslo_concurrency.lockutils [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.182 182717 INFO nova.compute.manager [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Terminating instance
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.205 182717 DEBUG nova.compute.manager [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:51:39 compute-1 kernel: tap6a31fa49-99 (unregistering): left promiscuous mode
Jan 21 23:51:39 compute-1 NetworkManager[54952]: <info>  [1769039499.2309] device (tap6a31fa49-99): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.245 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:39 compute-1 ovn_controller[94841]: 2026-01-21T23:51:39Z|00116|binding|INFO|Releasing lport 6a31fa49-995e-4526-9e75-1973bd7f492b from this chassis (sb_readonly=0)
Jan 21 23:51:39 compute-1 ovn_controller[94841]: 2026-01-21T23:51:39Z|00117|binding|INFO|Setting lport 6a31fa49-995e-4526-9e75-1973bd7f492b down in Southbound
Jan 21 23:51:39 compute-1 ovn_controller[94841]: 2026-01-21T23:51:39Z|00118|binding|INFO|Removing iface tap6a31fa49-99 ovn-installed in OVS
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.259 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:85:15 10.100.0.13'], port_security=['fa:16:3e:46:85:15 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ce29d6fa-fbbb-4352-b243-5af960b17123', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdcd24bf916b4c3aa2e173bea9dd7202', 'neutron:revision_number': '4', 'neutron:security_group_ids': '288c5115-ef70-4922-9c68-a1234762984e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e09721f1-c960-4ea4-8636-beb23b3dfb25, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=6a31fa49-995e-4526-9e75-1973bd7f492b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.260 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 6a31fa49-995e-4526-9e75-1973bd7f492b in datapath b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 unbound from our chassis
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.262 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.264 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.264 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c9f84b71-cd3d-4f10-82e4-078d5162ea1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.267 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 namespace which is not needed anymore
Jan 21 23:51:39 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000026.scope: Deactivated successfully.
Jan 21 23:51:39 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000026.scope: Consumed 14.816s CPU time.
Jan 21 23:51:39 compute-1 systemd-machined[153970]: Machine qemu-20-instance-00000026 terminated.
Jan 21 23:51:39 compute-1 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[216229]: [NOTICE]   (216233) : haproxy version is 2.8.14-c23fe91
Jan 21 23:51:39 compute-1 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[216229]: [NOTICE]   (216233) : path to executable is /usr/sbin/haproxy
Jan 21 23:51:39 compute-1 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[216229]: [WARNING]  (216233) : Exiting Master process...
Jan 21 23:51:39 compute-1 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[216229]: [WARNING]  (216233) : Exiting Master process...
Jan 21 23:51:39 compute-1 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[216229]: [ALERT]    (216233) : Current worker (216235) exited with code 143 (Terminated)
Jan 21 23:51:39 compute-1 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[216229]: [WARNING]  (216233) : All workers exited. Exiting... (0)
Jan 21 23:51:39 compute-1 systemd[1]: libpod-2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c.scope: Deactivated successfully.
Jan 21 23:51:39 compute-1 conmon[216229]: conmon 2a00d349c42547b8f7b5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c.scope/container/memory.events
Jan 21 23:51:39 compute-1 podman[216605]: 2026-01-21 23:51:39.428571903 +0000 UTC m=+0.054546178 container died 2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.429 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.433 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:39 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c-userdata-shm.mount: Deactivated successfully.
Jan 21 23:51:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-97eb3192fdfe7015f0293e6bad55251506d75ac3995db2dda5b35fbbde989a4e-merged.mount: Deactivated successfully.
Jan 21 23:51:39 compute-1 podman[216605]: 2026-01-21 23:51:39.468143757 +0000 UTC m=+0.094118032 container cleanup 2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.479 182717 INFO nova.virt.libvirt.driver [-] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Instance destroyed successfully.
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.481 182717 DEBUG nova.objects.instance [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lazy-loading 'resources' on Instance uuid ce29d6fa-fbbb-4352-b243-5af960b17123 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:39 compute-1 systemd[1]: libpod-conmon-2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c.scope: Deactivated successfully.
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.506 182717 DEBUG nova.virt.libvirt.vif [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:50:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-72653518',display_name='tempest-FloatingIPsAssociationTestJSON-server-72653518',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-72653518',id=38,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:50:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bdcd24bf916b4c3aa2e173bea9dd7202',ramdisk_id='',reservation_id='r-m7r7zlfx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1164348821',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1164348821-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:50:43Z,user_data=None,user_id='8da2db8893d4442aaaada7d43ff2500f',uuid=ce29d6fa-fbbb-4352-b243-5af960b17123,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.506 182717 DEBUG nova.network.os_vif_util [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converting VIF {"id": "6a31fa49-995e-4526-9e75-1973bd7f492b", "address": "fa:16:3e:46:85:15", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a31fa49-99", "ovs_interfaceid": "6a31fa49-995e-4526-9e75-1973bd7f492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.507 182717 DEBUG nova.network.os_vif_util [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:85:15,bridge_name='br-int',has_traffic_filtering=True,id=6a31fa49-995e-4526-9e75-1973bd7f492b,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a31fa49-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.508 182717 DEBUG os_vif [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:85:15,bridge_name='br-int',has_traffic_filtering=True,id=6a31fa49-995e-4526-9e75-1973bd7f492b,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a31fa49-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.512 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.513 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a31fa49-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.514 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.517 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.522 182717 INFO os_vif [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:85:15,bridge_name='br-int',has_traffic_filtering=True,id=6a31fa49-995e-4526-9e75-1973bd7f492b,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a31fa49-99')
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.523 182717 INFO nova.virt.libvirt.driver [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Deleting instance files /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123_del
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.523 182717 INFO nova.virt.libvirt.driver [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Deletion of /var/lib/nova/instances/ce29d6fa-fbbb-4352-b243-5af960b17123_del complete
Jan 21 23:51:39 compute-1 podman[216651]: 2026-01-21 23:51:39.531936337 +0000 UTC m=+0.039380898 container remove 2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.539 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bd8b04a5-a761-4d4e-8aa5-6f3a6fd2dcb9]: (4, ('Wed Jan 21 11:51:39 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 (2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c)\n2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c\nWed Jan 21 11:51:39 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 (2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c)\n2a00d349c42547b8f7b5a18108b0c24d1aa5a9a0d5a9c4ebf0f5ecc6e9a4bb8c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.541 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3dc191-dff7-43f0-a16e-8a084ac7037d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.543 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb94414b2-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.545 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:39 compute-1 kernel: tapb94414b2-c0: left promiscuous mode
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.555 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.558 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d6b2ce-9599-4efc-a190-2ed43d50c066]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.574 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ea26ded8-7239-4d9f-9b1e-fe989a914cc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.576 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3772e4-d47d-4900-a5af-029cf179e933]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.591 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[00448ecf-c538-4821-9c75-f859838819e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401418, 'reachable_time': 24288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216666, 'error': None, 'target': 'ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.594 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:51:39 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:39.594 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[cc660470-0147-42c0-93ed-ba7b53c8894d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:39 compute-1 systemd[1]: run-netns-ovnmeta\x2db94414b2\x2dc7ed\x2d4d1b\x2db462\x2df41cb84cbcd8.mount: Deactivated successfully.
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.622 182717 INFO nova.compute.manager [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.623 182717 DEBUG oslo.service.loopingcall [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.623 182717 DEBUG nova.compute.manager [-] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:51:39 compute-1 nova_compute[182713]: 2026-01-21 23:51:39.623 182717 DEBUG nova.network.neutron [-] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:51:40 compute-1 nova_compute[182713]: 2026-01-21 23:51:40.480 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:40 compute-1 nova_compute[182713]: 2026-01-21 23:51:40.570 182717 DEBUG nova.compute.manager [req-aa0dd15f-3a05-4551-88f4-8a1210a035a6 req-a1a00b52-3dc8-4061-8c19-209860eac9d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received event network-vif-unplugged-6a31fa49-995e-4526-9e75-1973bd7f492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:40 compute-1 nova_compute[182713]: 2026-01-21 23:51:40.571 182717 DEBUG oslo_concurrency.lockutils [req-aa0dd15f-3a05-4551-88f4-8a1210a035a6 req-a1a00b52-3dc8-4061-8c19-209860eac9d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:40 compute-1 nova_compute[182713]: 2026-01-21 23:51:40.571 182717 DEBUG oslo_concurrency.lockutils [req-aa0dd15f-3a05-4551-88f4-8a1210a035a6 req-a1a00b52-3dc8-4061-8c19-209860eac9d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:40 compute-1 nova_compute[182713]: 2026-01-21 23:51:40.572 182717 DEBUG oslo_concurrency.lockutils [req-aa0dd15f-3a05-4551-88f4-8a1210a035a6 req-a1a00b52-3dc8-4061-8c19-209860eac9d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:40 compute-1 nova_compute[182713]: 2026-01-21 23:51:40.572 182717 DEBUG nova.compute.manager [req-aa0dd15f-3a05-4551-88f4-8a1210a035a6 req-a1a00b52-3dc8-4061-8c19-209860eac9d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] No waiting events found dispatching network-vif-unplugged-6a31fa49-995e-4526-9e75-1973bd7f492b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:51:40 compute-1 nova_compute[182713]: 2026-01-21 23:51:40.573 182717 DEBUG nova.compute.manager [req-aa0dd15f-3a05-4551-88f4-8a1210a035a6 req-a1a00b52-3dc8-4061-8c19-209860eac9d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received event network-vif-unplugged-6a31fa49-995e-4526-9e75-1973bd7f492b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:51:41 compute-1 nova_compute[182713]: 2026-01-21 23:51:41.133 182717 DEBUG nova.network.neutron [-] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:41 compute-1 nova_compute[182713]: 2026-01-21 23:51:41.163 182717 INFO nova.compute.manager [-] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Took 1.54 seconds to deallocate network for instance.
Jan 21 23:51:41 compute-1 nova_compute[182713]: 2026-01-21 23:51:41.204 182717 DEBUG nova.compute.manager [req-6fa81dd3-c7f7-429c-a25f-653a9beaa144 req-459464f5-de9d-450f-9afd-b053e3baa440 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received event network-vif-deleted-6a31fa49-995e-4526-9e75-1973bd7f492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:41 compute-1 nova_compute[182713]: 2026-01-21 23:51:41.251 182717 DEBUG oslo_concurrency.lockutils [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:41 compute-1 nova_compute[182713]: 2026-01-21 23:51:41.252 182717 DEBUG oslo_concurrency.lockutils [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:41 compute-1 nova_compute[182713]: 2026-01-21 23:51:41.342 182717 DEBUG nova.compute.provider_tree [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:51:41 compute-1 nova_compute[182713]: 2026-01-21 23:51:41.361 182717 DEBUG nova.scheduler.client.report [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:51:41 compute-1 nova_compute[182713]: 2026-01-21 23:51:41.389 182717 DEBUG oslo_concurrency.lockutils [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:41 compute-1 nova_compute[182713]: 2026-01-21 23:51:41.427 182717 INFO nova.scheduler.client.report [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Deleted allocations for instance ce29d6fa-fbbb-4352-b243-5af960b17123
Jan 21 23:51:41 compute-1 nova_compute[182713]: 2026-01-21 23:51:41.800 182717 DEBUG oslo_concurrency.lockutils [None req-fc6061d9-5a5d-4897-b518-3adbc0226a02 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:43 compute-1 nova_compute[182713]: 2026-01-21 23:51:43.148 182717 DEBUG nova.compute.manager [req-2c6862cc-df18-4774-bd04-9cc885fdaef3 req-63c49bea-8684-48a3-bf3c-57066c992f6b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received event network-vif-plugged-6a31fa49-995e-4526-9e75-1973bd7f492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:43 compute-1 nova_compute[182713]: 2026-01-21 23:51:43.149 182717 DEBUG oslo_concurrency.lockutils [req-2c6862cc-df18-4774-bd04-9cc885fdaef3 req-63c49bea-8684-48a3-bf3c-57066c992f6b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:43 compute-1 nova_compute[182713]: 2026-01-21 23:51:43.150 182717 DEBUG oslo_concurrency.lockutils [req-2c6862cc-df18-4774-bd04-9cc885fdaef3 req-63c49bea-8684-48a3-bf3c-57066c992f6b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:43 compute-1 nova_compute[182713]: 2026-01-21 23:51:43.150 182717 DEBUG oslo_concurrency.lockutils [req-2c6862cc-df18-4774-bd04-9cc885fdaef3 req-63c49bea-8684-48a3-bf3c-57066c992f6b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ce29d6fa-fbbb-4352-b243-5af960b17123-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:43 compute-1 nova_compute[182713]: 2026-01-21 23:51:43.151 182717 DEBUG nova.compute.manager [req-2c6862cc-df18-4774-bd04-9cc885fdaef3 req-63c49bea-8684-48a3-bf3c-57066c992f6b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] No waiting events found dispatching network-vif-plugged-6a31fa49-995e-4526-9e75-1973bd7f492b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:51:43 compute-1 nova_compute[182713]: 2026-01-21 23:51:43.151 182717 WARNING nova.compute.manager [req-2c6862cc-df18-4774-bd04-9cc885fdaef3 req-63c49bea-8684-48a3-bf3c-57066c992f6b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Received unexpected event network-vif-plugged-6a31fa49-995e-4526-9e75-1973bd7f492b for instance with vm_state deleted and task_state None.
Jan 21 23:51:43 compute-1 podman[216667]: 2026-01-21 23:51:43.613136245 +0000 UTC m=+0.092406478 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 21 23:51:44 compute-1 nova_compute[182713]: 2026-01-21 23:51:44.516 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:45 compute-1 nova_compute[182713]: 2026-01-21 23:51:45.482 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:46 compute-1 nova_compute[182713]: 2026-01-21 23:51:46.110 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039491.1091173, c93af124-010e-4edc-8de7-7e16740d73e2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:46 compute-1 nova_compute[182713]: 2026-01-21 23:51:46.111 182717 INFO nova.compute.manager [-] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] VM Stopped (Lifecycle Event)
Jan 21 23:51:46 compute-1 nova_compute[182713]: 2026-01-21 23:51:46.133 182717 DEBUG nova.compute.manager [None req-70400cbc-d032-4dc1-9c03-c66595cb93e0 - - - - - -] [instance: c93af124-010e-4edc-8de7-7e16740d73e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:46 compute-1 podman[216688]: 2026-01-21 23:51:46.595671352 +0000 UTC m=+0.083178585 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 21 23:51:47 compute-1 nova_compute[182713]: 2026-01-21 23:51:47.070 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:47 compute-1 nova_compute[182713]: 2026-01-21 23:51:47.240 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:47 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:47.921 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:51:47 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:47.922 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:51:47 compute-1 nova_compute[182713]: 2026-01-21 23:51:47.951 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:49 compute-1 nova_compute[182713]: 2026-01-21 23:51:49.519 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:49 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:51:49.926 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:50 compute-1 nova_compute[182713]: 2026-01-21 23:51:50.485 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:54 compute-1 nova_compute[182713]: 2026-01-21 23:51:54.477 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039499.4755683, ce29d6fa-fbbb-4352-b243-5af960b17123 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:54 compute-1 nova_compute[182713]: 2026-01-21 23:51:54.478 182717 INFO nova.compute.manager [-] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] VM Stopped (Lifecycle Event)
Jan 21 23:51:54 compute-1 nova_compute[182713]: 2026-01-21 23:51:54.507 182717 DEBUG nova.compute.manager [None req-f95e9f27-8729-4133-a1e8-49e92dac142c - - - - - -] [instance: ce29d6fa-fbbb-4352-b243-5af960b17123] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:54 compute-1 nova_compute[182713]: 2026-01-21 23:51:54.523 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:54 compute-1 podman[216712]: 2026-01-21 23:51:54.60626769 +0000 UTC m=+0.082870177 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:51:54 compute-1 podman[216711]: 2026-01-21 23:51:54.652981679 +0000 UTC m=+0.132795368 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:51:55 compute-1 nova_compute[182713]: 2026-01-21 23:51:55.487 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:59 compute-1 nova_compute[182713]: 2026-01-21 23:51:59.527 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:00 compute-1 nova_compute[182713]: 2026-01-21 23:52:00.533 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:01 compute-1 podman[216760]: 2026-01-21 23:52:01.572655779 +0000 UTC m=+0.066464746 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 21 23:52:01 compute-1 podman[216761]: 2026-01-21 23:52:01.581976584 +0000 UTC m=+0.062729798 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:52:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:02.999 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:03.000 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:03.000 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:04 compute-1 nova_compute[182713]: 2026-01-21 23:52:04.530 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:05 compute-1 nova_compute[182713]: 2026-01-21 23:52:05.535 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:09 compute-1 nova_compute[182713]: 2026-01-21 23:52:09.533 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:10 compute-1 nova_compute[182713]: 2026-01-21 23:52:10.538 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:11 compute-1 nova_compute[182713]: 2026-01-21 23:52:11.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:12 compute-1 nova_compute[182713]: 2026-01-21 23:52:12.877 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:12 compute-1 nova_compute[182713]: 2026-01-21 23:52:12.878 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 23:52:12 compute-1 nova_compute[182713]: 2026-01-21 23:52:12.909 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 23:52:14 compute-1 nova_compute[182713]: 2026-01-21 23:52:14.536 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:14 compute-1 podman[216804]: 2026-01-21 23:52:14.582751925 +0000 UTC m=+0.079289373 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 21 23:52:15 compute-1 nova_compute[182713]: 2026-01-21 23:52:15.540 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:17 compute-1 podman[216824]: 2026-01-21 23:52:17.582458707 +0000 UTC m=+0.071832982 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 23:52:19 compute-1 nova_compute[182713]: 2026-01-21 23:52:19.539 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:20 compute-1 nova_compute[182713]: 2026-01-21 23:52:20.543 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:21 compute-1 nova_compute[182713]: 2026-01-21 23:52:21.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:21 compute-1 nova_compute[182713]: 2026-01-21 23:52:21.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:52:22.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:52:24 compute-1 nova_compute[182713]: 2026-01-21 23:52:24.146 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:24 compute-1 nova_compute[182713]: 2026-01-21 23:52:24.148 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:24 compute-1 nova_compute[182713]: 2026-01-21 23:52:24.149 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:24 compute-1 nova_compute[182713]: 2026-01-21 23:52:24.149 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:52:24 compute-1 nova_compute[182713]: 2026-01-21 23:52:24.542 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:24 compute-1 nova_compute[182713]: 2026-01-21 23:52:24.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:24 compute-1 nova_compute[182713]: 2026-01-21 23:52:24.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:25 compute-1 nova_compute[182713]: 2026-01-21 23:52:25.545 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:25 compute-1 podman[216847]: 2026-01-21 23:52:25.606824698 +0000 UTC m=+0.084401172 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:52:25 compute-1 podman[216846]: 2026-01-21 23:52:25.656199538 +0000 UTC m=+0.134296259 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 21 23:52:25 compute-1 nova_compute[182713]: 2026-01-21 23:52:25.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:25 compute-1 nova_compute[182713]: 2026-01-21 23:52:25.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:25 compute-1 nova_compute[182713]: 2026-01-21 23:52:25.898 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:25 compute-1 nova_compute[182713]: 2026-01-21 23:52:25.899 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:25 compute-1 nova_compute[182713]: 2026-01-21 23:52:25.899 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:25 compute-1 nova_compute[182713]: 2026-01-21 23:52:25.899 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:52:26 compute-1 nova_compute[182713]: 2026-01-21 23:52:26.159 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:52:26 compute-1 nova_compute[182713]: 2026-01-21 23:52:26.162 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5744MB free_disk=73.30962753295898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:52:26 compute-1 nova_compute[182713]: 2026-01-21 23:52:26.162 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:26 compute-1 nova_compute[182713]: 2026-01-21 23:52:26.163 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:26 compute-1 nova_compute[182713]: 2026-01-21 23:52:26.489 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:52:26 compute-1 nova_compute[182713]: 2026-01-21 23:52:26.489 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:52:26 compute-1 nova_compute[182713]: 2026-01-21 23:52:26.606 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:52:27 compute-1 nova_compute[182713]: 2026-01-21 23:52:27.978 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:52:28 compute-1 nova_compute[182713]: 2026-01-21 23:52:28.078 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:52:28 compute-1 nova_compute[182713]: 2026-01-21 23:52:28.079 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:29 compute-1 nova_compute[182713]: 2026-01-21 23:52:29.078 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:29 compute-1 nova_compute[182713]: 2026-01-21 23:52:29.079 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:52:29 compute-1 nova_compute[182713]: 2026-01-21 23:52:29.546 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:30 compute-1 nova_compute[182713]: 2026-01-21 23:52:30.119 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:52:30 compute-1 nova_compute[182713]: 2026-01-21 23:52:30.120 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:30 compute-1 nova_compute[182713]: 2026-01-21 23:52:30.546 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:32 compute-1 podman[216897]: 2026-01-21 23:52:32.571475523 +0000 UTC m=+0.066959909 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:52:32 compute-1 podman[216898]: 2026-01-21 23:52:32.587901565 +0000 UTC m=+0.066741412 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 23:52:34 compute-1 nova_compute[182713]: 2026-01-21 23:52:34.549 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:35 compute-1 nova_compute[182713]: 2026-01-21 23:52:35.548 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:38 compute-1 ovn_controller[94841]: 2026-01-21T23:52:38Z|00119|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 21 23:52:39 compute-1 nova_compute[182713]: 2026-01-21 23:52:39.552 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:40 compute-1 nova_compute[182713]: 2026-01-21 23:52:40.550 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:44 compute-1 nova_compute[182713]: 2026-01-21 23:52:44.556 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:44 compute-1 nova_compute[182713]: 2026-01-21 23:52:44.922 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:44 compute-1 nova_compute[182713]: 2026-01-21 23:52:44.923 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:44 compute-1 nova_compute[182713]: 2026-01-21 23:52:44.986 182717 DEBUG nova.compute.manager [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:52:45 compute-1 nova_compute[182713]: 2026-01-21 23:52:45.270 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:45 compute-1 nova_compute[182713]: 2026-01-21 23:52:45.271 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:45 compute-1 nova_compute[182713]: 2026-01-21 23:52:45.284 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:52:45 compute-1 nova_compute[182713]: 2026-01-21 23:52:45.284 182717 INFO nova.compute.claims [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:52:45 compute-1 nova_compute[182713]: 2026-01-21 23:52:45.559 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:45 compute-1 podman[216941]: 2026-01-21 23:52:45.595768226 +0000 UTC m=+0.079373006 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:52:45 compute-1 nova_compute[182713]: 2026-01-21 23:52:45.995 182717 DEBUG nova.compute.provider_tree [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.046 182717 DEBUG nova.scheduler.client.report [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.124 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.853s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.125 182717 DEBUG nova.compute.manager [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.395 182717 DEBUG nova.compute.manager [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.395 182717 DEBUG nova.network.neutron [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.448 182717 INFO nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.535 182717 DEBUG nova.compute.manager [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.852 182717 DEBUG nova.compute.manager [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.854 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.855 182717 INFO nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Creating image(s)
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.856 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.857 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.858 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.885 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.982 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.983 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:46 compute-1 nova_compute[182713]: 2026-01-21 23:52:46.984 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.000 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.083 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.084 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.130 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.132 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.132 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.216 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.217 182717 DEBUG nova.virt.disk.api [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Checking if we can resize image /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.217 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.267 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.268 182717 DEBUG nova.virt.disk.api [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Cannot resize image /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.269 182717 DEBUG nova.objects.instance [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'migration_context' on Instance uuid 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.321 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.322 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Ensure instance console log exists: /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.323 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.323 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.324 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:47 compute-1 nova_compute[182713]: 2026-01-21 23:52:47.435 182717 DEBUG nova.policy [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:52:48 compute-1 podman[216977]: 2026-01-21 23:52:48.607737399 +0000 UTC m=+0.092003711 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter)
Jan 21 23:52:49 compute-1 nova_compute[182713]: 2026-01-21 23:52:49.560 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:50 compute-1 nova_compute[182713]: 2026-01-21 23:52:50.555 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:51 compute-1 nova_compute[182713]: 2026-01-21 23:52:51.229 182717 DEBUG nova.network.neutron [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Successfully created port: bc3d02f6-e146-4659-b018-41d3813ed1c3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:52:53 compute-1 nova_compute[182713]: 2026-01-21 23:52:53.338 182717 DEBUG nova.network.neutron [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Successfully updated port: bc3d02f6-e146-4659-b018-41d3813ed1c3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:52:53 compute-1 nova_compute[182713]: 2026-01-21 23:52:53.361 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:52:53 compute-1 nova_compute[182713]: 2026-01-21 23:52:53.361 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquired lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:52:53 compute-1 nova_compute[182713]: 2026-01-21 23:52:53.362 182717 DEBUG nova.network.neutron [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:52:53 compute-1 nova_compute[182713]: 2026-01-21 23:52:53.639 182717 DEBUG nova.compute.manager [req-97e5d993-7079-42a0-a268-33a9711bef67 req-65cfb358-f85d-4327-b639-7784fd63fb98 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-changed-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:53 compute-1 nova_compute[182713]: 2026-01-21 23:52:53.640 182717 DEBUG nova.compute.manager [req-97e5d993-7079-42a0-a268-33a9711bef67 req-65cfb358-f85d-4327-b639-7784fd63fb98 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Refreshing instance network info cache due to event network-changed-bc3d02f6-e146-4659-b018-41d3813ed1c3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:52:53 compute-1 nova_compute[182713]: 2026-01-21 23:52:53.640 182717 DEBUG oslo_concurrency.lockutils [req-97e5d993-7079-42a0-a268-33a9711bef67 req-65cfb358-f85d-4327-b639-7784fd63fb98 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:52:53 compute-1 nova_compute[182713]: 2026-01-21 23:52:53.985 182717 DEBUG nova.network.neutron [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:52:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:54.119 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:52:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:54.120 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:52:54 compute-1 nova_compute[182713]: 2026-01-21 23:52:54.121 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:54 compute-1 nova_compute[182713]: 2026-01-21 23:52:54.563 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:55 compute-1 nova_compute[182713]: 2026-01-21 23:52:55.557 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.608 182717 DEBUG nova.network.neutron [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating instance_info_cache with network_info: [{"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:52:56 compute-1 podman[217000]: 2026-01-21 23:52:56.611742056 +0000 UTC m=+0.087528891 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:52:56 compute-1 podman[216999]: 2026-01-21 23:52:56.675803973 +0000 UTC m=+0.157590625 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.710 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Releasing lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.711 182717 DEBUG nova.compute.manager [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Instance network_info: |[{"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.712 182717 DEBUG oslo_concurrency.lockutils [req-97e5d993-7079-42a0-a268-33a9711bef67 req-65cfb358-f85d-4327-b639-7784fd63fb98 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.712 182717 DEBUG nova.network.neutron [req-97e5d993-7079-42a0-a268-33a9711bef67 req-65cfb358-f85d-4327-b639-7784fd63fb98 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Refreshing network info cache for port bc3d02f6-e146-4659-b018-41d3813ed1c3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.716 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Start _get_guest_xml network_info=[{"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.721 182717 WARNING nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.753 182717 DEBUG nova.virt.libvirt.host [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.756 182717 DEBUG nova.virt.libvirt.host [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.762 182717 DEBUG nova.virt.libvirt.host [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.763 182717 DEBUG nova.virt.libvirt.host [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.765 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.765 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.766 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.766 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.767 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.767 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.767 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.768 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.768 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.769 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.769 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.769 182717 DEBUG nova.virt.hardware [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.775 182717 DEBUG nova.virt.libvirt.vif [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:52:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-732234770',display_name='tempest-ServerDiskConfigTestJSON-server-732234770',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-732234770',id=49,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-7yxmtfzt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskCon
figTestJSON-1417790226-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:52:46Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=40bd1cc4-d1de-4488-8160-e6d4f5fce4bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.776 182717 DEBUG nova.network.os_vif_util [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.778 182717 DEBUG nova.network.os_vif_util [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.779 182717 DEBUG nova.objects.instance [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.805 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:52:56 compute-1 nova_compute[182713]:   <uuid>40bd1cc4-d1de-4488-8160-e6d4f5fce4bc</uuid>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   <name>instance-00000031</name>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-732234770</nova:name>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:52:56</nova:creationTime>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:52:56 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:52:56 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:52:56 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:52:56 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:52:56 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:52:56 compute-1 nova_compute[182713]:         <nova:user uuid="a7fb6bdd938b4fcdb749b0bc4f86f97e">tempest-ServerDiskConfigTestJSON-1417790226-project-member</nova:user>
Jan 21 23:52:56 compute-1 nova_compute[182713]:         <nova:project uuid="c09a5cf201e249f69f57cd4a632d1e2b">tempest-ServerDiskConfigTestJSON-1417790226</nova:project>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:52:56 compute-1 nova_compute[182713]:         <nova:port uuid="bc3d02f6-e146-4659-b018-41d3813ed1c3">
Jan 21 23:52:56 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <system>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <entry name="serial">40bd1cc4-d1de-4488-8160-e6d4f5fce4bc</entry>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <entry name="uuid">40bd1cc4-d1de-4488-8160-e6d4f5fce4bc</entry>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     </system>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   <os>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   </os>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   <features>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   </features>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.config"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:05:f7:b6"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <target dev="tapbc3d02f6-e1"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/console.log" append="off"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <video>
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     </video>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:52:56 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:52:56 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:52:56 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:52:56 compute-1 nova_compute[182713]: </domain>
Jan 21 23:52:56 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.806 182717 DEBUG nova.compute.manager [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Preparing to wait for external event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.807 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.807 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.807 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.808 182717 DEBUG nova.virt.libvirt.vif [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:52:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-732234770',display_name='tempest-ServerDiskConfigTestJSON-server-732234770',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-732234770',id=49,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-7yxmtfzt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-Ser
verDiskConfigTestJSON-1417790226-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:52:46Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=40bd1cc4-d1de-4488-8160-e6d4f5fce4bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.809 182717 DEBUG nova.network.os_vif_util [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.809 182717 DEBUG nova.network.os_vif_util [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.810 182717 DEBUG os_vif [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.811 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.811 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.813 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.819 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.819 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc3d02f6-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.820 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbc3d02f6-e1, col_values=(('external_ids', {'iface-id': 'bc3d02f6-e146-4659-b018-41d3813ed1c3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:f7:b6', 'vm-uuid': '40bd1cc4-d1de-4488-8160-e6d4f5fce4bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.821 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:56 compute-1 NetworkManager[54952]: <info>  [1769039576.8234] manager: (tapbc3d02f6-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.824 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.830 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.832 182717 INFO os_vif [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1')
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.914 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.914 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.914 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No VIF found with MAC fa:16:3e:05:f7:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:52:56 compute-1 nova_compute[182713]: 2026-01-21 23:52:56.915 182717 INFO nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Using config drive
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.021 182717 INFO nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Creating config drive at /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.config
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.033 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpleoyet82 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.123 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.165 182717 DEBUG oslo_concurrency.processutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpleoyet82" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:58 compute-1 kernel: tapbc3d02f6-e1: entered promiscuous mode
Jan 21 23:52:58 compute-1 ovn_controller[94841]: 2026-01-21T23:52:58Z|00120|binding|INFO|Claiming lport bc3d02f6-e146-4659-b018-41d3813ed1c3 for this chassis.
Jan 21 23:52:58 compute-1 ovn_controller[94841]: 2026-01-21T23:52:58Z|00121|binding|INFO|bc3d02f6-e146-4659-b018-41d3813ed1c3: Claiming fa:16:3e:05:f7:b6 10.100.0.3
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.243 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:58 compute-1 NetworkManager[54952]: <info>  [1769039578.2455] manager: (tapbc3d02f6-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.251 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.261 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:f7:b6 10.100.0.3'], port_security=['fa:16:3e:05:f7:b6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '40bd1cc4-d1de-4488-8160-e6d4f5fce4bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=bc3d02f6-e146-4659-b018-41d3813ed1c3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.263 104184 INFO neutron.agent.ovn.metadata.agent [-] Port bc3d02f6-e146-4659-b018-41d3813ed1c3 in datapath 7b586c54-3322-410f-9bc9-972a63b8deff bound to our chassis
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.265 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.283 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2f2cad71-1bfd-4715-82d3-724217617259]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.284 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b586c54-31 in ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.287 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b586c54-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.287 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fa068c7b-450f-4bde-8533-2be726563d1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.289 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fa2a3cda-537c-4caf-a5de-6ad3725aa9d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 systemd-udevd[217068]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:52:58 compute-1 NetworkManager[54952]: <info>  [1769039578.3173] device (tapbc3d02f6-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.316 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e7bbcc-0f00-492d-b4a6-da548a0bf21e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 NetworkManager[54952]: <info>  [1769039578.3194] device (tapbc3d02f6-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:52:58 compute-1 systemd-machined[153970]: New machine qemu-22-instance-00000031.
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.338 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.343 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[95524ab9-fb9c-45be-a1c4-6b365251bdab]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_controller[94841]: 2026-01-21T23:52:58Z|00122|binding|INFO|Setting lport bc3d02f6-e146-4659-b018-41d3813ed1c3 ovn-installed in OVS
Jan 21 23:52:58 compute-1 ovn_controller[94841]: 2026-01-21T23:52:58Z|00123|binding|INFO|Setting lport bc3d02f6-e146-4659-b018-41d3813ed1c3 up in Southbound
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.348 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:58 compute-1 systemd[1]: Started Virtual Machine qemu-22-instance-00000031.
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.371 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[8e3b1182-884b-4795-964d-af8142cacf45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.379 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5b0b8f9b-2c84-4bf3-b3ab-fcadb9482c58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 NetworkManager[54952]: <info>  [1769039578.3808] manager: (tap7b586c54-30): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Jan 21 23:52:58 compute-1 systemd-udevd[217073]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.418 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[3d205cb5-7da3-49b8-83fb-81988f9d0170]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.422 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[6aaad3f2-f2e3-4852-a9d3-4556c95ac193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 NetworkManager[54952]: <info>  [1769039578.4587] device (tap7b586c54-30): carrier: link connected
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.466 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a1b80f-0056-4e66-ba3a-bcdbb95f1825]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.493 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d792e110-29b5-4935-85b5-64f40573cd06]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415111, 'reachable_time': 31400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217103, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.518 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[350d6a55-bd4f-41b2-afc8-f3300279a7b7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:a9f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415111, 'tstamp': 415111}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217104, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.537 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[481fc3de-0965-43f0-b689-afd69fa00824]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415111, 'reachable_time': 31400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217105, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.582 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[af5c73fd-3d89-4158-a5a3-b13bb7be9fff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.649 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4d7f42f6-ed3e-4005-8f34-38a7089b5cd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.651 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.652 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.653 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b586c54-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:58 compute-1 kernel: tap7b586c54-30: entered promiscuous mode
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.655 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:58 compute-1 NetworkManager[54952]: <info>  [1769039578.6564] manager: (tap7b586c54-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.659 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b586c54-30, col_values=(('external_ids', {'iface-id': '52e5d5d5-be78-49fa-86d7-24ac4adf40c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:58 compute-1 ovn_controller[94841]: 2026-01-21T23:52:58Z|00124|binding|INFO|Releasing lport 52e5d5d5-be78-49fa-86d7-24ac4adf40c1 from this chassis (sb_readonly=0)
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.660 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.661 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.662 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ff939e77-0f4c-49d6-b441-6c18b4769c37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.663 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:52:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:52:58.664 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'env', 'PROCESS_TAG=haproxy-7b586c54-3322-410f-9bc9-972a63b8deff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b586c54-3322-410f-9bc9-972a63b8deff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.671 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.911 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039578.9106865, 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.912 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] VM Started (Lifecycle Event)
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.954 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.960 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039578.9109879, 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:52:58 compute-1 nova_compute[182713]: 2026-01-21 23:52:58.960 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] VM Paused (Lifecycle Event)
Jan 21 23:52:59 compute-1 podman[217145]: 2026-01-21 23:52:59.034813463 +0000 UTC m=+0.049075551 container create 022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.058 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.065 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:52:59 compute-1 systemd[1]: Started libpod-conmon-022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060.scope.
Jan 21 23:52:59 compute-1 podman[217145]: 2026-01-21 23:52:59.009316168 +0000 UTC m=+0.023578266 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:52:59 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.129 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:52:59 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84de826167cf13ea4db1c36b0aedc89793fde5e8a384896095a7e90d4f8e839e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:52:59 compute-1 podman[217145]: 2026-01-21 23:52:59.146289889 +0000 UTC m=+0.160551987 container init 022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 23:52:59 compute-1 podman[217145]: 2026-01-21 23:52:59.156399605 +0000 UTC m=+0.170661693 container start 022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:52:59 compute-1 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217160]: [NOTICE]   (217164) : New worker (217166) forked
Jan 21 23:52:59 compute-1 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217160]: [NOTICE]   (217164) : Loading success.
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.300 182717 DEBUG nova.compute.manager [req-021354ac-5cbd-4e20-974a-293575d6c655 req-4627e92a-3ef8-4d8f-a600-2af8a2ed7c85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.301 182717 DEBUG oslo_concurrency.lockutils [req-021354ac-5cbd-4e20-974a-293575d6c655 req-4627e92a-3ef8-4d8f-a600-2af8a2ed7c85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.301 182717 DEBUG oslo_concurrency.lockutils [req-021354ac-5cbd-4e20-974a-293575d6c655 req-4627e92a-3ef8-4d8f-a600-2af8a2ed7c85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.301 182717 DEBUG oslo_concurrency.lockutils [req-021354ac-5cbd-4e20-974a-293575d6c655 req-4627e92a-3ef8-4d8f-a600-2af8a2ed7c85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.301 182717 DEBUG nova.compute.manager [req-021354ac-5cbd-4e20-974a-293575d6c655 req-4627e92a-3ef8-4d8f-a600-2af8a2ed7c85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Processing event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.302 182717 DEBUG nova.compute.manager [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.307 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039579.3064973, 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.307 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] VM Resumed (Lifecycle Event)
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.310 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.316 182717 INFO nova.virt.libvirt.driver [-] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Instance spawned successfully.
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.317 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.380 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.389 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.395 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.396 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.396 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.397 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.397 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.398 182717 DEBUG nova.virt.libvirt.driver [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.471 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.591 182717 INFO nova.compute.manager [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Took 12.74 seconds to spawn the instance on the hypervisor.
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.592 182717 DEBUG nova.compute.manager [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.752 182717 INFO nova.compute.manager [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Took 14.62 seconds to build instance.
Jan 21 23:52:59 compute-1 nova_compute[182713]: 2026-01-21 23:52:59.807 182717 DEBUG oslo_concurrency.lockutils [None req-4464467c-e96a-4fdb-ba00-50362cf1d451 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.884s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:00 compute-1 nova_compute[182713]: 2026-01-21 23:53:00.561 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:01 compute-1 nova_compute[182713]: 2026-01-21 23:53:01.102 182717 DEBUG nova.network.neutron [req-97e5d993-7079-42a0-a268-33a9711bef67 req-65cfb358-f85d-4327-b639-7784fd63fb98 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updated VIF entry in instance network info cache for port bc3d02f6-e146-4659-b018-41d3813ed1c3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:53:01 compute-1 nova_compute[182713]: 2026-01-21 23:53:01.103 182717 DEBUG nova.network.neutron [req-97e5d993-7079-42a0-a268-33a9711bef67 req-65cfb358-f85d-4327-b639-7784fd63fb98 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating instance_info_cache with network_info: [{"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:53:01 compute-1 nova_compute[182713]: 2026-01-21 23:53:01.166 182717 DEBUG oslo_concurrency.lockutils [req-97e5d993-7079-42a0-a268-33a9711bef67 req-65cfb358-f85d-4327-b639-7784fd63fb98 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:53:01 compute-1 nova_compute[182713]: 2026-01-21 23:53:01.647 182717 DEBUG nova.compute.manager [req-0e050d34-3383-4dd8-82d8-c62f9e97bba6 req-7209d25d-31b9-48c8-a320-1bd5ee692997 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:53:01 compute-1 nova_compute[182713]: 2026-01-21 23:53:01.648 182717 DEBUG oslo_concurrency.lockutils [req-0e050d34-3383-4dd8-82d8-c62f9e97bba6 req-7209d25d-31b9-48c8-a320-1bd5ee692997 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:01 compute-1 nova_compute[182713]: 2026-01-21 23:53:01.648 182717 DEBUG oslo_concurrency.lockutils [req-0e050d34-3383-4dd8-82d8-c62f9e97bba6 req-7209d25d-31b9-48c8-a320-1bd5ee692997 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:01 compute-1 nova_compute[182713]: 2026-01-21 23:53:01.649 182717 DEBUG oslo_concurrency.lockutils [req-0e050d34-3383-4dd8-82d8-c62f9e97bba6 req-7209d25d-31b9-48c8-a320-1bd5ee692997 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:01 compute-1 nova_compute[182713]: 2026-01-21 23:53:01.649 182717 DEBUG nova.compute.manager [req-0e050d34-3383-4dd8-82d8-c62f9e97bba6 req-7209d25d-31b9-48c8-a320-1bd5ee692997 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] No waiting events found dispatching network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:53:01 compute-1 nova_compute[182713]: 2026-01-21 23:53:01.650 182717 WARNING nova.compute.manager [req-0e050d34-3383-4dd8-82d8-c62f9e97bba6 req-7209d25d-31b9-48c8-a320-1bd5ee692997 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received unexpected event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 for instance with vm_state active and task_state None.
Jan 21 23:53:01 compute-1 nova_compute[182713]: 2026-01-21 23:53:01.823 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:03.000 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:03.002 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:03.002 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:03 compute-1 podman[217175]: 2026-01-21 23:53:03.598956783 +0000 UTC m=+0.081019347 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 23:53:03 compute-1 podman[217176]: 2026-01-21 23:53:03.609678877 +0000 UTC m=+0.088554792 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:53:05 compute-1 nova_compute[182713]: 2026-01-21 23:53:05.588 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:06 compute-1 nova_compute[182713]: 2026-01-21 23:53:06.847 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:10 compute-1 nova_compute[182713]: 2026-01-21 23:53:10.624 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:11 compute-1 ovn_controller[94841]: 2026-01-21T23:53:11Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:05:f7:b6 10.100.0.3
Jan 21 23:53:11 compute-1 ovn_controller[94841]: 2026-01-21T23:53:11Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:f7:b6 10.100.0.3
Jan 21 23:53:11 compute-1 nova_compute[182713]: 2026-01-21 23:53:11.851 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:15 compute-1 nova_compute[182713]: 2026-01-21 23:53:15.659 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:16 compute-1 nova_compute[182713]: 2026-01-21 23:53:16.236 182717 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:53:16 compute-1 nova_compute[182713]: 2026-01-21 23:53:16.237 182717 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquired lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:53:16 compute-1 nova_compute[182713]: 2026-01-21 23:53:16.238 182717 DEBUG nova.network.neutron [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:53:16 compute-1 podman[217237]: 2026-01-21 23:53:16.63070114 +0000 UTC m=+0.109986662 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 21 23:53:16 compute-1 nova_compute[182713]: 2026-01-21 23:53:16.894 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:19 compute-1 nova_compute[182713]: 2026-01-21 23:53:19.165 182717 DEBUG nova.network.neutron [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating instance_info_cache with network_info: [{"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:53:19 compute-1 nova_compute[182713]: 2026-01-21 23:53:19.220 182717 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Releasing lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:53:19 compute-1 nova_compute[182713]: 2026-01-21 23:53:19.440 182717 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 21 23:53:19 compute-1 nova_compute[182713]: 2026-01-21 23:53:19.441 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Creating file /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/b35939f6fed4456990c874515fb27bf2.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 21 23:53:19 compute-1 nova_compute[182713]: 2026-01-21 23:53:19.441 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/b35939f6fed4456990c874515fb27bf2.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:19 compute-1 podman[217259]: 2026-01-21 23:53:19.612641396 +0000 UTC m=+0.094970262 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Jan 21 23:53:19 compute-1 nova_compute[182713]: 2026-01-21 23:53:19.992 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/b35939f6fed4456990c874515fb27bf2.tmp" returned: 1 in 0.551s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:19 compute-1 nova_compute[182713]: 2026-01-21 23:53:19.994 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/b35939f6fed4456990c874515fb27bf2.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 23:53:19 compute-1 nova_compute[182713]: 2026-01-21 23:53:19.994 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Creating directory /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 21 23:53:19 compute-1 nova_compute[182713]: 2026-01-21 23:53:19.995 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:20 compute-1 nova_compute[182713]: 2026-01-21 23:53:20.235 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" returned: 0 in 0.240s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:20 compute-1 nova_compute[182713]: 2026-01-21 23:53:20.242 182717 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:53:20 compute-1 nova_compute[182713]: 2026-01-21 23:53:20.662 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:21 compute-1 nova_compute[182713]: 2026-01-21 23:53:21.897 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:22 compute-1 kernel: tapbc3d02f6-e1 (unregistering): left promiscuous mode
Jan 21 23:53:22 compute-1 NetworkManager[54952]: <info>  [1769039602.4821] device (tapbc3d02f6-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:53:22 compute-1 nova_compute[182713]: 2026-01-21 23:53:22.489 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:22 compute-1 ovn_controller[94841]: 2026-01-21T23:53:22Z|00125|binding|INFO|Releasing lport bc3d02f6-e146-4659-b018-41d3813ed1c3 from this chassis (sb_readonly=0)
Jan 21 23:53:22 compute-1 ovn_controller[94841]: 2026-01-21T23:53:22Z|00126|binding|INFO|Setting lport bc3d02f6-e146-4659-b018-41d3813ed1c3 down in Southbound
Jan 21 23:53:22 compute-1 ovn_controller[94841]: 2026-01-21T23:53:22Z|00127|binding|INFO|Removing iface tapbc3d02f6-e1 ovn-installed in OVS
Jan 21 23:53:22 compute-1 nova_compute[182713]: 2026-01-21 23:53:22.492 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.500 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:f7:b6 10.100.0.3'], port_security=['fa:16:3e:05:f7:b6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '40bd1cc4-d1de-4488-8160-e6d4f5fce4bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=bc3d02f6-e146-4659-b018-41d3813ed1c3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.502 104184 INFO neutron.agent.ovn.metadata.agent [-] Port bc3d02f6-e146-4659-b018-41d3813ed1c3 in datapath 7b586c54-3322-410f-9bc9-972a63b8deff unbound from our chassis
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.505 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b586c54-3322-410f-9bc9-972a63b8deff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.508 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[acedf0f1-a66a-4f52-9fa1-5b02199c8fd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.511 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace which is not needed anymore
Jan 21 23:53:22 compute-1 nova_compute[182713]: 2026-01-21 23:53:22.523 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:22 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000031.scope: Deactivated successfully.
Jan 21 23:53:22 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000031.scope: Consumed 14.015s CPU time.
Jan 21 23:53:22 compute-1 systemd-machined[153970]: Machine qemu-22-instance-00000031 terminated.
Jan 21 23:53:22 compute-1 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217160]: [NOTICE]   (217164) : haproxy version is 2.8.14-c23fe91
Jan 21 23:53:22 compute-1 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217160]: [NOTICE]   (217164) : path to executable is /usr/sbin/haproxy
Jan 21 23:53:22 compute-1 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217160]: [WARNING]  (217164) : Exiting Master process...
Jan 21 23:53:22 compute-1 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217160]: [WARNING]  (217164) : Exiting Master process...
Jan 21 23:53:22 compute-1 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217160]: [ALERT]    (217164) : Current worker (217166) exited with code 143 (Terminated)
Jan 21 23:53:22 compute-1 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217160]: [WARNING]  (217164) : All workers exited. Exiting... (0)
Jan 21 23:53:22 compute-1 systemd[1]: libpod-022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060.scope: Deactivated successfully.
Jan 21 23:53:22 compute-1 podman[217306]: 2026-01-21 23:53:22.699542547 +0000 UTC m=+0.054699766 container died 022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:53:22 compute-1 systemd[1]: var-lib-containers-storage-overlay-84de826167cf13ea4db1c36b0aedc89793fde5e8a384896095a7e90d4f8e839e-merged.mount: Deactivated successfully.
Jan 21 23:53:22 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060-userdata-shm.mount: Deactivated successfully.
Jan 21 23:53:22 compute-1 podman[217306]: 2026-01-21 23:53:22.742559989 +0000 UTC m=+0.097717208 container cleanup 022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 23:53:22 compute-1 systemd[1]: libpod-conmon-022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060.scope: Deactivated successfully.
Jan 21 23:53:22 compute-1 podman[217355]: 2026-01-21 23:53:22.81474383 +0000 UTC m=+0.044816558 container remove 022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.821 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc20834-941e-49a2-a094-94713532d430]: (4, ('Wed Jan 21 11:53:22 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060)\n022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060\nWed Jan 21 11:53:22 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060)\n022ddbabe4cdf4dfbf64f52356ae81c162e684dc5a07ac33cebcd2cf30f5c060\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.823 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[add0b5a4-d110-4c61-b792-1fdae3a880a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.824 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:53:22 compute-1 nova_compute[182713]: 2026-01-21 23:53:22.826 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:22 compute-1 kernel: tap7b586c54-30: left promiscuous mode
Jan 21 23:53:22 compute-1 nova_compute[182713]: 2026-01-21 23:53:22.845 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.850 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ff160370-f5ec-48da-89d5-6a82396c6110]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:22 compute-1 nova_compute[182713]: 2026-01-21 23:53:22.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:22 compute-1 nova_compute[182713]: 2026-01-21 23:53:22.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.864 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[98eb5813-acf0-4e0d-be28-02bd8acd6850]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.865 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ca13e4ca-7609-45ba-ac4f-bde15e70b221]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.885 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5de5c0db-5f9b-4787-9d24-23165aafaec5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415101, 'reachable_time': 24547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217375, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.888 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:53:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:22.888 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[422b3fc4-f031-4811-a4ed-7fabe5cc6dc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:22 compute-1 systemd[1]: run-netns-ovnmeta\x2d7b586c54\x2d3322\x2d410f\x2d9bc9\x2d972a63b8deff.mount: Deactivated successfully.
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.263 182717 INFO nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Instance shutdown successfully after 3 seconds.
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.271 182717 INFO nova.virt.libvirt.driver [-] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Instance destroyed successfully.
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.273 182717 DEBUG nova.virt.libvirt.vif [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:52:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-732234770',display_name='tempest-ServerDiskConfigTestJSON-server-732234770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-732234770',id=49,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:52:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-7yxmtfzt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:53:15Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=40bd1cc4-d1de-4488-8160-e6d4f5fce4bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:05:f7:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.275 182717 DEBUG nova.network.os_vif_util [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:05:f7:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.277 182717 DEBUG nova.network.os_vif_util [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.278 182717 DEBUG os_vif [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.282 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.283 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc3d02f6-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.286 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.289 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.295 182717 INFO os_vif [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1')
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.299 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.398 182717 DEBUG nova.compute.manager [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-unplugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.399 182717 DEBUG oslo_concurrency.lockutils [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.403 182717 DEBUG oslo_concurrency.lockutils [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.404 182717 DEBUG oslo_concurrency.lockutils [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.405 182717 DEBUG nova.compute.manager [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] No waiting events found dispatching network-vif-unplugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.405 182717 WARNING nova.compute.manager [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received unexpected event network-vif-unplugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 for instance with vm_state active and task_state resize_migrating.
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.407 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk --force-share --output=json" returned: 0 in 0.108s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.408 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.497 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.500 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Copying file /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc_resize/disk to 192.168.122.102:/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.501 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc_resize/disk 192.168.122.102:/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:23 compute-1 nova_compute[182713]: 2026-01-21 23:53:23.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:24 compute-1 nova_compute[182713]: 2026-01-21 23:53:24.098 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "scp -r /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc_resize/disk 192.168.122.102:/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk" returned: 0 in 0.598s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:24 compute-1 nova_compute[182713]: 2026-01-21 23:53:24.099 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Copying file /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:53:24 compute-1 nova_compute[182713]: 2026-01-21 23:53:24.100 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc_resize/disk.config 192.168.122.102:/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:24 compute-1 nova_compute[182713]: 2026-01-21 23:53:24.366 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "scp -C -r /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc_resize/disk.config 192.168.122.102:/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.config" returned: 0 in 0.266s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:24 compute-1 nova_compute[182713]: 2026-01-21 23:53:24.368 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Copying file /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:53:24 compute-1 nova_compute[182713]: 2026-01-21 23:53:24.369 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc_resize/disk.info 192.168.122.102:/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:24 compute-1 nova_compute[182713]: 2026-01-21 23:53:24.639 182717 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "scp -C -r /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc_resize/disk.info 192.168.122.102:/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.info" returned: 0 in 0.270s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.095 182717 DEBUG neutronclient.v2_0.client [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port bc3d02f6-e146-4659-b018-41d3813ed1c3 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.311 182717 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.312 182717 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.312 182717 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.578 182717 DEBUG nova.compute.manager [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.579 182717 DEBUG oslo_concurrency.lockutils [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.579 182717 DEBUG oslo_concurrency.lockutils [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.580 182717 DEBUG oslo_concurrency.lockutils [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.580 182717 DEBUG nova.compute.manager [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] No waiting events found dispatching network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.580 182717 WARNING nova.compute.manager [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received unexpected event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 for instance with vm_state active and task_state resize_migrated.
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.664 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:25 compute-1 nova_compute[182713]: 2026-01-21 23:53:25.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:26 compute-1 nova_compute[182713]: 2026-01-21 23:53:26.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:26 compute-1 nova_compute[182713]: 2026-01-21 23:53:26.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:26 compute-1 nova_compute[182713]: 2026-01-21 23:53:26.882 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:27 compute-1 nova_compute[182713]: 2026-01-21 23:53:27.341 182717 DEBUG nova.compute.manager [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-changed-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:53:27 compute-1 nova_compute[182713]: 2026-01-21 23:53:27.341 182717 DEBUG nova.compute.manager [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Refreshing instance network info cache due to event network-changed-bc3d02f6-e146-4659-b018-41d3813ed1c3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:53:27 compute-1 nova_compute[182713]: 2026-01-21 23:53:27.341 182717 DEBUG oslo_concurrency.lockutils [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:53:27 compute-1 nova_compute[182713]: 2026-01-21 23:53:27.341 182717 DEBUG oslo_concurrency.lockutils [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:53:27 compute-1 nova_compute[182713]: 2026-01-21 23:53:27.342 182717 DEBUG nova.network.neutron [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Refreshing network info cache for port bc3d02f6-e146-4659-b018-41d3813ed1c3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:53:27 compute-1 podman[217389]: 2026-01-21 23:53:27.567824012 +0000 UTC m=+0.056199174 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:53:27 compute-1 podman[217388]: 2026-01-21 23:53:27.608509321 +0000 UTC m=+0.098942227 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:53:27 compute-1 nova_compute[182713]: 2026-01-21 23:53:27.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:27 compute-1 nova_compute[182713]: 2026-01-21 23:53:27.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:27 compute-1 nova_compute[182713]: 2026-01-21 23:53:27.891 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:27 compute-1 nova_compute[182713]: 2026-01-21 23:53:27.892 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:27 compute-1 nova_compute[182713]: 2026-01-21 23:53:27.892 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:27 compute-1 nova_compute[182713]: 2026-01-21 23:53:27.892 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.012 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000031, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.202 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.203 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5704MB free_disk=73.28041076660156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.204 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.204 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.312 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Migration for instance 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.336 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.362 182717 INFO nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating resource usage from migration 7a6f14fa-de30-444c-a296-3a3cc19a7a58
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.363 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Starting to track outgoing migration 7a6f14fa-de30-444c-a296-3a3cc19a7a58 with flavor c3389c03-89c4-4ff5-9e03-1a99d41713d4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.404 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Migration 7a6f14fa-de30-444c-a296-3a3cc19a7a58 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.405 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.405 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.547 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.579 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.633 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:53:28 compute-1 nova_compute[182713]: 2026-01-21 23:53:28.634 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:29 compute-1 nova_compute[182713]: 2026-01-21 23:53:29.635 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:29 compute-1 nova_compute[182713]: 2026-01-21 23:53:29.636 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:53:29 compute-1 nova_compute[182713]: 2026-01-21 23:53:29.636 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:53:29 compute-1 nova_compute[182713]: 2026-01-21 23:53:29.655 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:53:30 compute-1 nova_compute[182713]: 2026-01-21 23:53:30.696 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:31 compute-1 nova_compute[182713]: 2026-01-21 23:53:31.623 182717 DEBUG nova.network.neutron [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updated VIF entry in instance network info cache for port bc3d02f6-e146-4659-b018-41d3813ed1c3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:53:31 compute-1 nova_compute[182713]: 2026-01-21 23:53:31.624 182717 DEBUG nova.network.neutron [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating instance_info_cache with network_info: [{"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:53:31 compute-1 nova_compute[182713]: 2026-01-21 23:53:31.660 182717 DEBUG oslo_concurrency.lockutils [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:53:33 compute-1 nova_compute[182713]: 2026-01-21 23:53:33.338 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:34 compute-1 nova_compute[182713]: 2026-01-21 23:53:34.103 182717 DEBUG nova.compute.manager [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:53:34 compute-1 nova_compute[182713]: 2026-01-21 23:53:34.104 182717 DEBUG oslo_concurrency.lockutils [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:34 compute-1 nova_compute[182713]: 2026-01-21 23:53:34.105 182717 DEBUG oslo_concurrency.lockutils [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:34 compute-1 nova_compute[182713]: 2026-01-21 23:53:34.105 182717 DEBUG oslo_concurrency.lockutils [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:34 compute-1 nova_compute[182713]: 2026-01-21 23:53:34.106 182717 DEBUG nova.compute.manager [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] No waiting events found dispatching network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:53:34 compute-1 nova_compute[182713]: 2026-01-21 23:53:34.106 182717 WARNING nova.compute.manager [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received unexpected event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 for instance with vm_state active and task_state resize_finish.
Jan 21 23:53:34 compute-1 podman[217438]: 2026-01-21 23:53:34.618981123 +0000 UTC m=+0.088922404 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:53:34 compute-1 podman[217437]: 2026-01-21 23:53:34.628293824 +0000 UTC m=+0.102798267 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:53:35 compute-1 nova_compute[182713]: 2026-01-21 23:53:35.699 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:36 compute-1 nova_compute[182713]: 2026-01-21 23:53:36.345 182717 DEBUG nova.compute.manager [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:53:36 compute-1 nova_compute[182713]: 2026-01-21 23:53:36.346 182717 DEBUG oslo_concurrency.lockutils [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:36 compute-1 nova_compute[182713]: 2026-01-21 23:53:36.347 182717 DEBUG oslo_concurrency.lockutils [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:36 compute-1 nova_compute[182713]: 2026-01-21 23:53:36.347 182717 DEBUG oslo_concurrency.lockutils [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:36 compute-1 nova_compute[182713]: 2026-01-21 23:53:36.348 182717 DEBUG nova.compute.manager [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] No waiting events found dispatching network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:53:36 compute-1 nova_compute[182713]: 2026-01-21 23:53:36.348 182717 WARNING nova.compute.manager [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received unexpected event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 for instance with vm_state resized and task_state None.
Jan 21 23:53:36 compute-1 nova_compute[182713]: 2026-01-21 23:53:36.519 182717 DEBUG oslo_concurrency.lockutils [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:36 compute-1 nova_compute[182713]: 2026-01-21 23:53:36.520 182717 DEBUG oslo_concurrency.lockutils [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:36 compute-1 nova_compute[182713]: 2026-01-21 23:53:36.520 182717 DEBUG nova.compute.manager [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Going to confirm migration 11 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 21 23:53:36 compute-1 nova_compute[182713]: 2026-01-21 23:53:36.590 182717 DEBUG nova.objects.instance [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'info_cache' on Instance uuid 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:53:37 compute-1 nova_compute[182713]: 2026-01-21 23:53:37.621 182717 DEBUG neutronclient.v2_0.client [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port bc3d02f6-e146-4659-b018-41d3813ed1c3 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 21 23:53:37 compute-1 nova_compute[182713]: 2026-01-21 23:53:37.622 182717 DEBUG oslo_concurrency.lockutils [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:53:37 compute-1 nova_compute[182713]: 2026-01-21 23:53:37.622 182717 DEBUG oslo_concurrency.lockutils [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquired lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:53:37 compute-1 nova_compute[182713]: 2026-01-21 23:53:37.623 182717 DEBUG nova.network.neutron [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:53:37 compute-1 nova_compute[182713]: 2026-01-21 23:53:37.762 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039602.7607083, 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:53:37 compute-1 nova_compute[182713]: 2026-01-21 23:53:37.763 182717 INFO nova.compute.manager [-] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] VM Stopped (Lifecycle Event)
Jan 21 23:53:37 compute-1 nova_compute[182713]: 2026-01-21 23:53:37.808 182717 DEBUG nova.compute.manager [None req-ab317acc-e065-4a64-9b82-ef255328f8e6 - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:53:37 compute-1 nova_compute[182713]: 2026-01-21 23:53:37.812 182717 DEBUG nova.compute.manager [None req-ab317acc-e065-4a64-9b82-ef255328f8e6 - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:53:37 compute-1 nova_compute[182713]: 2026-01-21 23:53:37.837 182717 INFO nova.compute.manager [None req-ab317acc-e065-4a64-9b82-ef255328f8e6 - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com
Jan 21 23:53:38 compute-1 nova_compute[182713]: 2026-01-21 23:53:38.340 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:40 compute-1 nova_compute[182713]: 2026-01-21 23:53:40.701 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.344 182717 DEBUG nova.network.neutron [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating instance_info_cache with network_info: [{"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.371 182717 DEBUG oslo_concurrency.lockutils [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Releasing lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.371 182717 DEBUG nova.objects.instance [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'migration_context' on Instance uuid 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.405 182717 DEBUG nova.virt.libvirt.vif [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:52:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-732234770',display_name='tempest-ServerDiskConfigTestJSON-server-732234770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-732234770',id=49,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:53:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-7yxmtfzt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:53:34Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=40bd1cc4-d1de-4488-8160-e6d4f5fce4bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.406 182717 DEBUG nova.network.os_vif_util [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.407 182717 DEBUG nova.network.os_vif_util [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.408 182717 DEBUG os_vif [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.411 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.412 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc3d02f6-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.412 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.418 182717 INFO os_vif [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1')
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.419 182717 DEBUG oslo_concurrency.lockutils [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.419 182717 DEBUG oslo_concurrency.lockutils [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.550 182717 DEBUG nova.compute.provider_tree [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.584 182717 DEBUG nova.scheduler.client.report [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.656 182717 DEBUG oslo_concurrency.lockutils [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:42 compute-1 nova_compute[182713]: 2026-01-21 23:53:42.940 182717 INFO nova.scheduler.client.report [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Deleted allocation for migration 7a6f14fa-de30-444c-a296-3a3cc19a7a58
Jan 21 23:53:43 compute-1 nova_compute[182713]: 2026-01-21 23:53:43.061 182717 DEBUG oslo_concurrency.lockutils [None req-a3fe6ce7-b0cb-4880-8153-71d18d6b292c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 6.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:43 compute-1 nova_compute[182713]: 2026-01-21 23:53:43.343 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:45 compute-1 nova_compute[182713]: 2026-01-21 23:53:45.704 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:47 compute-1 podman[217482]: 2026-01-21 23:53:47.641944225 +0000 UTC m=+0.123569225 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:53:48 compute-1 nova_compute[182713]: 2026-01-21 23:53:48.377 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:50 compute-1 podman[217503]: 2026-01-21 23:53:50.593130293 +0000 UTC m=+0.084971922 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Jan 21 23:53:50 compute-1 nova_compute[182713]: 2026-01-21 23:53:50.745 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:53 compute-1 nova_compute[182713]: 2026-01-21 23:53:53.380 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:55 compute-1 nova_compute[182713]: 2026-01-21 23:53:55.747 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:57.779 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:53:57 compute-1 nova_compute[182713]: 2026-01-21 23:53:57.780 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:57 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:53:57.781 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:53:58 compute-1 nova_compute[182713]: 2026-01-21 23:53:58.436 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:58 compute-1 podman[217526]: 2026-01-21 23:53:58.587112508 +0000 UTC m=+0.063516831 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:53:58 compute-1 podman[217525]: 2026-01-21 23:53:58.607552136 +0000 UTC m=+0.091866166 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 21 23:54:00 compute-1 nova_compute[182713]: 2026-01-21 23:54:00.677 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "827ad99f-45db-4d46-9b29-e7093f18eac8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:00 compute-1 nova_compute[182713]: 2026-01-21 23:54:00.678 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:00 compute-1 nova_compute[182713]: 2026-01-21 23:54:00.728 182717 DEBUG nova.compute.manager [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:54:00 compute-1 nova_compute[182713]: 2026-01-21 23:54:00.749 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:00 compute-1 nova_compute[182713]: 2026-01-21 23:54:00.941 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:00 compute-1 nova_compute[182713]: 2026-01-21 23:54:00.942 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:00 compute-1 nova_compute[182713]: 2026-01-21 23:54:00.957 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:54:00 compute-1 nova_compute[182713]: 2026-01-21 23:54:00.957 182717 INFO nova.compute.claims [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.164 182717 DEBUG nova.scheduler.client.report [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.206 182717 DEBUG nova.scheduler.client.report [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.206 182717 DEBUG nova.compute.provider_tree [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.251 182717 DEBUG nova.scheduler.client.report [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.364 182717 DEBUG nova.scheduler.client.report [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.474 182717 DEBUG nova.compute.provider_tree [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.699 182717 DEBUG nova.scheduler.client.report [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.732 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.790s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.733 182717 DEBUG nova.compute.manager [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:54:01 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.825 182717 DEBUG nova.compute.manager [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.826 182717 DEBUG nova.network.neutron [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.852 182717 INFO nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:54:01 compute-1 nova_compute[182713]: 2026-01-21 23:54:01.914 182717 DEBUG nova.compute.manager [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.147 182717 DEBUG nova.compute.manager [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.149 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.150 182717 INFO nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Creating image(s)
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.151 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "/var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.152 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.153 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.184 182717 DEBUG nova.policy [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.187 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.282 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.283 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.284 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.294 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.347 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.349 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.383 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.384 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.384 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.437 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.439 182717 DEBUG nova.virt.disk.api [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Checking if we can resize image /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.440 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.531 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.533 182717 DEBUG nova.virt.disk.api [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Cannot resize image /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.534 182717 DEBUG nova.objects.instance [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'migration_context' on Instance uuid 827ad99f-45db-4d46-9b29-e7093f18eac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.551 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.552 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Ensure instance console log exists: /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.552 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.553 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:02 compute-1 nova_compute[182713]: 2026-01-21 23:54:02.553 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:03.001 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:03.001 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:03.001 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:03 compute-1 nova_compute[182713]: 2026-01-21 23:54:03.384 182717 DEBUG nova.network.neutron [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Successfully created port: 77c99e81-8985-4b57-a59c-7cf8f9c949da _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:54:03 compute-1 nova_compute[182713]: 2026-01-21 23:54:03.438 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:04 compute-1 nova_compute[182713]: 2026-01-21 23:54:04.576 182717 DEBUG nova.network.neutron [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Successfully updated port: 77c99e81-8985-4b57-a59c-7cf8f9c949da _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:54:04 compute-1 nova_compute[182713]: 2026-01-21 23:54:04.600 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "refresh_cache-827ad99f-45db-4d46-9b29-e7093f18eac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:54:04 compute-1 nova_compute[182713]: 2026-01-21 23:54:04.601 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquired lock "refresh_cache-827ad99f-45db-4d46-9b29-e7093f18eac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:54:04 compute-1 nova_compute[182713]: 2026-01-21 23:54:04.601 182717 DEBUG nova.network.neutron [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:54:04 compute-1 nova_compute[182713]: 2026-01-21 23:54:04.912 182717 DEBUG nova.network.neutron [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:54:05 compute-1 nova_compute[182713]: 2026-01-21 23:54:05.437 182717 DEBUG nova.compute.manager [req-4475f072-85e5-45bd-8d33-dd2d545ebbaf req-26cc4829-fd3c-4b24-897f-0925fb69ab00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Received event network-changed-77c99e81-8985-4b57-a59c-7cf8f9c949da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:05 compute-1 nova_compute[182713]: 2026-01-21 23:54:05.438 182717 DEBUG nova.compute.manager [req-4475f072-85e5-45bd-8d33-dd2d545ebbaf req-26cc4829-fd3c-4b24-897f-0925fb69ab00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Refreshing instance network info cache due to event network-changed-77c99e81-8985-4b57-a59c-7cf8f9c949da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:54:05 compute-1 nova_compute[182713]: 2026-01-21 23:54:05.438 182717 DEBUG oslo_concurrency.lockutils [req-4475f072-85e5-45bd-8d33-dd2d545ebbaf req-26cc4829-fd3c-4b24-897f-0925fb69ab00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-827ad99f-45db-4d46-9b29-e7093f18eac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:54:05 compute-1 podman[217592]: 2026-01-21 23:54:05.593511323 +0000 UTC m=+0.079562332 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:54:05 compute-1 podman[217593]: 2026-01-21 23:54:05.604755824 +0000 UTC m=+0.081664047 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:54:05 compute-1 nova_compute[182713]: 2026-01-21 23:54:05.753 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.654 182717 DEBUG nova.network.neutron [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Updating instance_info_cache with network_info: [{"id": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "address": "fa:16:3e:ae:70:85", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77c99e81-89", "ovs_interfaceid": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.708 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Releasing lock "refresh_cache-827ad99f-45db-4d46-9b29-e7093f18eac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.709 182717 DEBUG nova.compute.manager [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Instance network_info: |[{"id": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "address": "fa:16:3e:ae:70:85", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77c99e81-89", "ovs_interfaceid": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.709 182717 DEBUG oslo_concurrency.lockutils [req-4475f072-85e5-45bd-8d33-dd2d545ebbaf req-26cc4829-fd3c-4b24-897f-0925fb69ab00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-827ad99f-45db-4d46-9b29-e7093f18eac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.710 182717 DEBUG nova.network.neutron [req-4475f072-85e5-45bd-8d33-dd2d545ebbaf req-26cc4829-fd3c-4b24-897f-0925fb69ab00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Refreshing network info cache for port 77c99e81-8985-4b57-a59c-7cf8f9c949da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.715 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Start _get_guest_xml network_info=[{"id": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "address": "fa:16:3e:ae:70:85", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77c99e81-89", "ovs_interfaceid": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.724 182717 WARNING nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.733 182717 DEBUG nova.virt.libvirt.host [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.734 182717 DEBUG nova.virt.libvirt.host [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.744 182717 DEBUG nova.virt.libvirt.host [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.746 182717 DEBUG nova.virt.libvirt.host [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.748 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.749 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.750 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.750 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.751 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.751 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.752 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.752 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.752 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.752 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.753 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.753 182717 DEBUG nova.virt.hardware [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.758 182717 DEBUG nova.virt.libvirt.vif [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:53:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-347423779',display_name='tempest-ImagesTestJSON-server-347423779',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-347423779',id=52,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-9d5rx5u8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:54:01Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=827ad99f-45db-4d46-9b29-e7093f18eac8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "address": "fa:16:3e:ae:70:85", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77c99e81-89", "ovs_interfaceid": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.758 182717 DEBUG nova.network.os_vif_util [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "address": "fa:16:3e:ae:70:85", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77c99e81-89", "ovs_interfaceid": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.759 182717 DEBUG nova.network.os_vif_util [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:70:85,bridge_name='br-int',has_traffic_filtering=True,id=77c99e81-8985-4b57-a59c-7cf8f9c949da,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77c99e81-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.760 182717 DEBUG nova.objects.instance [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'pci_devices' on Instance uuid 827ad99f-45db-4d46-9b29-e7093f18eac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:07 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:07.784 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.798 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:54:07 compute-1 nova_compute[182713]:   <uuid>827ad99f-45db-4d46-9b29-e7093f18eac8</uuid>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   <name>instance-00000034</name>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <nova:name>tempest-ImagesTestJSON-server-347423779</nova:name>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:54:07</nova:creationTime>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:54:07 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:54:07 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:54:07 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:54:07 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:54:07 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:54:07 compute-1 nova_compute[182713]:         <nova:user uuid="6eb1bcf645844eaca088761a04e59542">tempest-ImagesTestJSON-126431515-project-member</nova:user>
Jan 21 23:54:07 compute-1 nova_compute[182713]:         <nova:project uuid="63e5713bcd4c429796b251487b6136bc">tempest-ImagesTestJSON-126431515</nova:project>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:54:07 compute-1 nova_compute[182713]:         <nova:port uuid="77c99e81-8985-4b57-a59c-7cf8f9c949da">
Jan 21 23:54:07 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <system>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <entry name="serial">827ad99f-45db-4d46-9b29-e7093f18eac8</entry>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <entry name="uuid">827ad99f-45db-4d46-9b29-e7093f18eac8</entry>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     </system>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   <os>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   </os>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   <features>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   </features>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk.config"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:ae:70:85"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <target dev="tap77c99e81-89"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/console.log" append="off"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <video>
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     </video>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:54:07 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:54:07 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:54:07 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:54:07 compute-1 nova_compute[182713]: </domain>
Jan 21 23:54:07 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.800 182717 DEBUG nova.compute.manager [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Preparing to wait for external event network-vif-plugged-77c99e81-8985-4b57-a59c-7cf8f9c949da prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.800 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.800 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.801 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.801 182717 DEBUG nova.virt.libvirt.vif [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:53:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-347423779',display_name='tempest-ImagesTestJSON-server-347423779',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-347423779',id=52,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-9d5rx5u8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:54:01Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=827ad99f-45db-4d46-9b29-e7093f18eac8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "address": "fa:16:3e:ae:70:85", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77c99e81-89", "ovs_interfaceid": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.802 182717 DEBUG nova.network.os_vif_util [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "address": "fa:16:3e:ae:70:85", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77c99e81-89", "ovs_interfaceid": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.802 182717 DEBUG nova.network.os_vif_util [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:70:85,bridge_name='br-int',has_traffic_filtering=True,id=77c99e81-8985-4b57-a59c-7cf8f9c949da,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77c99e81-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.803 182717 DEBUG os_vif [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:70:85,bridge_name='br-int',has_traffic_filtering=True,id=77c99e81-8985-4b57-a59c-7cf8f9c949da,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77c99e81-89') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.803 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.804 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.804 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.809 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.809 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77c99e81-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.810 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap77c99e81-89, col_values=(('external_ids', {'iface-id': '77c99e81-8985-4b57-a59c-7cf8f9c949da', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:70:85', 'vm-uuid': '827ad99f-45db-4d46-9b29-e7093f18eac8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.811 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.813 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:54:07 compute-1 NetworkManager[54952]: <info>  [1769039647.8130] manager: (tap77c99e81-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.820 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.821 182717 INFO os_vif [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:70:85,bridge_name='br-int',has_traffic_filtering=True,id=77c99e81-8985-4b57-a59c-7cf8f9c949da,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77c99e81-89')
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.951 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.952 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.953 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No VIF found with MAC fa:16:3e:ae:70:85, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:54:07 compute-1 nova_compute[182713]: 2026-01-21 23:54:07.954 182717 INFO nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Using config drive
Jan 21 23:54:09 compute-1 nova_compute[182713]: 2026-01-21 23:54:09.587 182717 INFO nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Creating config drive at /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk.config
Jan 21 23:54:09 compute-1 nova_compute[182713]: 2026-01-21 23:54:09.596 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmply5mly3a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:09 compute-1 nova_compute[182713]: 2026-01-21 23:54:09.738 182717 DEBUG oslo_concurrency.processutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmply5mly3a" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:09 compute-1 kernel: tap77c99e81-89: entered promiscuous mode
Jan 21 23:54:09 compute-1 NetworkManager[54952]: <info>  [1769039649.8313] manager: (tap77c99e81-89): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Jan 21 23:54:09 compute-1 ovn_controller[94841]: 2026-01-21T23:54:09Z|00128|binding|INFO|Claiming lport 77c99e81-8985-4b57-a59c-7cf8f9c949da for this chassis.
Jan 21 23:54:09 compute-1 nova_compute[182713]: 2026-01-21 23:54:09.833 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:09 compute-1 ovn_controller[94841]: 2026-01-21T23:54:09Z|00129|binding|INFO|77c99e81-8985-4b57-a59c-7cf8f9c949da: Claiming fa:16:3e:ae:70:85 10.100.0.5
Jan 21 23:54:09 compute-1 nova_compute[182713]: 2026-01-21 23:54:09.841 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.868 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:70:85 10.100.0.5'], port_security=['fa:16:3e:ae:70:85 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=77c99e81-8985-4b57-a59c-7cf8f9c949da) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.870 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 77c99e81-8985-4b57-a59c-7cf8f9c949da in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c bound to our chassis
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.872 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:54:09 compute-1 systemd-machined[153970]: New machine qemu-23-instance-00000034.
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.887 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[26e168bd-722e-4abb-987c-74631d7ec095]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.888 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap74e2da48-41 in ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.891 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap74e2da48-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.891 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4982636a-e4f1-4286-8e65-1a5e21b29ab7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.892 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[142c47a8-55e2-4b0a-bc52-38c555f28863]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.912 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[78895149-77bb-42d9-893a-93a1f158edda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:09 compute-1 systemd[1]: Started Virtual Machine qemu-23-instance-00000034.
Jan 21 23:54:09 compute-1 nova_compute[182713]: 2026-01-21 23:54:09.928 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:09 compute-1 ovn_controller[94841]: 2026-01-21T23:54:09Z|00130|binding|INFO|Setting lport 77c99e81-8985-4b57-a59c-7cf8f9c949da ovn-installed in OVS
Jan 21 23:54:09 compute-1 ovn_controller[94841]: 2026-01-21T23:54:09Z|00131|binding|INFO|Setting lport 77c99e81-8985-4b57-a59c-7cf8f9c949da up in Southbound
Jan 21 23:54:09 compute-1 nova_compute[182713]: 2026-01-21 23:54:09.934 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.945 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7e994ea3-bec0-4e62-9a91-b2847dfb52f8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:09 compute-1 systemd-udevd[217656]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:54:09 compute-1 NetworkManager[54952]: <info>  [1769039649.9617] device (tap77c99e81-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:54:09 compute-1 NetworkManager[54952]: <info>  [1769039649.9627] device (tap77c99e81-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.988 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[9a9b654a-41f4-496f-a005-0376fa41edec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:09 compute-1 NetworkManager[54952]: <info>  [1769039649.9951] manager: (tap74e2da48-40): new Veth device (/org/freedesktop/NetworkManager/Devices/68)
Jan 21 23:54:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:09.993 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[063df29a-63fd-402b-9d33-f0c011fd112d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.033 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f44a1a3f-6911-4a07-8bd9-fabb78424551]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.038 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[072cb8fa-9cc1-4c54-b366-170481b6db95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:10 compute-1 NetworkManager[54952]: <info>  [1769039650.0695] device (tap74e2da48-40): carrier: link connected
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.078 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[52bd1ce3-414c-4c9e-968a-65c2e779c348]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.106 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ba26af9e-5166-4141-9266-ea7bcf9a32dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422272, 'reachable_time': 36292, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217687, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.125 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5a83876e-6484-4dbe-9a15-05de5120028c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:7549'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 422272, 'tstamp': 422272}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217688, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.142 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7d812c31-c5d9-47d9-9329-736a0d462203]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422272, 'reachable_time': 36292, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217689, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.173 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5239bd96-ee34-4f62-9b98-32fa24d031c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.233 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[637087f6-a77e-463e-93f9-30bb0a846334]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.235 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.236 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.236 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74e2da48-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:10 compute-1 kernel: tap74e2da48-40: entered promiscuous mode
Jan 21 23:54:10 compute-1 NetworkManager[54952]: <info>  [1769039650.2401] manager: (tap74e2da48-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.243 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74e2da48-40, col_values=(('external_ids', {'iface-id': '5f8f321e-2942-4700-a50e-4b0628052c1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:10 compute-1 ovn_controller[94841]: 2026-01-21T23:54:10Z|00132|binding|INFO|Releasing lport 5f8f321e-2942-4700-a50e-4b0628052c1b from this chassis (sb_readonly=0)
Jan 21 23:54:10 compute-1 nova_compute[182713]: 2026-01-21 23:54:10.245 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.246 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.247 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c502136c-daeb-40e5-9ba2-4597e7d579d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.248 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:54:10 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:10.250 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'env', 'PROCESS_TAG=haproxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/74e2da48-44c2-4c6d-9597-6c47d6247f9c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:54:10 compute-1 nova_compute[182713]: 2026-01-21 23:54:10.260 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:10 compute-1 nova_compute[182713]: 2026-01-21 23:54:10.356 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039650.3552084, 827ad99f-45db-4d46-9b29-e7093f18eac8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:10 compute-1 nova_compute[182713]: 2026-01-21 23:54:10.356 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] VM Started (Lifecycle Event)
Jan 21 23:54:10 compute-1 podman[217728]: 2026-01-21 23:54:10.737237428 +0000 UTC m=+0.072960376 container create b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:54:10 compute-1 nova_compute[182713]: 2026-01-21 23:54:10.755 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:10 compute-1 systemd[1]: Started libpod-conmon-b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6.scope.
Jan 21 23:54:10 compute-1 podman[217728]: 2026-01-21 23:54:10.705423296 +0000 UTC m=+0.041146254 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:54:10 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:54:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14eb5c6b1ed2ce941f824044e0add73804e50c04d46961a057fb0330250e5a6c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:54:10 compute-1 podman[217728]: 2026-01-21 23:54:10.848324082 +0000 UTC m=+0.184047030 container init b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 21 23:54:10 compute-1 podman[217728]: 2026-01-21 23:54:10.860468582 +0000 UTC m=+0.196191500 container start b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 23:54:10 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[217743]: [NOTICE]   (217747) : New worker (217749) forked
Jan 21 23:54:10 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[217743]: [NOTICE]   (217747) : Loading success.
Jan 21 23:54:11 compute-1 nova_compute[182713]: 2026-01-21 23:54:11.017 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:11 compute-1 nova_compute[182713]: 2026-01-21 23:54:11.025 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039650.355394, 827ad99f-45db-4d46-9b29-e7093f18eac8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:11 compute-1 nova_compute[182713]: 2026-01-21 23:54:11.026 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] VM Paused (Lifecycle Event)
Jan 21 23:54:11 compute-1 nova_compute[182713]: 2026-01-21 23:54:11.115 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:11 compute-1 nova_compute[182713]: 2026-01-21 23:54:11.118 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:54:11 compute-1 nova_compute[182713]: 2026-01-21 23:54:11.157 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.554 182717 DEBUG nova.compute.manager [req-5b653b25-d664-4c87-8848-9e6d461157ab req-3136c5af-c186-4b3e-be29-bfe499f1f9e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Received event network-vif-plugged-77c99e81-8985-4b57-a59c-7cf8f9c949da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.554 182717 DEBUG oslo_concurrency.lockutils [req-5b653b25-d664-4c87-8848-9e6d461157ab req-3136c5af-c186-4b3e-be29-bfe499f1f9e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.555 182717 DEBUG oslo_concurrency.lockutils [req-5b653b25-d664-4c87-8848-9e6d461157ab req-3136c5af-c186-4b3e-be29-bfe499f1f9e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.555 182717 DEBUG oslo_concurrency.lockutils [req-5b653b25-d664-4c87-8848-9e6d461157ab req-3136c5af-c186-4b3e-be29-bfe499f1f9e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.556 182717 DEBUG nova.compute.manager [req-5b653b25-d664-4c87-8848-9e6d461157ab req-3136c5af-c186-4b3e-be29-bfe499f1f9e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Processing event network-vif-plugged-77c99e81-8985-4b57-a59c-7cf8f9c949da _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.557 182717 DEBUG nova.compute.manager [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.562 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039652.5616126, 827ad99f-45db-4d46-9b29-e7093f18eac8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.562 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] VM Resumed (Lifecycle Event)
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.564 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.569 182717 INFO nova.virt.libvirt.driver [-] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Instance spawned successfully.
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.570 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.611 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.620 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.626 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.627 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.627 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.628 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.629 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.630 182717 DEBUG nova.virt.libvirt.driver [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.683 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.736 182717 INFO nova.compute.manager [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Took 10.59 seconds to spawn the instance on the hypervisor.
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.737 182717 DEBUG nova.compute.manager [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.740 182717 DEBUG nova.network.neutron [req-4475f072-85e5-45bd-8d33-dd2d545ebbaf req-26cc4829-fd3c-4b24-897f-0925fb69ab00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Updated VIF entry in instance network info cache for port 77c99e81-8985-4b57-a59c-7cf8f9c949da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.741 182717 DEBUG nova.network.neutron [req-4475f072-85e5-45bd-8d33-dd2d545ebbaf req-26cc4829-fd3c-4b24-897f-0925fb69ab00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Updating instance_info_cache with network_info: [{"id": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "address": "fa:16:3e:ae:70:85", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77c99e81-89", "ovs_interfaceid": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.813 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:12 compute-1 nova_compute[182713]: 2026-01-21 23:54:12.862 182717 DEBUG oslo_concurrency.lockutils [req-4475f072-85e5-45bd-8d33-dd2d545ebbaf req-26cc4829-fd3c-4b24-897f-0925fb69ab00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-827ad99f-45db-4d46-9b29-e7093f18eac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:54:13 compute-1 nova_compute[182713]: 2026-01-21 23:54:13.023 182717 INFO nova.compute.manager [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Took 12.17 seconds to build instance.
Jan 21 23:54:13 compute-1 nova_compute[182713]: 2026-01-21 23:54:13.080 182717 DEBUG oslo_concurrency.lockutils [None req-db9bad5a-ec5f-4318-bb59-fb3ffb47a293 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.402s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:14 compute-1 nova_compute[182713]: 2026-01-21 23:54:14.803 182717 DEBUG nova.compute.manager [req-7242b570-f6fd-4f62-909c-30683736325b req-993fe327-4294-4e7f-b1c1-98be8e327be3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Received event network-vif-plugged-77c99e81-8985-4b57-a59c-7cf8f9c949da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:14 compute-1 nova_compute[182713]: 2026-01-21 23:54:14.803 182717 DEBUG oslo_concurrency.lockutils [req-7242b570-f6fd-4f62-909c-30683736325b req-993fe327-4294-4e7f-b1c1-98be8e327be3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:14 compute-1 nova_compute[182713]: 2026-01-21 23:54:14.804 182717 DEBUG oslo_concurrency.lockutils [req-7242b570-f6fd-4f62-909c-30683736325b req-993fe327-4294-4e7f-b1c1-98be8e327be3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:14 compute-1 nova_compute[182713]: 2026-01-21 23:54:14.804 182717 DEBUG oslo_concurrency.lockutils [req-7242b570-f6fd-4f62-909c-30683736325b req-993fe327-4294-4e7f-b1c1-98be8e327be3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:14 compute-1 nova_compute[182713]: 2026-01-21 23:54:14.805 182717 DEBUG nova.compute.manager [req-7242b570-f6fd-4f62-909c-30683736325b req-993fe327-4294-4e7f-b1c1-98be8e327be3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] No waiting events found dispatching network-vif-plugged-77c99e81-8985-4b57-a59c-7cf8f9c949da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:14 compute-1 nova_compute[182713]: 2026-01-21 23:54:14.805 182717 WARNING nova.compute.manager [req-7242b570-f6fd-4f62-909c-30683736325b req-993fe327-4294-4e7f-b1c1-98be8e327be3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Received unexpected event network-vif-plugged-77c99e81-8985-4b57-a59c-7cf8f9c949da for instance with vm_state active and task_state None.
Jan 21 23:54:15 compute-1 nova_compute[182713]: 2026-01-21 23:54:15.759 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:16 compute-1 nova_compute[182713]: 2026-01-21 23:54:16.450 182717 INFO nova.compute.manager [None req-c6b698b0-540d-470a-80f0-e04ebe9ff215 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Pausing
Jan 21 23:54:16 compute-1 nova_compute[182713]: 2026-01-21 23:54:16.453 182717 DEBUG nova.objects.instance [None req-c6b698b0-540d-470a-80f0-e04ebe9ff215 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'flavor' on Instance uuid 827ad99f-45db-4d46-9b29-e7093f18eac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:16 compute-1 nova_compute[182713]: 2026-01-21 23:54:16.604 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039656.6038902, 827ad99f-45db-4d46-9b29-e7093f18eac8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:16 compute-1 nova_compute[182713]: 2026-01-21 23:54:16.604 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] VM Paused (Lifecycle Event)
Jan 21 23:54:16 compute-1 nova_compute[182713]: 2026-01-21 23:54:16.607 182717 DEBUG nova.compute.manager [None req-c6b698b0-540d-470a-80f0-e04ebe9ff215 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:16 compute-1 nova_compute[182713]: 2026-01-21 23:54:16.650 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:16 compute-1 nova_compute[182713]: 2026-01-21 23:54:16.654 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:54:16 compute-1 nova_compute[182713]: 2026-01-21 23:54:16.752 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] During sync_power_state the instance has a pending task (pausing). Skip.
Jan 21 23:54:17 compute-1 nova_compute[182713]: 2026-01-21 23:54:17.817 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:18 compute-1 podman[217758]: 2026-01-21 23:54:18.668280152 +0000 UTC m=+0.122752570 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 21 23:54:20 compute-1 nova_compute[182713]: 2026-01-21 23:54:20.765 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:21 compute-1 podman[217780]: 2026-01-21 23:54:21.619307536 +0000 UTC m=+0.095714429 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350)
Jan 21 23:54:21 compute-1 nova_compute[182713]: 2026-01-21 23:54:21.657 182717 DEBUG nova.compute.manager [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:21 compute-1 nova_compute[182713]: 2026-01-21 23:54:21.770 182717 INFO nova.compute.manager [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] instance snapshotting
Jan 21 23:54:21 compute-1 nova_compute[182713]: 2026-01-21 23:54:21.771 182717 WARNING nova.compute.manager [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] trying to snapshot a non-running instance: (state: 3 expected: 1)
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.321 182717 INFO nova.virt.libvirt.driver [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Beginning live snapshot process
Jan 21 23:54:22 compute-1 virtqemud[182235]: invalid argument: disk vda does not have an active block job
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.561 182717 DEBUG oslo_concurrency.processutils [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.660 182717 DEBUG oslo_concurrency.processutils [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk --force-share --output=json -f qcow2" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.662 182717 DEBUG oslo_concurrency.processutils [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.749 182717 DEBUG oslo_concurrency.processutils [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk --force-share --output=json -f qcow2" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.781 182717 DEBUG oslo_concurrency.processutils [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.820 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.867 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'name': 'tempest-ImagesTestJSON-server-347423779', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000034', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '63e5713bcd4c429796b251487b6136bc', 'user_id': '6eb1bcf645844eaca088761a04e59542', 'hostId': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.869 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.872 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 827ad99f-45db-4d46-9b29-e7093f18eac8 / tap77c99e81-89 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.872 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.873 182717 DEBUG oslo_concurrency.processutils [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.874 182717 DEBUG oslo_concurrency.processutils [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpt0sthgu2/11884c1b149c4281b901fc135e67fe2c.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e59c4a3f-5727-40bf-80c1-075e8349a0ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000034-827ad99f-45db-4d46-9b29-e7093f18eac8-tap77c99e81-89', 'timestamp': '2026-01-21T23:54:22.869279', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'tap77c99e81-89', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:70:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77c99e81-89'}, 'message_id': '827f0fc2-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.57659638, 'message_signature': '0e8af43d5b9cd37bbc9ec88877b9434432c836c312f24a4e0e649d7fd0d75e2c'}]}, 'timestamp': '2026-01-21 23:54:22.873828', '_unique_id': '669115af6dc24176a62c42adf1c75c2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.876 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.878 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.897 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/cpu volume: 3820000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac8de9b3-82e5-48e6-aef4-a550945ba64a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3820000000, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'timestamp': '2026-01-21T23:54:22.878821', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8282c612-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.60450469, 'message_signature': '2e1ae83622bec7ac2c96d8c928de4a3d68f6902c4d62a42a03723cd009a26d51'}]}, 'timestamp': '2026-01-21 23:54:22.897970', '_unique_id': '68b3f61047dc4ff0b65266bc3c1b4f59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.899 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.900 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.900 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.900 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 827ad99f-45db-4d46-9b29-e7093f18eac8: ceilometer.compute.pollsters.NoVolumeException
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.900 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1d8f740-633b-4bd6-a7a1-ddef2883df5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000034-827ad99f-45db-4d46-9b29-e7093f18eac8-tap77c99e81-89', 'timestamp': '2026-01-21T23:54:22.900807', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'tap77c99e81-89', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:70:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77c99e81-89'}, 'message_id': '82834ad8-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.57659638, 'message_signature': 'c21c7f4826b6e58d4f763fb63beaf9c3fd5543aa81ec8cbd78ae06d7b5d184e4'}]}, 'timestamp': '2026-01-21 23:54:22.901344', '_unique_id': '522e527a839448c2bf4c09d245d90483'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.931 182717 DEBUG oslo_concurrency.processutils [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpt0sthgu2/11884c1b149c4281b901fc135e67fe2c.delta 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:22 compute-1 nova_compute[182713]: 2026-01-21 23:54:22.932 182717 INFO nova.virt.libvirt.driver [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.936 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.read.latency volume: 186472859 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.936 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.read.latency volume: 771204 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '524daf7b-5b20-480b-9942-c964ab92deb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 186472859, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-vda', 'timestamp': '2026-01-21T23:54:22.903005', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8288b7d4-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': '43e6a33f3cdc40f243c0488cdd6872c703d9659bf861c8267c39b34e3cf3eed2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 771204, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-sda', 'timestamp': '2026-01-21T23:54:22.903005', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8288c972-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': '772de57604cd213b608a07775060146b4161bfdbfd54cb1e5f0f379da11216cb'}]}, 'timestamp': '2026-01-21 23:54:22.937357', '_unique_id': '9e5cd22de298448186faa421a296e5e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.938 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.939 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.939 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.939 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-347423779>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-347423779>]
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.940 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.940 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.940 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-347423779>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-347423779>]
Jan 21 23:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:22.940 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 23:54:23 compute-1 nova_compute[182713]: 2026-01-21 23:54:23.000 182717 DEBUG nova.virt.libvirt.guest [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.007 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.008 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 nova_compute[182713]: 2026-01-21 23:54:23.011 182717 INFO nova.virt.libvirt.driver [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ad50b70-4b8f-46f0-a8d1-a8659a809446', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-vda', 'timestamp': '2026-01-21T23:54:22.941018', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82938cd6-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.648261327, 'message_signature': '7ad3da7fe9779b4022c35618787ad3624e9d2f1c0f2eec9d62d7a87926789d03'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-sda', 'timestamp': '2026-01-21T23:54:22.941018', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8293a2b6-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.648261327, 'message_signature': '5696781e707c79de5c0edf09732fc1c7cfaa6954b5b7efe5b8a9e9188ed9345d'}]}, 'timestamp': '2026-01-21 23:54:23.008432', '_unique_id': '1968e45f294c45519f52c044e9dcbfd8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.009 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.013 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.014 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5dc52a2a-c660-4300-bff5-5a8da28f375f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-vda', 'timestamp': '2026-01-21T23:54:23.013696', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '829480aa-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': 'bfabf0325a7c492246a9cf2ad9dcff4c7e4a37ac90908648fe6c514c1814fa83'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 
'827ad99f-45db-4d46-9b29-e7093f18eac8-sda', 'timestamp': '2026-01-21T23:54:23.013696', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82948cb2-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': 'a36c5cd4b826097cb9c541f46d5699a6716a969db821cbd665934ae7ef9eef7e'}]}, 'timestamp': '2026-01-21 23:54:23.014351', '_unique_id': '2043663f62bc48749f839d46b0335ea7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.015 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.016 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.016 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68ddea09-ca17-4c40-bbf1-97c6b8729c92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-vda', 'timestamp': '2026-01-21T23:54:23.016440', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8294ea40-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.648261327, 'message_signature': '1f9d3e583959bc0100269d90dac9305539e7190307f90b74d50fdcf86efa8024'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 
'827ad99f-45db-4d46-9b29-e7093f18eac8-sda', 'timestamp': '2026-01-21T23:54:23.016440', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8294f6a2-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.648261327, 'message_signature': '8f6e9495e47a3da5eedc99ee88a749b132f9e860a3766328177c8a0f104fd9eb'}]}, 'timestamp': '2026-01-21 23:54:23.017087', '_unique_id': '6ee0e9946ed14057a1b1c34688effe3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.017 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.018 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.018 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '656234f8-4694-4eac-aa11-d652f2422038', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000034-827ad99f-45db-4d46-9b29-e7093f18eac8-tap77c99e81-89', 'timestamp': '2026-01-21T23:54:23.018872', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'tap77c99e81-89', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:70:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77c99e81-89'}, 'message_id': '82955246-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.57659638, 'message_signature': '0e6af2137186e3985f5ccb2cdaab74996e97b4d4cf968f7c08b530440fffc250'}]}, 'timestamp': '2026-01-21 23:54:23.019505', '_unique_id': '26ae41deccc24341943c7af1f73aa2ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.020 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.021 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.021 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.021 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa2cf3d7-c1f5-4507-a498-d2e1cdaa18e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-vda', 'timestamp': '2026-01-21T23:54:23.021376', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8295ab6a-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': 'ae68fa5fb692406c73ac36db94e997f54eaa393050dc26576903632abedf5cc1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 
'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-sda', 'timestamp': '2026-01-21T23:54:23.021376', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8295b682-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': '2122af2765b2782710864672c9a084bfff5fa547fbde6a5bb449120906ecab7a'}]}, 'timestamp': '2026-01-21 23:54:23.022005', '_unique_id': 'd520280322bb49e194deaa24453e770c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.022 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.023 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.023 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.023 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22ed7d5a-67a4-46f9-84a7-d22be221c97a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-vda', 'timestamp': '2026-01-21T23:54:23.023579', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82960100-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': 'd6b87901a9f1b8c787e6e4f3b9ebb5f393ce417a2d2e5315d1b0331f9ff23960'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 
'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-sda', 'timestamp': '2026-01-21T23:54:23.023579', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82960f24-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': '6bef4f5631b2575c83438e75a96e7c170492fd5eedaea05ad7fdce9c8ee705f7'}]}, 'timestamp': '2026-01-21 23:54:23.024294', '_unique_id': 'b3f8bf3e6a9f457fb4ad9d87e6ee5871'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.025 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.026 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.026 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3b763c2-8c2f-43dc-bc71-46b4c51e98e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000034-827ad99f-45db-4d46-9b29-e7093f18eac8-tap77c99e81-89', 'timestamp': '2026-01-21T23:54:23.026299', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'tap77c99e81-89', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:70:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77c99e81-89'}, 'message_id': '82966d2a-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.57659638, 'message_signature': 'efe625985ab9bf3b9d2a6719d0ee97555262f228989b288a996f3f9e939beb00'}]}, 'timestamp': '2026-01-21 23:54:23.026723', '_unique_id': 'efc23078fce94f188e77592901e6dc76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.028 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.028 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f12b919f-304b-4d35-817c-816cf6e5fe41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-vda', 'timestamp': '2026-01-21T23:54:23.028464', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8296bffa-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': '27e76507e968a4a11a0f4da3031ca84e57ac47522e7d39f8bc5c105903a403a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 
'827ad99f-45db-4d46-9b29-e7093f18eac8-sda', 'timestamp': '2026-01-21T23:54:23.028464', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8296cc84-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': 'f1207c73086326ffb4d54f1f15ead95a9175a1c8f38e92cc218da2231088785a'}]}, 'timestamp': '2026-01-21 23:54:23.029092', '_unique_id': '1d46c1c87db0457181f1cb23e1de9b5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.029 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.030 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.030 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79925e7d-17ae-4265-b96e-59a68789c5ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000034-827ad99f-45db-4d46-9b29-e7093f18eac8-tap77c99e81-89', 'timestamp': '2026-01-21T23:54:23.030751', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'tap77c99e81-89', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:70:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77c99e81-89'}, 'message_id': '82971bb2-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.57659638, 'message_signature': 'b37ae093f554d109918bc95ce87122637fc84fdac40d029540b70fc531db4419'}]}, 'timestamp': '2026-01-21 23:54:23.031177', '_unique_id': '263441dae8ef492381e92a5bf91ff963'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.032 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.033 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.033 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-347423779>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-347423779>]
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.033 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.033 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49065693-5aa1-4f10-9fa4-1b65c0b6b385', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000034-827ad99f-45db-4d46-9b29-e7093f18eac8-tap77c99e81-89', 'timestamp': '2026-01-21T23:54:23.033479', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'tap77c99e81-89', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:70:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77c99e81-89'}, 'message_id': '8297839a-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.57659638, 'message_signature': 'c8d4a35c0383d79593c3714fb47216f433b54d3761d2c56eb133a8dfa47e5967'}]}, 'timestamp': '2026-01-21 23:54:23.033797', '_unique_id': '1d7e0c996d254aab89017fce698ca36e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.034 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.035 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.035 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89835f02-886a-4a42-8170-c05539f8e5d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000034-827ad99f-45db-4d46-9b29-e7093f18eac8-tap77c99e81-89', 'timestamp': '2026-01-21T23:54:23.035316', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'tap77c99e81-89', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:70:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77c99e81-89'}, 'message_id': '8297cb5c-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.57659638, 'message_signature': '106133a306f1ab6a57e945cd97d66e967c66325a2cab32c922160be23ee0091d'}]}, 'timestamp': '2026-01-21 23:54:23.035631', '_unique_id': 'ad1c9db24d534117a4f861b4f328a9d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.036 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.037 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.037 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/network.incoming.packets volume: 2 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3bd22ae-34cb-4b7d-8008-b860caabf121', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 2, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000034-827ad99f-45db-4d46-9b29-e7093f18eac8-tap77c99e81-89', 'timestamp': '2026-01-21T23:54:23.037630', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'tap77c99e81-89', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:70:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77c99e81-89'}, 'message_id': '82982606-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.57659638, 'message_signature': '15437cfc82d30dd96b315130d5283133574090936ae73ebf72d85d6059ed574d'}]}, 'timestamp': '2026-01-21 23:54:23.038075', '_unique_id': 'fcedda32f2374d76adfbde493c2f684a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.038 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.039 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.039 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.040 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a2428a4-7d2f-45b8-9585-d5c21b191793', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-vda', 'timestamp': '2026-01-21T23:54:23.039762', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82987bec-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.648261327, 'message_signature': '558d2ab10e157d6439729478833b4819f1df088c9ef0201beea0c489e9c1ef24'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-sda', 'timestamp': '2026-01-21T23:54:23.039762', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8298893e-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.648261327, 'message_signature': 'cc0ba1fcf385ed320d6b530f7fc0711121067bff990d69ae723c864d66e6882e'}]}, 'timestamp': '2026-01-21 23:54:23.040541', '_unique_id': '2feb7bf639bb4ef4b896d3acb0559068'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.041 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.042 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.042 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/network.incoming.bytes volume: 176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3053f449-4fc8-46a2-b47b-f5621e7c66bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 176, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000034-827ad99f-45db-4d46-9b29-e7093f18eac8-tap77c99e81-89', 'timestamp': '2026-01-21T23:54:23.042244', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'tap77c99e81-89', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:70:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77c99e81-89'}, 'message_id': '8298da1a-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.57659638, 'message_signature': 'b6d2ed378047e255b135c0df04d7eece9457eb8c9dd051d6412fe3f005e9fd4d'}]}, 'timestamp': '2026-01-21 23:54:23.042562', '_unique_id': 'e0cbeb629a9e46e2ac9cf052cf4bacf8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.043 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.044 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.044 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.044 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5094a639-b49c-4980-a889-37f67de349d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-vda', 'timestamp': '2026-01-21T23:54:23.044456', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '829932a8-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': '17020ad308684ad506a9cdb552fd1cc4eab2001c357a460aecf5f9205a516b41'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '827ad99f-45db-4d46-9b29-e7093f18eac8-sda', 'timestamp': '2026-01-21T23:54:23.044456', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'instance-00000034', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82994004-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.610235556, 'message_signature': '84b8614086c8c6b9df711b8d85fb5e2591903636aa6de22d8e7e5e419100b6d8'}]}, 'timestamp': '2026-01-21 23:54:23.045156', '_unique_id': 'ee556e7ed5c44bd9acf13a419b830d9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.045 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.046 12 DEBUG ceilometer.compute.pollsters [-] 827ad99f-45db-4d46-9b29-e7093f18eac8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87b24beb-41b6-4d37-a5b4-73f860be805c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000034-827ad99f-45db-4d46-9b29-e7093f18eac8-tap77c99e81-89', 'timestamp': '2026-01-21T23:54:23.046702', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-347423779', 'name': 'tap77c99e81-89', 'instance_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:70:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77c99e81-89'}, 'message_id': '82998956-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4235.57659638, 'message_signature': '8c815d006cb953c7d4d9447c9dcd5821335c6feecd0a53b145833f05acb7b469'}]}, 'timestamp': '2026-01-21 23:54:23.047054', '_unique_id': '1d0ac868f7344d8691329f4f4738af54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.047 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.048 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:54:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:54:23.048 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-347423779>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-347423779>]
Jan 21 23:54:23 compute-1 nova_compute[182713]: 2026-01-21 23:54:23.076 182717 DEBUG nova.privsep.utils [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:54:23 compute-1 nova_compute[182713]: 2026-01-21 23:54:23.077 182717 DEBUG oslo_concurrency.processutils [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpt0sthgu2/11884c1b149c4281b901fc135e67fe2c.delta /var/lib/nova/instances/snapshots/tmpt0sthgu2/11884c1b149c4281b901fc135e67fe2c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:23 compute-1 nova_compute[182713]: 2026-01-21 23:54:23.261 182717 DEBUG oslo_concurrency.processutils [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpt0sthgu2/11884c1b149c4281b901fc135e67fe2c.delta /var/lib/nova/instances/snapshots/tmpt0sthgu2/11884c1b149c4281b901fc135e67fe2c" returned: 0 in 0.184s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:23 compute-1 nova_compute[182713]: 2026-01-21 23:54:23.263 182717 INFO nova.virt.libvirt.driver [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Snapshot extracted, beginning image upload
Jan 21 23:54:23 compute-1 nova_compute[182713]: 2026-01-21 23:54:23.870 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:23 compute-1 nova_compute[182713]: 2026-01-21 23:54:23.871 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:25 compute-1 nova_compute[182713]: 2026-01-21 23:54:25.703 182717 INFO nova.virt.libvirt.driver [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Snapshot image upload complete
Jan 21 23:54:25 compute-1 nova_compute[182713]: 2026-01-21 23:54:25.705 182717 INFO nova.compute.manager [None req-5a83e2e7-06ae-4bc8-83a0-3c0f2682d24b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Took 3.92 seconds to snapshot the instance on the hypervisor.
Jan 21 23:54:25 compute-1 nova_compute[182713]: 2026-01-21 23:54:25.802 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:25 compute-1 nova_compute[182713]: 2026-01-21 23:54:25.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:26 compute-1 nova_compute[182713]: 2026-01-21 23:54:26.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:27 compute-1 nova_compute[182713]: 2026-01-21 23:54:27.825 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:27 compute-1 nova_compute[182713]: 2026-01-21 23:54:27.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:28 compute-1 nova_compute[182713]: 2026-01-21 23:54:28.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:28 compute-1 nova_compute[182713]: 2026-01-21 23:54:28.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:28 compute-1 nova_compute[182713]: 2026-01-21 23:54:28.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:28 compute-1 nova_compute[182713]: 2026-01-21 23:54:28.894 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:28 compute-1 nova_compute[182713]: 2026-01-21 23:54:28.894 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:28 compute-1 nova_compute[182713]: 2026-01-21 23:54:28.894 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.012 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.067 182717 DEBUG oslo_concurrency.lockutils [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "827ad99f-45db-4d46-9b29-e7093f18eac8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.068 182717 DEBUG oslo_concurrency.lockutils [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.068 182717 DEBUG oslo_concurrency.lockutils [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.069 182717 DEBUG oslo_concurrency.lockutils [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.069 182717 DEBUG oslo_concurrency.lockutils [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.079 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.080 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.095 182717 INFO nova.compute.manager [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Terminating instance
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.110 182717 DEBUG nova.compute.manager [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:54:29 compute-1 kernel: tap77c99e81-89 (unregistering): left promiscuous mode
Jan 21 23:54:29 compute-1 NetworkManager[54952]: <info>  [1769039669.1321] device (tap77c99e81-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.136 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:29 compute-1 ovn_controller[94841]: 2026-01-21T23:54:29Z|00133|binding|INFO|Releasing lport 77c99e81-8985-4b57-a59c-7cf8f9c949da from this chassis (sb_readonly=0)
Jan 21 23:54:29 compute-1 ovn_controller[94841]: 2026-01-21T23:54:29Z|00134|binding|INFO|Setting lport 77c99e81-8985-4b57-a59c-7cf8f9c949da down in Southbound
Jan 21 23:54:29 compute-1 ovn_controller[94841]: 2026-01-21T23:54:29Z|00135|binding|INFO|Removing iface tap77c99e81-89 ovn-installed in OVS
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.142 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.162 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:70:85 10.100.0.5'], port_security=['fa:16:3e:ae:70:85 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '827ad99f-45db-4d46-9b29-e7093f18eac8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=77c99e81-8985-4b57-a59c-7cf8f9c949da) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.164 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 77c99e81-8985-4b57-a59c-7cf8f9c949da in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c unbound from our chassis
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.167 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.169 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b34670d3-c7d8-4a32-b26e-3de075e6e8e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.170 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace which is not needed anymore
Jan 21 23:54:29 compute-1 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000034.scope: Deactivated successfully.
Jan 21 23:54:29 compute-1 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000034.scope: Consumed 4.540s CPU time.
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.182 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:29 compute-1 systemd-machined[153970]: Machine qemu-23-instance-00000034 terminated.
Jan 21 23:54:29 compute-1 podman[217837]: 2026-01-21 23:54:29.227319756 +0000 UTC m=+0.065979923 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:54:29 compute-1 podman[217836]: 2026-01-21 23:54:29.268606238 +0000 UTC m=+0.108014148 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 23:54:29 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[217743]: [NOTICE]   (217747) : haproxy version is 2.8.14-c23fe91
Jan 21 23:54:29 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[217743]: [NOTICE]   (217747) : path to executable is /usr/sbin/haproxy
Jan 21 23:54:29 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[217743]: [WARNING]  (217747) : Exiting Master process...
Jan 21 23:54:29 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[217743]: [ALERT]    (217747) : Current worker (217749) exited with code 143 (Terminated)
Jan 21 23:54:29 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[217743]: [WARNING]  (217747) : All workers exited. Exiting... (0)
Jan 21 23:54:29 compute-1 systemd[1]: libpod-b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6.scope: Deactivated successfully.
Jan 21 23:54:29 compute-1 podman[217899]: 2026-01-21 23:54:29.308767604 +0000 UTC m=+0.050434654 container died b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true)
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.331 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.335 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:29 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6-userdata-shm.mount: Deactivated successfully.
Jan 21 23:54:29 compute-1 systemd[1]: var-lib-containers-storage-overlay-14eb5c6b1ed2ce941f824044e0add73804e50c04d46961a057fb0330250e5a6c-merged.mount: Deactivated successfully.
Jan 21 23:54:29 compute-1 podman[217899]: 2026-01-21 23:54:29.356941348 +0000 UTC m=+0.098608388 container cleanup b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 23:54:29 compute-1 systemd[1]: libpod-conmon-b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6.scope: Deactivated successfully.
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.379 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.380 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5599MB free_disk=73.30353546142578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.380 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.380 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.384 182717 INFO nova.virt.libvirt.driver [-] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Instance destroyed successfully.
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.384 182717 DEBUG nova.objects.instance [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'resources' on Instance uuid 827ad99f-45db-4d46-9b29-e7093f18eac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.399 182717 DEBUG nova.virt.libvirt.vif [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:53:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-347423779',display_name='tempest-ImagesTestJSON-server-347423779',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-347423779',id=52,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:54:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-9d5rx5u8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:54:25Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=827ad99f-45db-4d46-9b29-e7093f18eac8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "address": "fa:16:3e:ae:70:85", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77c99e81-89", "ovs_interfaceid": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.399 182717 DEBUG nova.network.os_vif_util [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "address": "fa:16:3e:ae:70:85", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77c99e81-89", "ovs_interfaceid": "77c99e81-8985-4b57-a59c-7cf8f9c949da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.401 182717 DEBUG nova.network.os_vif_util [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:70:85,bridge_name='br-int',has_traffic_filtering=True,id=77c99e81-8985-4b57-a59c-7cf8f9c949da,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77c99e81-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.401 182717 DEBUG os_vif [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:70:85,bridge_name='br-int',has_traffic_filtering=True,id=77c99e81-8985-4b57-a59c-7cf8f9c949da,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77c99e81-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.404 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.404 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77c99e81-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.406 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.407 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.411 182717 INFO os_vif [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:70:85,bridge_name='br-int',has_traffic_filtering=True,id=77c99e81-8985-4b57-a59c-7cf8f9c949da,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77c99e81-89')
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.411 182717 INFO nova.virt.libvirt.driver [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Deleting instance files /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8_del
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.412 182717 INFO nova.virt.libvirt.driver [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Deletion of /var/lib/nova/instances/827ad99f-45db-4d46-9b29-e7093f18eac8_del complete
Jan 21 23:54:29 compute-1 podman[217943]: 2026-01-21 23:54:29.431648598 +0000 UTC m=+0.051215677 container remove b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.437 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6ee6b902-a45a-4b52-8b3e-6d2495614232]: (4, ('Wed Jan 21 11:54:29 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6)\nb79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6\nWed Jan 21 11:54:29 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (b79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6)\nb79e3ab3d036a5b4023e3264ca6c2fc65a68d4bfc6d1b66240123072df91e8f6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.439 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3db2908c-9573-4631-a0ae-f78f344e2942]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.439 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.441 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:29 compute-1 kernel: tap74e2da48-40: left promiscuous mode
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.470 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 827ad99f-45db-4d46-9b29-e7093f18eac8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.471 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.471 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.492 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.496 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[67734e8b-71a6-49f6-994b-c4fe35a14a47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.499 182717 INFO nova.compute.manager [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.499 182717 DEBUG oslo.service.loopingcall [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.500 182717 DEBUG nova.compute.manager [-] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.500 182717 DEBUG nova.network.neutron [-] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.515 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fc24ee32-a5e5-4534-a93f-a7692a432def]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.517 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0683e7f6-daa3-4f1e-bdaa-fabaeacf3bac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.533 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a82e266f-22f4-46d2-aa37-c2a9eb42876c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422263, 'reachable_time': 29932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217958, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:29 compute-1 systemd[1]: run-netns-ovnmeta\x2d74e2da48\x2d44c2\x2d4c6d\x2d9597\x2d6c47d6247f9c.mount: Deactivated successfully.
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.537 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:54:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:29.537 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[3edb8e29-2a9e-49d2-bc6c-5ce002ddabab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:29 compute-1 nova_compute[182713]: 2026-01-21 23:54:29.582 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:54:30 compute-1 nova_compute[182713]: 2026-01-21 23:54:30.837 182717 DEBUG nova.compute.manager [req-8c75d08f-17a1-498c-bc38-e8cdb8325fba req-25a775f5-0b9c-4826-ae53-6f8a832ba027 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Received event network-vif-unplugged-77c99e81-8985-4b57-a59c-7cf8f9c949da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:30 compute-1 nova_compute[182713]: 2026-01-21 23:54:30.838 182717 DEBUG oslo_concurrency.lockutils [req-8c75d08f-17a1-498c-bc38-e8cdb8325fba req-25a775f5-0b9c-4826-ae53-6f8a832ba027 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:30 compute-1 nova_compute[182713]: 2026-01-21 23:54:30.838 182717 DEBUG oslo_concurrency.lockutils [req-8c75d08f-17a1-498c-bc38-e8cdb8325fba req-25a775f5-0b9c-4826-ae53-6f8a832ba027 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:30 compute-1 nova_compute[182713]: 2026-01-21 23:54:30.839 182717 DEBUG oslo_concurrency.lockutils [req-8c75d08f-17a1-498c-bc38-e8cdb8325fba req-25a775f5-0b9c-4826-ae53-6f8a832ba027 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:30 compute-1 nova_compute[182713]: 2026-01-21 23:54:30.839 182717 DEBUG nova.compute.manager [req-8c75d08f-17a1-498c-bc38-e8cdb8325fba req-25a775f5-0b9c-4826-ae53-6f8a832ba027 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] No waiting events found dispatching network-vif-unplugged-77c99e81-8985-4b57-a59c-7cf8f9c949da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:30 compute-1 nova_compute[182713]: 2026-01-21 23:54:30.839 182717 DEBUG nova.compute.manager [req-8c75d08f-17a1-498c-bc38-e8cdb8325fba req-25a775f5-0b9c-4826-ae53-6f8a832ba027 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Received event network-vif-unplugged-77c99e81-8985-4b57-a59c-7cf8f9c949da for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:54:30 compute-1 nova_compute[182713]: 2026-01-21 23:54:30.874 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:54:30 compute-1 nova_compute[182713]: 2026-01-21 23:54:30.920 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:54:30 compute-1 nova_compute[182713]: 2026-01-21 23:54:30.920 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.011 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.679 182717 DEBUG nova.network.neutron [-] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.695 182717 INFO nova.compute.manager [-] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Took 2.20 seconds to deallocate network for instance.
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.792 182717 DEBUG oslo_concurrency.lockutils [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.793 182717 DEBUG oslo_concurrency.lockutils [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.869 182717 DEBUG nova.compute.provider_tree [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.881 182717 DEBUG nova.compute.manager [req-2e4fcb11-415c-4020-86fb-cfe89c8b426b req-36ff4c8b-55ee-429d-a5f2-ff6e44e7e363 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Received event network-vif-deleted-77c99e81-8985-4b57-a59c-7cf8f9c949da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.889 182717 DEBUG nova.scheduler.client.report [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.912 182717 DEBUG oslo_concurrency.lockutils [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.920 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.920 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.920 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:54:31 compute-1 nova_compute[182713]: 2026-01-21 23:54:31.979 182717 INFO nova.scheduler.client.report [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Deleted allocations for instance 827ad99f-45db-4d46-9b29-e7093f18eac8
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.107 182717 DEBUG oslo_concurrency.lockutils [None req-994e1e0d-79b9-4d7c-9e90-396247411305 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.230 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-827ad99f-45db-4d46-9b29-e7093f18eac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.231 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-827ad99f-45db-4d46-9b29-e7093f18eac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.231 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.231 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 827ad99f-45db-4d46-9b29-e7093f18eac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.299 182717 DEBUG nova.compute.utils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Can not refresh info_cache because instance was not found refresh_info_cache_for_instance /usr/lib/python3.9/site-packages/nova/compute/utils.py:1010
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.506 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.984 182717 DEBUG nova.compute.manager [req-1eaa3001-48ee-463e-bf27-5cbb8cd09b6d req-6e67a8b2-5705-40b0-8ba3-0f881f9c93b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Received event network-vif-plugged-77c99e81-8985-4b57-a59c-7cf8f9c949da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.985 182717 DEBUG oslo_concurrency.lockutils [req-1eaa3001-48ee-463e-bf27-5cbb8cd09b6d req-6e67a8b2-5705-40b0-8ba3-0f881f9c93b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.985 182717 DEBUG oslo_concurrency.lockutils [req-1eaa3001-48ee-463e-bf27-5cbb8cd09b6d req-6e67a8b2-5705-40b0-8ba3-0f881f9c93b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.985 182717 DEBUG oslo_concurrency.lockutils [req-1eaa3001-48ee-463e-bf27-5cbb8cd09b6d req-6e67a8b2-5705-40b0-8ba3-0f881f9c93b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "827ad99f-45db-4d46-9b29-e7093f18eac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.985 182717 DEBUG nova.compute.manager [req-1eaa3001-48ee-463e-bf27-5cbb8cd09b6d req-6e67a8b2-5705-40b0-8ba3-0f881f9c93b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] No waiting events found dispatching network-vif-plugged-77c99e81-8985-4b57-a59c-7cf8f9c949da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:32 compute-1 nova_compute[182713]: 2026-01-21 23:54:32.986 182717 WARNING nova.compute.manager [req-1eaa3001-48ee-463e-bf27-5cbb8cd09b6d req-6e67a8b2-5705-40b0-8ba3-0f881f9c93b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Received unexpected event network-vif-plugged-77c99e81-8985-4b57-a59c-7cf8f9c949da for instance with vm_state deleted and task_state None.
Jan 21 23:54:33 compute-1 nova_compute[182713]: 2026-01-21 23:54:33.129 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:33 compute-1 nova_compute[182713]: 2026-01-21 23:54:33.184 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-827ad99f-45db-4d46-9b29-e7093f18eac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:54:33 compute-1 nova_compute[182713]: 2026-01-21 23:54:33.185 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:54:33 compute-1 nova_compute[182713]: 2026-01-21 23:54:33.750 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "d94f4b0c-b132-4eb6-9c92-6850506a821a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:33 compute-1 nova_compute[182713]: 2026-01-21 23:54:33.751 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:33 compute-1 nova_compute[182713]: 2026-01-21 23:54:33.775 182717 DEBUG nova.compute.manager [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:54:33 compute-1 nova_compute[182713]: 2026-01-21 23:54:33.910 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:33 compute-1 nova_compute[182713]: 2026-01-21 23:54:33.910 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:33 compute-1 nova_compute[182713]: 2026-01-21 23:54:33.920 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:54:33 compute-1 nova_compute[182713]: 2026-01-21 23:54:33.920 182717 INFO nova.compute.claims [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:54:34 compute-1 nova_compute[182713]: 2026-01-21 23:54:34.072 182717 DEBUG nova.compute.provider_tree [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:54:34 compute-1 nova_compute[182713]: 2026-01-21 23:54:34.104 182717 DEBUG nova.scheduler.client.report [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:54:34 compute-1 nova_compute[182713]: 2026-01-21 23:54:34.126 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:34 compute-1 nova_compute[182713]: 2026-01-21 23:54:34.127 182717 DEBUG nova.compute.manager [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:54:34 compute-1 nova_compute[182713]: 2026-01-21 23:54:34.188 182717 DEBUG nova.compute.manager [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:54:34 compute-1 nova_compute[182713]: 2026-01-21 23:54:34.189 182717 DEBUG nova.network.neutron [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:54:34 compute-1 nova_compute[182713]: 2026-01-21 23:54:34.218 182717 INFO nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:54:34 compute-1 nova_compute[182713]: 2026-01-21 23:54:34.281 182717 DEBUG nova.compute.manager [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:54:34 compute-1 nova_compute[182713]: 2026-01-21 23:54:34.443 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:36 compute-1 nova_compute[182713]: 2026-01-21 23:54:36.014 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:36 compute-1 podman[217959]: 2026-01-21 23:54:36.615273222 +0000 UTC m=+0.097596858 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 23:54:36 compute-1 podman[217960]: 2026-01-21 23:54:36.643263644 +0000 UTC m=+0.119519793 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.847 182717 DEBUG nova.compute.manager [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.848 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.849 182717 INFO nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Creating image(s)
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.850 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "/var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.850 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.852 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.877 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.933 182717 DEBUG nova.policy [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.969 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.971 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.972 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:37 compute-1 nova_compute[182713]: 2026-01-21 23:54:37.996 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.083 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.085 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.123 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.125 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.126 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.184 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.186 182717 DEBUG nova.virt.disk.api [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Checking if we can resize image /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.187 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.243 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.245 182717 DEBUG nova.virt.disk.api [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Cannot resize image /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.245 182717 DEBUG nova.objects.instance [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'migration_context' on Instance uuid d94f4b0c-b132-4eb6-9c92-6850506a821a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.263 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.264 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Ensure instance console log exists: /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.265 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.265 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:38 compute-1 nova_compute[182713]: 2026-01-21 23:54:38.266 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:39 compute-1 nova_compute[182713]: 2026-01-21 23:54:39.089 182717 DEBUG nova.network.neutron [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Successfully created port: fc6ae600-05c3-478d-9bbd-b0041c1043c1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:54:39 compute-1 nova_compute[182713]: 2026-01-21 23:54:39.446 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-1 nova_compute[182713]: 2026-01-21 23:54:40.508 182717 DEBUG nova.network.neutron [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Successfully updated port: fc6ae600-05c3-478d-9bbd-b0041c1043c1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:54:40 compute-1 nova_compute[182713]: 2026-01-21 23:54:40.532 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "refresh_cache-d94f4b0c-b132-4eb6-9c92-6850506a821a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:54:40 compute-1 nova_compute[182713]: 2026-01-21 23:54:40.532 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquired lock "refresh_cache-d94f4b0c-b132-4eb6-9c92-6850506a821a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:54:40 compute-1 nova_compute[182713]: 2026-01-21 23:54:40.532 182717 DEBUG nova.network.neutron [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:54:40 compute-1 nova_compute[182713]: 2026-01-21 23:54:40.638 182717 DEBUG nova.compute.manager [req-6aa39ad1-a473-4de7-8699-d0b7ef16ff13 req-a008ec75-a4c6-4611-8c0b-dd35e92bb2bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Received event network-changed-fc6ae600-05c3-478d-9bbd-b0041c1043c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:40 compute-1 nova_compute[182713]: 2026-01-21 23:54:40.639 182717 DEBUG nova.compute.manager [req-6aa39ad1-a473-4de7-8699-d0b7ef16ff13 req-a008ec75-a4c6-4611-8c0b-dd35e92bb2bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Refreshing instance network info cache due to event network-changed-fc6ae600-05c3-478d-9bbd-b0041c1043c1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:54:40 compute-1 nova_compute[182713]: 2026-01-21 23:54:40.639 182717 DEBUG oslo_concurrency.lockutils [req-6aa39ad1-a473-4de7-8699-d0b7ef16ff13 req-a008ec75-a4c6-4611-8c0b-dd35e92bb2bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d94f4b0c-b132-4eb6-9c92-6850506a821a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:54:41 compute-1 nova_compute[182713]: 2026-01-21 23:54:41.015 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:41 compute-1 nova_compute[182713]: 2026-01-21 23:54:41.204 182717 DEBUG nova.network.neutron [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.674 182717 DEBUG nova.network.neutron [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Updating instance_info_cache with network_info: [{"id": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "address": "fa:16:3e:6e:94:fb", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6ae600-05", "ovs_interfaceid": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.703 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Releasing lock "refresh_cache-d94f4b0c-b132-4eb6-9c92-6850506a821a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.704 182717 DEBUG nova.compute.manager [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Instance network_info: |[{"id": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "address": "fa:16:3e:6e:94:fb", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6ae600-05", "ovs_interfaceid": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.704 182717 DEBUG oslo_concurrency.lockutils [req-6aa39ad1-a473-4de7-8699-d0b7ef16ff13 req-a008ec75-a4c6-4611-8c0b-dd35e92bb2bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d94f4b0c-b132-4eb6-9c92-6850506a821a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.705 182717 DEBUG nova.network.neutron [req-6aa39ad1-a473-4de7-8699-d0b7ef16ff13 req-a008ec75-a4c6-4611-8c0b-dd35e92bb2bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Refreshing network info cache for port fc6ae600-05c3-478d-9bbd-b0041c1043c1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.710 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Start _get_guest_xml network_info=[{"id": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "address": "fa:16:3e:6e:94:fb", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6ae600-05", "ovs_interfaceid": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.716 182717 WARNING nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.722 182717 DEBUG nova.virt.libvirt.host [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.723 182717 DEBUG nova.virt.libvirt.host [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.732 182717 DEBUG nova.virt.libvirt.host [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.733 182717 DEBUG nova.virt.libvirt.host [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.735 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.735 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.736 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.737 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.737 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.738 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.738 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.739 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.739 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.740 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.740 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.741 182717 DEBUG nova.virt.hardware [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.747 182717 DEBUG nova.virt.libvirt.vif [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:54:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-421695372',display_name='tempest-ImagesTestJSON-server-421695372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-421695372',id=53,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-afkdxekb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=TagList
,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:54:34Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=d94f4b0c-b132-4eb6-9c92-6850506a821a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "address": "fa:16:3e:6e:94:fb", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6ae600-05", "ovs_interfaceid": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.748 182717 DEBUG nova.network.os_vif_util [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "address": "fa:16:3e:6e:94:fb", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6ae600-05", "ovs_interfaceid": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.749 182717 DEBUG nova.network.os_vif_util [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:94:fb,bridge_name='br-int',has_traffic_filtering=True,id=fc6ae600-05c3-478d-9bbd-b0041c1043c1,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6ae600-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.750 182717 DEBUG nova.objects.instance [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'pci_devices' on Instance uuid d94f4b0c-b132-4eb6-9c92-6850506a821a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.781 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:54:42 compute-1 nova_compute[182713]:   <uuid>d94f4b0c-b132-4eb6-9c92-6850506a821a</uuid>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   <name>instance-00000035</name>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <nova:name>tempest-ImagesTestJSON-server-421695372</nova:name>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:54:42</nova:creationTime>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:54:42 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:54:42 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:54:42 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:54:42 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:54:42 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:54:42 compute-1 nova_compute[182713]:         <nova:user uuid="6eb1bcf645844eaca088761a04e59542">tempest-ImagesTestJSON-126431515-project-member</nova:user>
Jan 21 23:54:42 compute-1 nova_compute[182713]:         <nova:project uuid="63e5713bcd4c429796b251487b6136bc">tempest-ImagesTestJSON-126431515</nova:project>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:54:42 compute-1 nova_compute[182713]:         <nova:port uuid="fc6ae600-05c3-478d-9bbd-b0041c1043c1">
Jan 21 23:54:42 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <system>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <entry name="serial">d94f4b0c-b132-4eb6-9c92-6850506a821a</entry>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <entry name="uuid">d94f4b0c-b132-4eb6-9c92-6850506a821a</entry>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     </system>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   <os>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   </os>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   <features>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   </features>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk.config"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:6e:94:fb"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <target dev="tapfc6ae600-05"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/console.log" append="off"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <video>
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     </video>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:54:42 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:54:42 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:54:42 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:54:42 compute-1 nova_compute[182713]: </domain>
Jan 21 23:54:42 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.783 182717 DEBUG nova.compute.manager [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Preparing to wait for external event network-vif-plugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.783 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.784 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.784 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.785 182717 DEBUG nova.virt.libvirt.vif [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:54:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-421695372',display_name='tempest-ImagesTestJSON-server-421695372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-421695372',id=53,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-afkdxekb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:54:34Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=d94f4b0c-b132-4eb6-9c92-6850506a821a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "address": "fa:16:3e:6e:94:fb", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6ae600-05", "ovs_interfaceid": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.785 182717 DEBUG nova.network.os_vif_util [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "address": "fa:16:3e:6e:94:fb", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6ae600-05", "ovs_interfaceid": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.786 182717 DEBUG nova.network.os_vif_util [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:94:fb,bridge_name='br-int',has_traffic_filtering=True,id=fc6ae600-05c3-478d-9bbd-b0041c1043c1,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6ae600-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.786 182717 DEBUG os_vif [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:94:fb,bridge_name='br-int',has_traffic_filtering=True,id=fc6ae600-05c3-478d-9bbd-b0041c1043c1,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6ae600-05') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.787 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.787 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.788 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.792 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.792 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc6ae600-05, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.793 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfc6ae600-05, col_values=(('external_ids', {'iface-id': 'fc6ae600-05c3-478d-9bbd-b0041c1043c1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6e:94:fb', 'vm-uuid': 'd94f4b0c-b132-4eb6-9c92-6850506a821a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.794 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:42 compute-1 NetworkManager[54952]: <info>  [1769039682.7961] manager: (tapfc6ae600-05): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.797 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.801 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.802 182717 INFO os_vif [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:94:fb,bridge_name='br-int',has_traffic_filtering=True,id=fc6ae600-05c3-478d-9bbd-b0041c1043c1,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6ae600-05')
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.888 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.889 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.889 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No VIF found with MAC fa:16:3e:6e:94:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:54:42 compute-1 nova_compute[182713]: 2026-01-21 23:54:42.889 182717 INFO nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Using config drive
Jan 21 23:54:43 compute-1 nova_compute[182713]: 2026-01-21 23:54:43.342 182717 INFO nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Creating config drive at /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk.config
Jan 21 23:54:43 compute-1 nova_compute[182713]: 2026-01-21 23:54:43.353 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuovyyrht execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:43 compute-1 nova_compute[182713]: 2026-01-21 23:54:43.486 182717 DEBUG oslo_concurrency.processutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuovyyrht" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:43 compute-1 kernel: tapfc6ae600-05: entered promiscuous mode
Jan 21 23:54:43 compute-1 NetworkManager[54952]: <info>  [1769039683.5578] manager: (tapfc6ae600-05): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Jan 21 23:54:43 compute-1 nova_compute[182713]: 2026-01-21 23:54:43.559 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:43 compute-1 ovn_controller[94841]: 2026-01-21T23:54:43Z|00136|binding|INFO|Claiming lport fc6ae600-05c3-478d-9bbd-b0041c1043c1 for this chassis.
Jan 21 23:54:43 compute-1 ovn_controller[94841]: 2026-01-21T23:54:43Z|00137|binding|INFO|fc6ae600-05c3-478d-9bbd-b0041c1043c1: Claiming fa:16:3e:6e:94:fb 10.100.0.10
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.579 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:94:fb 10.100.0.10'], port_security=['fa:16:3e:6e:94:fb 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd94f4b0c-b132-4eb6-9c92-6850506a821a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=fc6ae600-05c3-478d-9bbd-b0041c1043c1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.581 104184 INFO neutron.agent.ovn.metadata.agent [-] Port fc6ae600-05c3-478d-9bbd-b0041c1043c1 in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c bound to our chassis
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.585 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:54:43 compute-1 ovn_controller[94841]: 2026-01-21T23:54:43Z|00138|binding|INFO|Setting lport fc6ae600-05c3-478d-9bbd-b0041c1043c1 ovn-installed in OVS
Jan 21 23:54:43 compute-1 ovn_controller[94841]: 2026-01-21T23:54:43Z|00139|binding|INFO|Setting lport fc6ae600-05c3-478d-9bbd-b0041c1043c1 up in Southbound
Jan 21 23:54:43 compute-1 nova_compute[182713]: 2026-01-21 23:54:43.587 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:43 compute-1 nova_compute[182713]: 2026-01-21 23:54:43.589 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.599 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c8d85e55-c518-40a2-8cac-45df5c869211]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 systemd-udevd[218034]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.600 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap74e2da48-41 in ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.602 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap74e2da48-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.602 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[453146eb-8cf4-4aca-9b41-eea17f68615c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.603 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3365432d-ae10-466a-b2c8-e3f8c3aec498]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 NetworkManager[54952]: <info>  [1769039683.6212] device (tapfc6ae600-05): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:54:43 compute-1 NetworkManager[54952]: <info>  [1769039683.6224] device (tapfc6ae600-05): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:54:43 compute-1 systemd-machined[153970]: New machine qemu-24-instance-00000035.
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.624 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff93145-f87d-46b8-8b89-4670e5c59d62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 systemd[1]: Started Virtual Machine qemu-24-instance-00000035.
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.639 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5c37b057-79de-4ca3-838e-7dfd8da99187]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.679 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b172157c-51e8-4f80-9e22-999e5bcf7a47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.685 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[199f071d-48f9-4d6a-be49-d83c38d341ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 NetworkManager[54952]: <info>  [1769039683.6867] manager: (tap74e2da48-40): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.718 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0fa3960b-3395-4e21-89ce-1114507962bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.721 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[1b0d9f7e-b064-44b2-a6cc-c792a0d78a77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 NetworkManager[54952]: <info>  [1769039683.7405] device (tap74e2da48-40): carrier: link connected
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.745 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[2795f061-92ae-4a24-8663-33533e45b338]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.766 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b0cf9409-16c6-440a-afdc-d0ca06b996b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425639, 'reachable_time': 16230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218068, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.782 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[694f9810-5564-4457-913f-a88037ea0d73]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:7549'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425639, 'tstamp': 425639}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218069, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.803 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c66db4-344e-4289-ab9e-41131d074984]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425639, 'reachable_time': 16230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218070, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.839 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[21628696-e9d0-4574-8eb9-b666642ffb23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.921 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[20fe2e5f-4ad4-4ba8-be38-e7d4fe19de7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.923 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.923 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.924 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74e2da48-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:43 compute-1 NetworkManager[54952]: <info>  [1769039683.9275] manager: (tap74e2da48-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Jan 21 23:54:43 compute-1 kernel: tap74e2da48-40: entered promiscuous mode
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.931 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74e2da48-40, col_values=(('external_ids', {'iface-id': '5f8f321e-2942-4700-a50e-4b0628052c1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:43 compute-1 nova_compute[182713]: 2026-01-21 23:54:43.931 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:43 compute-1 ovn_controller[94841]: 2026-01-21T23:54:43Z|00140|binding|INFO|Releasing lport 5f8f321e-2942-4700-a50e-4b0628052c1b from this chassis (sb_readonly=0)
Jan 21 23:54:43 compute-1 nova_compute[182713]: 2026-01-21 23:54:43.956 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.957 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.958 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1cdbb7aa-9649-4061-8203-f3169a0cc476]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.960 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:54:43 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:43.961 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'env', 'PROCESS_TAG=haproxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/74e2da48-44c2-4c6d-9597-6c47d6247f9c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:54:43 compute-1 nova_compute[182713]: 2026-01-21 23:54:43.988 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039683.987296, d94f4b0c-b132-4eb6-9c92-6850506a821a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:43 compute-1 nova_compute[182713]: 2026-01-21 23:54:43.988 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] VM Started (Lifecycle Event)
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.020 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.025 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039683.9875462, d94f4b0c-b132-4eb6-9c92-6850506a821a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.026 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] VM Paused (Lifecycle Event)
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.047 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.052 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.087 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:54:44 compute-1 podman[218109]: 2026-01-21 23:54:44.374814609 +0000 UTC m=+0.077143507 container create 81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.384 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039669.3821864, 827ad99f-45db-4d46-9b29-e7093f18eac8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.385 182717 INFO nova.compute.manager [-] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] VM Stopped (Lifecycle Event)
Jan 21 23:54:44 compute-1 systemd[1]: Started libpod-conmon-81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965.scope.
Jan 21 23:54:44 compute-1 podman[218109]: 2026-01-21 23:54:44.335392555 +0000 UTC m=+0.037721513 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:54:44 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:54:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/251ed7aa053f47d46d85aa28b64a533ce2ea83007ce40cd825cc2c829d54c272/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:54:44 compute-1 podman[218109]: 2026-01-21 23:54:44.47520435 +0000 UTC m=+0.177533218 container init 81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 21 23:54:44 compute-1 podman[218109]: 2026-01-21 23:54:44.479811882 +0000 UTC m=+0.182140750 container start 81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 23:54:44 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218125]: [NOTICE]   (218129) : New worker (218131) forked
Jan 21 23:54:44 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218125]: [NOTICE]   (218129) : Loading success.
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.526 182717 DEBUG nova.compute.manager [None req-2879e26e-0fcd-45a7-a2ec-7161867693c1 - - - - - -] [instance: 827ad99f-45db-4d46-9b29-e7093f18eac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.875 182717 DEBUG nova.compute.manager [req-0ff08c90-cc7d-48ac-b16e-6a453f1a5c28 req-9ff03eb7-32f6-4b4e-b663-54367e621e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Received event network-vif-plugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.876 182717 DEBUG oslo_concurrency.lockutils [req-0ff08c90-cc7d-48ac-b16e-6a453f1a5c28 req-9ff03eb7-32f6-4b4e-b663-54367e621e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.877 182717 DEBUG oslo_concurrency.lockutils [req-0ff08c90-cc7d-48ac-b16e-6a453f1a5c28 req-9ff03eb7-32f6-4b4e-b663-54367e621e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.877 182717 DEBUG oslo_concurrency.lockutils [req-0ff08c90-cc7d-48ac-b16e-6a453f1a5c28 req-9ff03eb7-32f6-4b4e-b663-54367e621e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.878 182717 DEBUG nova.compute.manager [req-0ff08c90-cc7d-48ac-b16e-6a453f1a5c28 req-9ff03eb7-32f6-4b4e-b663-54367e621e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Processing event network-vif-plugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.880 182717 DEBUG nova.compute.manager [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.885 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039684.8850675, d94f4b0c-b132-4eb6-9c92-6850506a821a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.886 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] VM Resumed (Lifecycle Event)
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.889 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.895 182717 INFO nova.virt.libvirt.driver [-] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Instance spawned successfully.
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.896 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.916 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.928 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.935 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.936 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.937 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.938 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.939 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.940 182717 DEBUG nova.virt.libvirt.driver [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:44 compute-1 nova_compute[182713]: 2026-01-21 23:54:44.965 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:54:45 compute-1 nova_compute[182713]: 2026-01-21 23:54:45.007 182717 INFO nova.compute.manager [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Took 7.16 seconds to spawn the instance on the hypervisor.
Jan 21 23:54:45 compute-1 nova_compute[182713]: 2026-01-21 23:54:45.007 182717 DEBUG nova.compute.manager [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:45 compute-1 nova_compute[182713]: 2026-01-21 23:54:45.121 182717 INFO nova.compute.manager [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Took 11.28 seconds to build instance.
Jan 21 23:54:45 compute-1 nova_compute[182713]: 2026-01-21 23:54:45.141 182717 DEBUG oslo_concurrency.lockutils [None req-522dc215-0e71-4f1b-8681-5796cb1b4981 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:45 compute-1 nova_compute[182713]: 2026-01-21 23:54:45.281 182717 DEBUG nova.network.neutron [req-6aa39ad1-a473-4de7-8699-d0b7ef16ff13 req-a008ec75-a4c6-4611-8c0b-dd35e92bb2bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Updated VIF entry in instance network info cache for port fc6ae600-05c3-478d-9bbd-b0041c1043c1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:54:45 compute-1 nova_compute[182713]: 2026-01-21 23:54:45.282 182717 DEBUG nova.network.neutron [req-6aa39ad1-a473-4de7-8699-d0b7ef16ff13 req-a008ec75-a4c6-4611-8c0b-dd35e92bb2bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Updating instance_info_cache with network_info: [{"id": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "address": "fa:16:3e:6e:94:fb", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6ae600-05", "ovs_interfaceid": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:45 compute-1 nova_compute[182713]: 2026-01-21 23:54:45.307 182717 DEBUG oslo_concurrency.lockutils [req-6aa39ad1-a473-4de7-8699-d0b7ef16ff13 req-a008ec75-a4c6-4611-8c0b-dd35e92bb2bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d94f4b0c-b132-4eb6-9c92-6850506a821a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:54:46 compute-1 nova_compute[182713]: 2026-01-21 23:54:46.017 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.090 182717 DEBUG nova.compute.manager [req-4ecd54d1-183b-4cb6-91da-029394f42604 req-89d729d6-eae9-4908-8db0-b5fafca9195a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Received event network-vif-plugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.091 182717 DEBUG oslo_concurrency.lockutils [req-4ecd54d1-183b-4cb6-91da-029394f42604 req-89d729d6-eae9-4908-8db0-b5fafca9195a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.091 182717 DEBUG oslo_concurrency.lockutils [req-4ecd54d1-183b-4cb6-91da-029394f42604 req-89d729d6-eae9-4908-8db0-b5fafca9195a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.092 182717 DEBUG oslo_concurrency.lockutils [req-4ecd54d1-183b-4cb6-91da-029394f42604 req-89d729d6-eae9-4908-8db0-b5fafca9195a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.092 182717 DEBUG nova.compute.manager [req-4ecd54d1-183b-4cb6-91da-029394f42604 req-89d729d6-eae9-4908-8db0-b5fafca9195a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] No waiting events found dispatching network-vif-plugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.093 182717 WARNING nova.compute.manager [req-4ecd54d1-183b-4cb6-91da-029394f42604 req-89d729d6-eae9-4908-8db0-b5fafca9195a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Received unexpected event network-vif-plugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 for instance with vm_state active and task_state None.
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.541 182717 DEBUG oslo_concurrency.lockutils [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "d94f4b0c-b132-4eb6-9c92-6850506a821a" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.542 182717 DEBUG oslo_concurrency.lockutils [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.542 182717 DEBUG nova.compute.manager [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.546 182717 DEBUG nova.compute.manager [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.547 182717 DEBUG nova.objects.instance [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'flavor' on Instance uuid d94f4b0c-b132-4eb6-9c92-6850506a821a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.587 182717 DEBUG nova.objects.instance [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'info_cache' on Instance uuid d94f4b0c-b132-4eb6-9c92-6850506a821a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.629 182717 DEBUG nova.virt.libvirt.driver [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:54:47 compute-1 nova_compute[182713]: 2026-01-21 23:54:47.795 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:49 compute-1 podman[218140]: 2026-01-21 23:54:49.623461622 +0000 UTC m=+0.099966479 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 21 23:54:51 compute-1 nova_compute[182713]: 2026-01-21 23:54:51.019 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:52 compute-1 podman[218161]: 2026-01-21 23:54:52.592607148 +0000 UTC m=+0.077717004 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 23:54:52 compute-1 nova_compute[182713]: 2026-01-21 23:54:52.799 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:56 compute-1 nova_compute[182713]: 2026-01-21 23:54:56.020 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:57 compute-1 ovn_controller[94841]: 2026-01-21T23:54:57Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6e:94:fb 10.100.0.10
Jan 21 23:54:57 compute-1 ovn_controller[94841]: 2026-01-21T23:54:57Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6e:94:fb 10.100.0.10
Jan 21 23:54:57 compute-1 nova_compute[182713]: 2026-01-21 23:54:57.692 182717 DEBUG nova.virt.libvirt.driver [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 23:54:57 compute-1 nova_compute[182713]: 2026-01-21 23:54:57.804 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:59 compute-1 podman[218199]: 2026-01-21 23:54:59.590027356 +0000 UTC m=+0.075039242 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:54:59 compute-1 podman[218198]: 2026-01-21 23:54:59.632091242 +0000 UTC m=+0.122608097 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 21 23:54:59 compute-1 kernel: tapfc6ae600-05 (unregistering): left promiscuous mode
Jan 21 23:54:59 compute-1 NetworkManager[54952]: <info>  [1769039699.9398] device (tapfc6ae600-05): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:54:59 compute-1 ovn_controller[94841]: 2026-01-21T23:54:59Z|00141|binding|INFO|Releasing lport fc6ae600-05c3-478d-9bbd-b0041c1043c1 from this chassis (sb_readonly=0)
Jan 21 23:54:59 compute-1 ovn_controller[94841]: 2026-01-21T23:54:59Z|00142|binding|INFO|Setting lport fc6ae600-05c3-478d-9bbd-b0041c1043c1 down in Southbound
Jan 21 23:54:59 compute-1 nova_compute[182713]: 2026-01-21 23:54:59.950 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:59 compute-1 ovn_controller[94841]: 2026-01-21T23:54:59Z|00143|binding|INFO|Removing iface tapfc6ae600-05 ovn-installed in OVS
Jan 21 23:54:59 compute-1 nova_compute[182713]: 2026-01-21 23:54:59.956 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:59.961 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:94:fb 10.100.0.10'], port_security=['fa:16:3e:6e:94:fb 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd94f4b0c-b132-4eb6-9c92-6850506a821a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=fc6ae600-05c3-478d-9bbd-b0041c1043c1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:54:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:59.963 104184 INFO neutron.agent.ovn.metadata.agent [-] Port fc6ae600-05c3-478d-9bbd-b0041c1043c1 in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c unbound from our chassis
Jan 21 23:54:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:59.966 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:54:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:59.968 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6b31a6d4-e138-41cc-a7b8-31973ccf308c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:54:59.969 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace which is not needed anymore
Jan 21 23:54:59 compute-1 nova_compute[182713]: 2026-01-21 23:54:59.977 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:59 compute-1 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000035.scope: Deactivated successfully.
Jan 21 23:54:59 compute-1 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000035.scope: Consumed 12.198s CPU time.
Jan 21 23:55:00 compute-1 systemd-machined[153970]: Machine qemu-24-instance-00000035 terminated.
Jan 21 23:55:00 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218125]: [NOTICE]   (218129) : haproxy version is 2.8.14-c23fe91
Jan 21 23:55:00 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218125]: [NOTICE]   (218129) : path to executable is /usr/sbin/haproxy
Jan 21 23:55:00 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218125]: [WARNING]  (218129) : Exiting Master process...
Jan 21 23:55:00 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218125]: [ALERT]    (218129) : Current worker (218131) exited with code 143 (Terminated)
Jan 21 23:55:00 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218125]: [WARNING]  (218129) : All workers exited. Exiting... (0)
Jan 21 23:55:00 compute-1 systemd[1]: libpod-81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965.scope: Deactivated successfully.
Jan 21 23:55:00 compute-1 podman[218276]: 2026-01-21 23:55:00.119230434 +0000 UTC m=+0.047894366 container died 81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 21 23:55:00 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965-userdata-shm.mount: Deactivated successfully.
Jan 21 23:55:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-251ed7aa053f47d46d85aa28b64a533ce2ea83007ce40cd825cc2c829d54c272-merged.mount: Deactivated successfully.
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.169 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.174 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:00 compute-1 podman[218276]: 2026-01-21 23:55:00.196691439 +0000 UTC m=+0.125355371 container cleanup 81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:55:00 compute-1 systemd[1]: libpod-conmon-81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965.scope: Deactivated successfully.
Jan 21 23:55:00 compute-1 podman[218322]: 2026-01-21 23:55:00.308986997 +0000 UTC m=+0.089942561 container remove 81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:55:00 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:00.314 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[18091025-071f-4402-9092-1e60416b4f7f]: (4, ('Wed Jan 21 11:55:00 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965)\n81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965\nWed Jan 21 11:55:00 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965)\n81566ded70953dc102b279b286eea81de2ac337b3fe58b5ab18c3fa5c3c67965\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:00 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:00.316 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[72543a2d-be08-43a6-87bc-9222f0e946fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:00 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:00.317 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.319 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:00 compute-1 kernel: tap74e2da48-40: left promiscuous mode
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.334 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:00 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:00.337 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7fbceefd-fc83-43ab-85fd-684d554b7836]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:00 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:00.356 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9386c906-5990-4389-94b5-dac62c1fdecc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:00 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:00.357 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[87289473-d823-478e-b7dc-d37c3843d841]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:00 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:00.376 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d6727698-853f-454d-a17d-6e05e2aaac6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425632, 'reachable_time': 38073, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218341, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:00 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:00.380 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:55:00 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:00.381 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[feefc17c-9f90-472d-9d02-9751ba6c4b3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:00 compute-1 systemd[1]: run-netns-ovnmeta\x2d74e2da48\x2d44c2\x2d4c6d\x2d9597\x2d6c47d6247f9c.mount: Deactivated successfully.
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.637 182717 DEBUG nova.compute.manager [req-f06c3d0f-dd86-46e4-ad42-5a5a1511a664 req-c6396a72-317a-496f-8dbf-dd3850732021 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Received event network-vif-unplugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.638 182717 DEBUG oslo_concurrency.lockutils [req-f06c3d0f-dd86-46e4-ad42-5a5a1511a664 req-c6396a72-317a-496f-8dbf-dd3850732021 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.638 182717 DEBUG oslo_concurrency.lockutils [req-f06c3d0f-dd86-46e4-ad42-5a5a1511a664 req-c6396a72-317a-496f-8dbf-dd3850732021 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.638 182717 DEBUG oslo_concurrency.lockutils [req-f06c3d0f-dd86-46e4-ad42-5a5a1511a664 req-c6396a72-317a-496f-8dbf-dd3850732021 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.638 182717 DEBUG nova.compute.manager [req-f06c3d0f-dd86-46e4-ad42-5a5a1511a664 req-c6396a72-317a-496f-8dbf-dd3850732021 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] No waiting events found dispatching network-vif-unplugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.639 182717 WARNING nova.compute.manager [req-f06c3d0f-dd86-46e4-ad42-5a5a1511a664 req-c6396a72-317a-496f-8dbf-dd3850732021 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Received unexpected event network-vif-unplugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 for instance with vm_state active and task_state powering-off.
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.724 182717 INFO nova.virt.libvirt.driver [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Instance shutdown successfully after 13 seconds.
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.730 182717 INFO nova.virt.libvirt.driver [-] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Instance destroyed successfully.
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.731 182717 DEBUG nova.objects.instance [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'numa_topology' on Instance uuid d94f4b0c-b132-4eb6-9c92-6850506a821a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.747 182717 DEBUG nova.compute.manager [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:00 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:00.876 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.876 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:00 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:00.878 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:55:00 compute-1 nova_compute[182713]: 2026-01-21 23:55:00.888 182717 DEBUG oslo_concurrency.lockutils [None req-5ab6aae7-e086-4184-b582-3331eca58b7e 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:01 compute-1 nova_compute[182713]: 2026-01-21 23:55:01.022 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:02 compute-1 nova_compute[182713]: 2026-01-21 23:55:02.807 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:03.002 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:03.003 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:03.003 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:03 compute-1 nova_compute[182713]: 2026-01-21 23:55:03.512 182717 DEBUG nova.compute.manager [req-789d564a-74ef-44fa-af90-54fa0a08149f req-4f0e31ae-b442-40ea-9a33-39eb1468b259 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Received event network-vif-plugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:03 compute-1 nova_compute[182713]: 2026-01-21 23:55:03.513 182717 DEBUG oslo_concurrency.lockutils [req-789d564a-74ef-44fa-af90-54fa0a08149f req-4f0e31ae-b442-40ea-9a33-39eb1468b259 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:03 compute-1 nova_compute[182713]: 2026-01-21 23:55:03.513 182717 DEBUG oslo_concurrency.lockutils [req-789d564a-74ef-44fa-af90-54fa0a08149f req-4f0e31ae-b442-40ea-9a33-39eb1468b259 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:03 compute-1 nova_compute[182713]: 2026-01-21 23:55:03.513 182717 DEBUG oslo_concurrency.lockutils [req-789d564a-74ef-44fa-af90-54fa0a08149f req-4f0e31ae-b442-40ea-9a33-39eb1468b259 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:03 compute-1 nova_compute[182713]: 2026-01-21 23:55:03.514 182717 DEBUG nova.compute.manager [req-789d564a-74ef-44fa-af90-54fa0a08149f req-4f0e31ae-b442-40ea-9a33-39eb1468b259 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] No waiting events found dispatching network-vif-plugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:03 compute-1 nova_compute[182713]: 2026-01-21 23:55:03.514 182717 WARNING nova.compute.manager [req-789d564a-74ef-44fa-af90-54fa0a08149f req-4f0e31ae-b442-40ea-9a33-39eb1468b259 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Received unexpected event network-vif-plugged-fc6ae600-05c3-478d-9bbd-b0041c1043c1 for instance with vm_state stopped and task_state None.
Jan 21 23:55:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:03.881 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:04 compute-1 nova_compute[182713]: 2026-01-21 23:55:04.515 182717 DEBUG nova.compute.manager [None req-237f3fda-099e-46f3-b1e9-72988eaee80b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:04 compute-1 nova_compute[182713]: 2026-01-21 23:55:04.626 182717 INFO nova.compute.manager [None req-237f3fda-099e-46f3-b1e9-72988eaee80b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] instance snapshotting
Jan 21 23:55:04 compute-1 nova_compute[182713]: 2026-01-21 23:55:04.626 182717 WARNING nova.compute.manager [None req-237f3fda-099e-46f3-b1e9-72988eaee80b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] trying to snapshot a non-running instance: (state: 4 expected: 1)
Jan 21 23:55:04 compute-1 nova_compute[182713]: 2026-01-21 23:55:04.936 182717 INFO nova.virt.libvirt.driver [None req-237f3fda-099e-46f3-b1e9-72988eaee80b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Beginning cold snapshot process
Jan 21 23:55:05 compute-1 nova_compute[182713]: 2026-01-21 23:55:05.239 182717 DEBUG nova.privsep.utils [None req-237f3fda-099e-46f3-b1e9-72988eaee80b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:55:05 compute-1 nova_compute[182713]: 2026-01-21 23:55:05.240 182717 DEBUG oslo_concurrency.processutils [None req-237f3fda-099e-46f3-b1e9-72988eaee80b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk /var/lib/nova/instances/snapshots/tmprn8iql_m/ccd685b613cb46faba0f5074d735ca62 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:05 compute-1 nova_compute[182713]: 2026-01-21 23:55:05.519 182717 DEBUG oslo_concurrency.processutils [None req-237f3fda-099e-46f3-b1e9-72988eaee80b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a/disk /var/lib/nova/instances/snapshots/tmprn8iql_m/ccd685b613cb46faba0f5074d735ca62" returned: 0 in 0.279s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:05 compute-1 nova_compute[182713]: 2026-01-21 23:55:05.521 182717 INFO nova.virt.libvirt.driver [None req-237f3fda-099e-46f3-b1e9-72988eaee80b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Snapshot extracted, beginning image upload
Jan 21 23:55:06 compute-1 nova_compute[182713]: 2026-01-21 23:55:06.023 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:07 compute-1 podman[218352]: 2026-01-21 23:55:07.608395455 +0000 UTC m=+0.078870810 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 23:55:07 compute-1 podman[218353]: 2026-01-21 23:55:07.640412791 +0000 UTC m=+0.107700697 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:55:07 compute-1 nova_compute[182713]: 2026-01-21 23:55:07.810 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:08 compute-1 nova_compute[182713]: 2026-01-21 23:55:08.393 182717 INFO nova.virt.libvirt.driver [None req-237f3fda-099e-46f3-b1e9-72988eaee80b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Snapshot image upload complete
Jan 21 23:55:08 compute-1 nova_compute[182713]: 2026-01-21 23:55:08.394 182717 INFO nova.compute.manager [None req-237f3fda-099e-46f3-b1e9-72988eaee80b 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Took 3.75 seconds to snapshot the instance on the hypervisor.
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.227 182717 DEBUG oslo_concurrency.lockutils [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "d94f4b0c-b132-4eb6-9c92-6850506a821a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.228 182717 DEBUG oslo_concurrency.lockutils [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.229 182717 DEBUG oslo_concurrency.lockutils [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.229 182717 DEBUG oslo_concurrency.lockutils [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.230 182717 DEBUG oslo_concurrency.lockutils [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.246 182717 INFO nova.compute.manager [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Terminating instance
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.260 182717 DEBUG nova.compute.manager [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.270 182717 INFO nova.virt.libvirt.driver [-] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Instance destroyed successfully.
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.271 182717 DEBUG nova.objects.instance [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'resources' on Instance uuid d94f4b0c-b132-4eb6-9c92-6850506a821a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.292 182717 DEBUG nova.virt.libvirt.vif [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:54:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-421695372',display_name='tempest-ImagesTestJSON-server-421695372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-421695372',id=53,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:54:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-afkdxekb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:55:08Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=d94f4b0c-b132-4eb6-9c92-6850506a821a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "address": "fa:16:3e:6e:94:fb", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6ae600-05", "ovs_interfaceid": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.293 182717 DEBUG nova.network.os_vif_util [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "address": "fa:16:3e:6e:94:fb", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6ae600-05", "ovs_interfaceid": "fc6ae600-05c3-478d-9bbd-b0041c1043c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.294 182717 DEBUG nova.network.os_vif_util [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:94:fb,bridge_name='br-int',has_traffic_filtering=True,id=fc6ae600-05c3-478d-9bbd-b0041c1043c1,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6ae600-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.294 182717 DEBUG os_vif [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:94:fb,bridge_name='br-int',has_traffic_filtering=True,id=fc6ae600-05c3-478d-9bbd-b0041c1043c1,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6ae600-05') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.297 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.298 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc6ae600-05, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.300 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.301 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.307 182717 INFO os_vif [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:94:fb,bridge_name='br-int',has_traffic_filtering=True,id=fc6ae600-05c3-478d-9bbd-b0041c1043c1,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6ae600-05')
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.308 182717 INFO nova.virt.libvirt.driver [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Deleting instance files /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a_del
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.318 182717 INFO nova.virt.libvirt.driver [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Deletion of /var/lib/nova/instances/d94f4b0c-b132-4eb6-9c92-6850506a821a_del complete
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.395 182717 INFO nova.compute.manager [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Took 0.13 seconds to destroy the instance on the hypervisor.
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.396 182717 DEBUG oslo.service.loopingcall [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.396 182717 DEBUG nova.compute.manager [-] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:55:10 compute-1 nova_compute[182713]: 2026-01-21 23:55:10.396 182717 DEBUG nova.network.neutron [-] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:55:11 compute-1 nova_compute[182713]: 2026-01-21 23:55:11.025 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:12 compute-1 nova_compute[182713]: 2026-01-21 23:55:12.718 182717 DEBUG nova.network.neutron [-] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:12 compute-1 nova_compute[182713]: 2026-01-21 23:55:12.745 182717 INFO nova.compute.manager [-] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Took 2.35 seconds to deallocate network for instance.
Jan 21 23:55:12 compute-1 nova_compute[182713]: 2026-01-21 23:55:12.841 182717 DEBUG oslo_concurrency.lockutils [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:12 compute-1 nova_compute[182713]: 2026-01-21 23:55:12.842 182717 DEBUG oslo_concurrency.lockutils [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:12 compute-1 nova_compute[182713]: 2026-01-21 23:55:12.897 182717 DEBUG nova.compute.manager [req-a77e5daa-72d9-44fe-8d24-62110b19db99 req-655028f9-458d-45a9-9272-9f2ab9cbe217 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Received event network-vif-deleted-fc6ae600-05c3-478d-9bbd-b0041c1043c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:12 compute-1 nova_compute[182713]: 2026-01-21 23:55:12.944 182717 DEBUG nova.compute.provider_tree [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:55:12 compute-1 nova_compute[182713]: 2026-01-21 23:55:12.968 182717 DEBUG nova.scheduler.client.report [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:55:12 compute-1 nova_compute[182713]: 2026-01-21 23:55:12.997 182717 DEBUG oslo_concurrency.lockutils [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:13 compute-1 nova_compute[182713]: 2026-01-21 23:55:13.029 182717 INFO nova.scheduler.client.report [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Deleted allocations for instance d94f4b0c-b132-4eb6-9c92-6850506a821a
Jan 21 23:55:13 compute-1 nova_compute[182713]: 2026-01-21 23:55:13.109 182717 DEBUG oslo_concurrency.lockutils [None req-b5d763a0-acb8-4189-afbb-7d21e1ae45a2 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "d94f4b0c-b132-4eb6-9c92-6850506a821a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.137 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "7821b939-505a-45e3-a74b-dce7c6fdc856" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.138 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.162 182717 DEBUG nova.compute.manager [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.285 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.286 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.295 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.296 182717 INFO nova.compute.claims [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.451 182717 DEBUG nova.compute.provider_tree [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.469 182717 DEBUG nova.scheduler.client.report [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.525 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.527 182717 DEBUG nova.compute.manager [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.611 182717 DEBUG nova.compute.manager [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.612 182717 DEBUG nova.network.neutron [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.636 182717 INFO nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.667 182717 DEBUG nova.compute.manager [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.821 182717 DEBUG nova.compute.manager [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.823 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.824 182717 INFO nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Creating image(s)
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.825 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "/var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.826 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.827 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.853 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.940 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.942 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.944 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:14 compute-1 nova_compute[182713]: 2026-01-21 23:55:14.961 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.053 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.055 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.107 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.108 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.108 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.193 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.194 182717 DEBUG nova.virt.disk.api [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Checking if we can resize image /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.195 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.222 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039700.209984, d94f4b0c-b132-4eb6-9c92-6850506a821a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.223 182717 INFO nova.compute.manager [-] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] VM Stopped (Lifecycle Event)
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.259 182717 DEBUG nova.compute.manager [None req-97631ab8-2903-4c53-abe1-c1645f82b89d - - - - - -] [instance: d94f4b0c-b132-4eb6-9c92-6850506a821a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.280 182717 DEBUG nova.policy [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.285 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.286 182717 DEBUG nova.virt.disk.api [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Cannot resize image /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.287 182717 DEBUG nova.objects.instance [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'migration_context' on Instance uuid 7821b939-505a-45e3-a74b-dce7c6fdc856 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.301 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.321 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.321 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Ensure instance console log exists: /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.322 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.323 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:15 compute-1 nova_compute[182713]: 2026-01-21 23:55:15.323 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:16 compute-1 nova_compute[182713]: 2026-01-21 23:55:16.026 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:16 compute-1 nova_compute[182713]: 2026-01-21 23:55:16.393 182717 DEBUG nova.network.neutron [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Successfully created port: 72ef4c9f-6bd3-45d3-9383-212dd9f03330 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:55:18 compute-1 nova_compute[182713]: 2026-01-21 23:55:18.529 182717 DEBUG nova.network.neutron [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Successfully updated port: 72ef4c9f-6bd3-45d3-9383-212dd9f03330 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:55:18 compute-1 nova_compute[182713]: 2026-01-21 23:55:18.551 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "refresh_cache-7821b939-505a-45e3-a74b-dce7c6fdc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:18 compute-1 nova_compute[182713]: 2026-01-21 23:55:18.551 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquired lock "refresh_cache-7821b939-505a-45e3-a74b-dce7c6fdc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:18 compute-1 nova_compute[182713]: 2026-01-21 23:55:18.552 182717 DEBUG nova.network.neutron [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:55:18 compute-1 nova_compute[182713]: 2026-01-21 23:55:18.649 182717 DEBUG nova.compute.manager [req-a42e1bc8-ab54-4211-99c3-563ef649eaf8 req-2bbe4552-82f2-4785-bb04-f1943b2b2fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Received event network-changed-72ef4c9f-6bd3-45d3-9383-212dd9f03330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:18 compute-1 nova_compute[182713]: 2026-01-21 23:55:18.650 182717 DEBUG nova.compute.manager [req-a42e1bc8-ab54-4211-99c3-563ef649eaf8 req-2bbe4552-82f2-4785-bb04-f1943b2b2fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Refreshing instance network info cache due to event network-changed-72ef4c9f-6bd3-45d3-9383-212dd9f03330. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:55:18 compute-1 nova_compute[182713]: 2026-01-21 23:55:18.650 182717 DEBUG oslo_concurrency.lockutils [req-a42e1bc8-ab54-4211-99c3-563ef649eaf8 req-2bbe4552-82f2-4785-bb04-f1943b2b2fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7821b939-505a-45e3-a74b-dce7c6fdc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:18 compute-1 nova_compute[182713]: 2026-01-21 23:55:18.783 182717 DEBUG nova.network.neutron [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:55:19 compute-1 nova_compute[182713]: 2026-01-21 23:55:19.841 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.129 182717 DEBUG nova.network.neutron [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Updating instance_info_cache with network_info: [{"id": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "address": "fa:16:3e:9b:5f:9c", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72ef4c9f-6b", "ovs_interfaceid": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.163 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Releasing lock "refresh_cache-7821b939-505a-45e3-a74b-dce7c6fdc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.164 182717 DEBUG nova.compute.manager [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Instance network_info: |[{"id": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "address": "fa:16:3e:9b:5f:9c", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72ef4c9f-6b", "ovs_interfaceid": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.164 182717 DEBUG oslo_concurrency.lockutils [req-a42e1bc8-ab54-4211-99c3-563ef649eaf8 req-2bbe4552-82f2-4785-bb04-f1943b2b2fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7821b939-505a-45e3-a74b-dce7c6fdc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.165 182717 DEBUG nova.network.neutron [req-a42e1bc8-ab54-4211-99c3-563ef649eaf8 req-2bbe4552-82f2-4785-bb04-f1943b2b2fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Refreshing network info cache for port 72ef4c9f-6bd3-45d3-9383-212dd9f03330 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.170 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Start _get_guest_xml network_info=[{"id": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "address": "fa:16:3e:9b:5f:9c", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72ef4c9f-6b", "ovs_interfaceid": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.177 182717 WARNING nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.186 182717 DEBUG nova.virt.libvirt.host [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.187 182717 DEBUG nova.virt.libvirt.host [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.195 182717 DEBUG nova.virt.libvirt.host [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.196 182717 DEBUG nova.virt.libvirt.host [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.199 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.199 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.200 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.200 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.201 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.201 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.202 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.202 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.203 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.203 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.204 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.204 182717 DEBUG nova.virt.hardware [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.212 182717 DEBUG nova.virt.libvirt.vif [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-201905165',display_name='tempest-ImagesTestJSON-server-201905165',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-201905165',id=56,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-yfkb4omo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=TagList
,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:14Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=7821b939-505a-45e3-a74b-dce7c6fdc856,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "address": "fa:16:3e:9b:5f:9c", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72ef4c9f-6b", "ovs_interfaceid": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.212 182717 DEBUG nova.network.os_vif_util [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "address": "fa:16:3e:9b:5f:9c", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72ef4c9f-6b", "ovs_interfaceid": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.214 182717 DEBUG nova.network.os_vif_util [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:5f:9c,bridge_name='br-int',has_traffic_filtering=True,id=72ef4c9f-6bd3-45d3-9383-212dd9f03330,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72ef4c9f-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.216 182717 DEBUG nova.objects.instance [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'pci_devices' on Instance uuid 7821b939-505a-45e3-a74b-dce7c6fdc856 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.235 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:55:20 compute-1 nova_compute[182713]:   <uuid>7821b939-505a-45e3-a74b-dce7c6fdc856</uuid>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   <name>instance-00000038</name>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <nova:name>tempest-ImagesTestJSON-server-201905165</nova:name>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:55:20</nova:creationTime>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:55:20 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:55:20 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:55:20 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:55:20 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:55:20 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:55:20 compute-1 nova_compute[182713]:         <nova:user uuid="6eb1bcf645844eaca088761a04e59542">tempest-ImagesTestJSON-126431515-project-member</nova:user>
Jan 21 23:55:20 compute-1 nova_compute[182713]:         <nova:project uuid="63e5713bcd4c429796b251487b6136bc">tempest-ImagesTestJSON-126431515</nova:project>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:55:20 compute-1 nova_compute[182713]:         <nova:port uuid="72ef4c9f-6bd3-45d3-9383-212dd9f03330">
Jan 21 23:55:20 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <system>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <entry name="serial">7821b939-505a-45e3-a74b-dce7c6fdc856</entry>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <entry name="uuid">7821b939-505a-45e3-a74b-dce7c6fdc856</entry>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     </system>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   <os>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   </os>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   <features>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   </features>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk.config"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:9b:5f:9c"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <target dev="tap72ef4c9f-6b"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/console.log" append="off"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <video>
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     </video>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:55:20 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:55:20 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:55:20 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:55:20 compute-1 nova_compute[182713]: </domain>
Jan 21 23:55:20 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.237 182717 DEBUG nova.compute.manager [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Preparing to wait for external event network-vif-plugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.237 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.238 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.238 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.239 182717 DEBUG nova.virt.libvirt.vif [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-201905165',display_name='tempest-ImagesTestJSON-server-201905165',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-201905165',id=56,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-yfkb4omo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:14Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=7821b939-505a-45e3-a74b-dce7c6fdc856,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "address": "fa:16:3e:9b:5f:9c", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72ef4c9f-6b", "ovs_interfaceid": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.240 182717 DEBUG nova.network.os_vif_util [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "address": "fa:16:3e:9b:5f:9c", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72ef4c9f-6b", "ovs_interfaceid": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.241 182717 DEBUG nova.network.os_vif_util [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:5f:9c,bridge_name='br-int',has_traffic_filtering=True,id=72ef4c9f-6bd3-45d3-9383-212dd9f03330,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72ef4c9f-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.241 182717 DEBUG os_vif [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:5f:9c,bridge_name='br-int',has_traffic_filtering=True,id=72ef4c9f-6bd3-45d3-9383-212dd9f03330,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72ef4c9f-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.242 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.243 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.243 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.248 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.248 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72ef4c9f-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.249 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap72ef4c9f-6b, col_values=(('external_ids', {'iface-id': '72ef4c9f-6bd3-45d3-9383-212dd9f03330', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:5f:9c', 'vm-uuid': '7821b939-505a-45e3-a74b-dce7c6fdc856'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.251 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:20 compute-1 NetworkManager[54952]: <info>  [1769039720.2534] manager: (tap72ef4c9f-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.253 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.259 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.260 182717 INFO os_vif [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:5f:9c,bridge_name='br-int',has_traffic_filtering=True,id=72ef4c9f-6bd3-45d3-9383-212dd9f03330,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72ef4c9f-6b')
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.347 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.347 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.347 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No VIF found with MAC fa:16:3e:9b:5f:9c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.348 182717 INFO nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Using config drive
Jan 21 23:55:20 compute-1 podman[218409]: 2026-01-21 23:55:20.400643327 +0000 UTC m=+0.089073324 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.733 182717 INFO nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Creating config drive at /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk.config
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.743 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv9v7yeaz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.886 182717 DEBUG oslo_concurrency.processutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv9v7yeaz" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:20 compute-1 kernel: tap72ef4c9f-6b: entered promiscuous mode
Jan 21 23:55:20 compute-1 NetworkManager[54952]: <info>  [1769039720.9719] manager: (tap72ef4c9f-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.975 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:20 compute-1 ovn_controller[94841]: 2026-01-21T23:55:20Z|00144|binding|INFO|Claiming lport 72ef4c9f-6bd3-45d3-9383-212dd9f03330 for this chassis.
Jan 21 23:55:20 compute-1 ovn_controller[94841]: 2026-01-21T23:55:20Z|00145|binding|INFO|72ef4c9f-6bd3-45d3-9383-212dd9f03330: Claiming fa:16:3e:9b:5f:9c 10.100.0.8
Jan 21 23:55:20 compute-1 nova_compute[182713]: 2026-01-21 23:55:20.982 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:20.989 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:5f:9c 10.100.0.8'], port_security=['fa:16:3e:9b:5f:9c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7821b939-505a-45e3-a74b-dce7c6fdc856', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=72ef4c9f-6bd3-45d3-9383-212dd9f03330) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:55:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:20.991 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 72ef4c9f-6bd3-45d3-9383-212dd9f03330 in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c bound to our chassis
Jan 21 23:55:20 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:20.994 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:55:21 compute-1 systemd-udevd[218444]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.012 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2fba993e-1b0f-48a8-a370-45cd841c6dc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.014 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap74e2da48-41 in ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.016 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap74e2da48-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.016 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad21d8c-cd97-4b2a-96c8-038ffb6bb13a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.017 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bda1c75e-a47b-4c52-9c18-ecb027716837]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 NetworkManager[54952]: <info>  [1769039721.0256] device (tap72ef4c9f-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:55:21 compute-1 NetworkManager[54952]: <info>  [1769039721.0272] device (tap72ef4c9f-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.034 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[1d72790a-9141-4bc6-8b46-f031da507e53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 systemd-machined[153970]: New machine qemu-25-instance-00000038.
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.066 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.068 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.067 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2230bcc2-136b-4a3f-923b-a029aeb2f69a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 systemd[1]: Started Virtual Machine qemu-25-instance-00000038.
Jan 21 23:55:21 compute-1 ovn_controller[94841]: 2026-01-21T23:55:21Z|00146|binding|INFO|Setting lport 72ef4c9f-6bd3-45d3-9383-212dd9f03330 ovn-installed in OVS
Jan 21 23:55:21 compute-1 ovn_controller[94841]: 2026-01-21T23:55:21Z|00147|binding|INFO|Setting lport 72ef4c9f-6bd3-45d3-9383-212dd9f03330 up in Southbound
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.073 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.101 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[5a2b8988-6a33-4bcb-ac32-3f25c623cce1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.108 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d51697c4-eff5-45c4-bbf6-dcaf206a5439]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 systemd-udevd[218449]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:55:21 compute-1 NetworkManager[54952]: <info>  [1769039721.1100] manager: (tap74e2da48-40): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.150 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b907af2a-f76e-4447-9767-100c876f4757]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.154 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c04134e3-6736-4b20-a21a-b28bd6c4f897]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 NetworkManager[54952]: <info>  [1769039721.1930] device (tap74e2da48-40): carrier: link connected
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.203 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a97d2f69-f012-4594-a520-ed7f0a15cc74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.238 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[64c13e60-22b3-461b-9d0a-101274ae60d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429384, 'reachable_time': 16350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218479, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.269 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0791094d-29dd-40d5-ba51-3037e295db1a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:7549'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 429384, 'tstamp': 429384}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218480, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.299 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[49d1cf3e-59fb-46e2-b1ff-0972ffdf7f2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429384, 'reachable_time': 16350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218481, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.342 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d303cf01-6154-4eec-8b75-fa6322761fbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.423 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3bbb8b9b-3a3a-4df7-84c6-d4d6a8f32ec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.425 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.426 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.427 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74e2da48-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.429 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:21 compute-1 NetworkManager[54952]: <info>  [1769039721.4307] manager: (tap74e2da48-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Jan 21 23:55:21 compute-1 kernel: tap74e2da48-40: entered promiscuous mode
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.434 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74e2da48-40, col_values=(('external_ids', {'iface-id': '5f8f321e-2942-4700-a50e-4b0628052c1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.435 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:21 compute-1 ovn_controller[94841]: 2026-01-21T23:55:21Z|00148|binding|INFO|Releasing lport 5f8f321e-2942-4700-a50e-4b0628052c1b from this chassis (sb_readonly=0)
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.457 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.458 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.459 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.461 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd14fd1-4478-4fe6-a56d-e298a5caa638]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.462 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:55:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:21.463 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'env', 'PROCESS_TAG=haproxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/74e2da48-44c2-4c6d-9597-6c47d6247f9c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.523 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039721.5227795, 7821b939-505a-45e3-a74b-dce7c6fdc856 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.523 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] VM Started (Lifecycle Event)
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.594 182717 DEBUG nova.compute.manager [req-7bd9f85e-79e9-4a5e-975e-82995ef7afe0 req-4bdbd1db-8adf-4d13-bf50-55ca61f7d474 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Received event network-vif-plugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.595 182717 DEBUG oslo_concurrency.lockutils [req-7bd9f85e-79e9-4a5e-975e-82995ef7afe0 req-4bdbd1db-8adf-4d13-bf50-55ca61f7d474 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.595 182717 DEBUG oslo_concurrency.lockutils [req-7bd9f85e-79e9-4a5e-975e-82995ef7afe0 req-4bdbd1db-8adf-4d13-bf50-55ca61f7d474 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.596 182717 DEBUG oslo_concurrency.lockutils [req-7bd9f85e-79e9-4a5e-975e-82995ef7afe0 req-4bdbd1db-8adf-4d13-bf50-55ca61f7d474 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.596 182717 DEBUG nova.compute.manager [req-7bd9f85e-79e9-4a5e-975e-82995ef7afe0 req-4bdbd1db-8adf-4d13-bf50-55ca61f7d474 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Processing event network-vif-plugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.597 182717 DEBUG nova.compute.manager [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.601 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.606 182717 INFO nova.virt.libvirt.driver [-] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Instance spawned successfully.
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.606 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.668 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.668 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.669 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.669 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.670 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.670 182717 DEBUG nova.virt.libvirt.driver [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.870 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.875 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:55:21 compute-1 podman[218519]: 2026-01-21 23:55:21.904203709 +0000 UTC m=+0.077582410 container create e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.926 182717 INFO nova.compute.manager [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Took 7.10 seconds to spawn the instance on the hypervisor.
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.928 182717 DEBUG nova.compute.manager [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.930 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.930 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039721.523022, 7821b939-505a-45e3-a74b-dce7c6fdc856 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.931 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] VM Paused (Lifecycle Event)
Jan 21 23:55:21 compute-1 systemd[1]: Started libpod-conmon-e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de.scope.
Jan 21 23:55:21 compute-1 podman[218519]: 2026-01-21 23:55:21.868251682 +0000 UTC m=+0.041630453 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.977 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.982 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039721.6010845, 7821b939-505a-45e3-a74b-dce7c6fdc856 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:21 compute-1 nova_compute[182713]: 2026-01-21 23:55:21.982 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] VM Resumed (Lifecycle Event)
Jan 21 23:55:21 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:55:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7550ba65d333f047871950200dd748b9dfbfa7cf965b4efac6009ca12ac446b4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:55:22 compute-1 podman[218519]: 2026-01-21 23:55:22.02013199 +0000 UTC m=+0.193510761 container init e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:55:22 compute-1 nova_compute[182713]: 2026-01-21 23:55:22.025 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:22 compute-1 nova_compute[182713]: 2026-01-21 23:55:22.029 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:55:22 compute-1 podman[218519]: 2026-01-21 23:55:22.031029695 +0000 UTC m=+0.204408426 container start e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 23:55:22 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218534]: [NOTICE]   (218538) : New worker (218540) forked
Jan 21 23:55:22 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218534]: [NOTICE]   (218538) : Loading success.
Jan 21 23:55:22 compute-1 nova_compute[182713]: 2026-01-21 23:55:22.061 182717 INFO nova.compute.manager [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Took 7.81 seconds to build instance.
Jan 21 23:55:22 compute-1 nova_compute[182713]: 2026-01-21 23:55:22.080 182717 DEBUG oslo_concurrency.lockutils [None req-7e6ec281-be96-44e2-aa9a-420fd781e254 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.942s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:22 compute-1 nova_compute[182713]: 2026-01-21 23:55:22.569 182717 DEBUG nova.network.neutron [req-a42e1bc8-ab54-4211-99c3-563ef649eaf8 req-2bbe4552-82f2-4785-bb04-f1943b2b2fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Updated VIF entry in instance network info cache for port 72ef4c9f-6bd3-45d3-9383-212dd9f03330. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:55:22 compute-1 nova_compute[182713]: 2026-01-21 23:55:22.570 182717 DEBUG nova.network.neutron [req-a42e1bc8-ab54-4211-99c3-563ef649eaf8 req-2bbe4552-82f2-4785-bb04-f1943b2b2fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Updating instance_info_cache with network_info: [{"id": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "address": "fa:16:3e:9b:5f:9c", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72ef4c9f-6b", "ovs_interfaceid": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:22 compute-1 nova_compute[182713]: 2026-01-21 23:55:22.593 182717 DEBUG oslo_concurrency.lockutils [req-a42e1bc8-ab54-4211-99c3-563ef649eaf8 req-2bbe4552-82f2-4785-bb04-f1943b2b2fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7821b939-505a-45e3-a74b-dce7c6fdc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:23 compute-1 podman[218549]: 2026-01-21 23:55:23.573754974 +0000 UTC m=+0.060937677 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.060 182717 DEBUG nova.compute.manager [req-2c54c8c6-281e-402a-b3b2-e5450bbcd00f req-e1a06cc7-72f3-4bea-89cc-72b4127b502c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Received event network-vif-plugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.061 182717 DEBUG oslo_concurrency.lockutils [req-2c54c8c6-281e-402a-b3b2-e5450bbcd00f req-e1a06cc7-72f3-4bea-89cc-72b4127b502c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.062 182717 DEBUG oslo_concurrency.lockutils [req-2c54c8c6-281e-402a-b3b2-e5450bbcd00f req-e1a06cc7-72f3-4bea-89cc-72b4127b502c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.062 182717 DEBUG oslo_concurrency.lockutils [req-2c54c8c6-281e-402a-b3b2-e5450bbcd00f req-e1a06cc7-72f3-4bea-89cc-72b4127b502c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.063 182717 DEBUG nova.compute.manager [req-2c54c8c6-281e-402a-b3b2-e5450bbcd00f req-e1a06cc7-72f3-4bea-89cc-72b4127b502c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] No waiting events found dispatching network-vif-plugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.063 182717 WARNING nova.compute.manager [req-2c54c8c6-281e-402a-b3b2-e5450bbcd00f req-e1a06cc7-72f3-4bea-89cc-72b4127b502c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Received unexpected event network-vif-plugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 for instance with vm_state active and task_state None.
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.164 182717 DEBUG nova.objects.instance [None req-72f57f63-4398-4d5d-bbb6-bbfd9e0f9252 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'pci_devices' on Instance uuid 7821b939-505a-45e3-a74b-dce7c6fdc856 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.208 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039724.208333, 7821b939-505a-45e3-a74b-dce7c6fdc856 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.209 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] VM Paused (Lifecycle Event)
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.241 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.246 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.282 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 21 23:55:24 compute-1 kernel: tap72ef4c9f-6b (unregistering): left promiscuous mode
Jan 21 23:55:24 compute-1 NetworkManager[54952]: <info>  [1769039724.8064] device (tap72ef4c9f-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:55:24 compute-1 ovn_controller[94841]: 2026-01-21T23:55:24Z|00149|binding|INFO|Releasing lport 72ef4c9f-6bd3-45d3-9383-212dd9f03330 from this chassis (sb_readonly=0)
Jan 21 23:55:24 compute-1 ovn_controller[94841]: 2026-01-21T23:55:24Z|00150|binding|INFO|Setting lport 72ef4c9f-6bd3-45d3-9383-212dd9f03330 down in Southbound
Jan 21 23:55:24 compute-1 ovn_controller[94841]: 2026-01-21T23:55:24Z|00151|binding|INFO|Removing iface tap72ef4c9f-6b ovn-installed in OVS
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.817 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:24 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:24.825 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:5f:9c 10.100.0.8'], port_security=['fa:16:3e:9b:5f:9c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7821b939-505a-45e3-a74b-dce7c6fdc856', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=72ef4c9f-6bd3-45d3-9383-212dd9f03330) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:55:24 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:24.828 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 72ef4c9f-6bd3-45d3-9383-212dd9f03330 in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c unbound from our chassis
Jan 21 23:55:24 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:24.831 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:55:24 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:24.832 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0eb28dc8-89e5-4f58-8f6f-b64c623648f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:24 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:24.833 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace which is not needed anymore
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.847 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:24 compute-1 nova_compute[182713]: 2026-01-21 23:55:24.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:55:24 compute-1 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000038.scope: Deactivated successfully.
Jan 21 23:55:24 compute-1 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000038.scope: Consumed 3.177s CPU time.
Jan 21 23:55:24 compute-1 systemd-machined[153970]: Machine qemu-25-instance-00000038 terminated.
Jan 21 23:55:25 compute-1 kernel: tap72ef4c9f-6b: entered promiscuous mode
Jan 21 23:55:25 compute-1 NetworkManager[54952]: <info>  [1769039725.0133] manager: (tap72ef4c9f-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Jan 21 23:55:25 compute-1 kernel: tap72ef4c9f-6b (unregistering): left promiscuous mode
Jan 21 23:55:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218534]: [NOTICE]   (218538) : haproxy version is 2.8.14-c23fe91
Jan 21 23:55:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218534]: [NOTICE]   (218538) : path to executable is /usr/sbin/haproxy
Jan 21 23:55:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218534]: [WARNING]  (218538) : Exiting Master process...
Jan 21 23:55:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218534]: [WARNING]  (218538) : Exiting Master process...
Jan 21 23:55:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218534]: [ALERT]    (218538) : Current worker (218540) exited with code 143 (Terminated)
Jan 21 23:55:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[218534]: [WARNING]  (218538) : All workers exited. Exiting... (0)
Jan 21 23:55:25 compute-1 nova_compute[182713]: 2026-01-21 23:55:25.021 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:25 compute-1 systemd[1]: libpod-e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de.scope: Deactivated successfully.
Jan 21 23:55:25 compute-1 podman[218598]: 2026-01-21 23:55:25.030128404 +0000 UTC m=+0.070388949 container died e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 23:55:25 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de-userdata-shm.mount: Deactivated successfully.
Jan 21 23:55:25 compute-1 nova_compute[182713]: 2026-01-21 23:55:25.083 182717 DEBUG nova.compute.manager [None req-72f57f63-4398-4d5d-bbb6-bbfd9e0f9252 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:25 compute-1 systemd[1]: var-lib-containers-storage-overlay-7550ba65d333f047871950200dd748b9dfbfa7cf965b4efac6009ca12ac446b4-merged.mount: Deactivated successfully.
Jan 21 23:55:25 compute-1 podman[218598]: 2026-01-21 23:55:25.095784785 +0000 UTC m=+0.136045330 container cleanup e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:55:25 compute-1 systemd[1]: libpod-conmon-e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de.scope: Deactivated successfully.
Jan 21 23:55:25 compute-1 podman[218645]: 2026-01-21 23:55:25.182285489 +0000 UTC m=+0.061631008 container remove e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:55:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:25.190 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8be2cd9f-0fbc-41f6-b108-63cd2b5a6371]: (4, ('Wed Jan 21 11:55:24 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de)\ne298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de\nWed Jan 21 11:55:25 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (e298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de)\ne298530d555b12d99ee64d55c18e93e63d5de333ae97424ca253421d7b7480de\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:25.192 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5b086325-eb43-4c54-9610-ada14d31332e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:25.193 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:25 compute-1 nova_compute[182713]: 2026-01-21 23:55:25.196 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:25 compute-1 kernel: tap74e2da48-40: left promiscuous mode
Jan 21 23:55:25 compute-1 nova_compute[182713]: 2026-01-21 23:55:25.222 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:25.226 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[18461f0c-4b27-45ea-8fe5-de39921ab9d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:25.240 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a45e57-770b-4d58-bcb7-4461f2043c51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:25.241 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[deefaa82-1b90-43ca-b328-5b5789fc323d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:25 compute-1 nova_compute[182713]: 2026-01-21 23:55:25.250 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:25.259 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[18c63b07-8385-4520-be87-2299ff883b8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429375, 'reachable_time': 43982, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218662, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:25.261 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:55:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:25.262 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[584cf2b5-7715-4b46-a41b-ddd75739d0f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:25 compute-1 systemd[1]: run-netns-ovnmeta\x2d74e2da48\x2d44c2\x2d4c6d\x2d9597\x2d6c47d6247f9c.mount: Deactivated successfully.
Jan 21 23:55:25 compute-1 nova_compute[182713]: 2026-01-21 23:55:25.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.070 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.185 182717 DEBUG nova.compute.manager [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Received event network-vif-unplugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.186 182717 DEBUG oslo_concurrency.lockutils [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.186 182717 DEBUG oslo_concurrency.lockutils [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.186 182717 DEBUG oslo_concurrency.lockutils [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.186 182717 DEBUG nova.compute.manager [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] No waiting events found dispatching network-vif-unplugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.186 182717 WARNING nova.compute.manager [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Received unexpected event network-vif-unplugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 for instance with vm_state suspended and task_state None.
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.186 182717 DEBUG nova.compute.manager [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Received event network-vif-plugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.187 182717 DEBUG oslo_concurrency.lockutils [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.187 182717 DEBUG oslo_concurrency.lockutils [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.187 182717 DEBUG oslo_concurrency.lockutils [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.187 182717 DEBUG nova.compute.manager [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] No waiting events found dispatching network-vif-plugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:26 compute-1 nova_compute[182713]: 2026-01-21 23:55:26.187 182717 WARNING nova.compute.manager [req-9c4deaf4-fb41-4a7d-9907-5152315e3975 req-b3dfa277-d412-4dc0-8326-910e684264b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Received unexpected event network-vif-plugged-72ef4c9f-6bd3-45d3-9383-212dd9f03330 for instance with vm_state suspended and task_state None.
Jan 21 23:55:27 compute-1 nova_compute[182713]: 2026-01-21 23:55:27.214 182717 DEBUG nova.compute.manager [None req-306479ae-fc4e-403a-af2b-52a9963c8117 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:27 compute-1 nova_compute[182713]: 2026-01-21 23:55:27.285 182717 INFO nova.compute.manager [None req-306479ae-fc4e-403a-af2b-52a9963c8117 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] instance snapshotting
Jan 21 23:55:27 compute-1 nova_compute[182713]: 2026-01-21 23:55:27.286 182717 WARNING nova.compute.manager [None req-306479ae-fc4e-403a-af2b-52a9963c8117 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] trying to snapshot a non-running instance: (state: 4 expected: 1)
Jan 21 23:55:27 compute-1 nova_compute[182713]: 2026-01-21 23:55:27.618 182717 INFO nova.virt.libvirt.driver [None req-306479ae-fc4e-403a-af2b-52a9963c8117 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Beginning cold snapshot process
Jan 21 23:55:27 compute-1 nova_compute[182713]: 2026-01-21 23:55:27.855 182717 DEBUG nova.privsep.utils [None req-306479ae-fc4e-403a-af2b-52a9963c8117 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:55:27 compute-1 nova_compute[182713]: 2026-01-21 23:55:27.856 182717 DEBUG oslo_concurrency.processutils [None req-306479ae-fc4e-403a-af2b-52a9963c8117 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk /var/lib/nova/instances/snapshots/tmp7j8dh2gq/17b5a08888744bae90942244cc959d43 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:27 compute-1 nova_compute[182713]: 2026-01-21 23:55:27.892 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:27 compute-1 nova_compute[182713]: 2026-01-21 23:55:27.893 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:28 compute-1 nova_compute[182713]: 2026-01-21 23:55:28.026 182717 DEBUG oslo_concurrency.processutils [None req-306479ae-fc4e-403a-af2b-52a9963c8117 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk /var/lib/nova/instances/snapshots/tmp7j8dh2gq/17b5a08888744bae90942244cc959d43" returned: 0 in 0.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:28 compute-1 nova_compute[182713]: 2026-01-21 23:55:28.027 182717 INFO nova.virt.libvirt.driver [None req-306479ae-fc4e-403a-af2b-52a9963c8117 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Snapshot extracted, beginning image upload
Jan 21 23:55:28 compute-1 nova_compute[182713]: 2026-01-21 23:55:28.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:28 compute-1 nova_compute[182713]: 2026-01-21 23:55:28.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:28 compute-1 nova_compute[182713]: 2026-01-21 23:55:28.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:28 compute-1 nova_compute[182713]: 2026-01-21 23:55:28.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:28 compute-1 nova_compute[182713]: 2026-01-21 23:55:28.883 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:55:28 compute-1 nova_compute[182713]: 2026-01-21 23:55:28.963 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.048 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.049 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.142 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.336 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.337 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5619MB free_disk=73.2146110534668GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.337 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.338 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.412 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 7821b939-505a-45e3-a74b-dce7c6fdc856 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.413 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.413 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.490 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.521 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.551 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:55:29 compute-1 nova_compute[182713]: 2026-01-21 23:55:29.552 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:30 compute-1 nova_compute[182713]: 2026-01-21 23:55:30.161 182717 INFO nova.virt.libvirt.driver [None req-306479ae-fc4e-403a-af2b-52a9963c8117 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Snapshot image upload complete
Jan 21 23:55:30 compute-1 nova_compute[182713]: 2026-01-21 23:55:30.161 182717 INFO nova.compute.manager [None req-306479ae-fc4e-403a-af2b-52a9963c8117 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Took 2.86 seconds to snapshot the instance on the hypervisor.
Jan 21 23:55:30 compute-1 nova_compute[182713]: 2026-01-21 23:55:30.253 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:30 compute-1 nova_compute[182713]: 2026-01-21 23:55:30.548 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:30 compute-1 nova_compute[182713]: 2026-01-21 23:55:30.549 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:30 compute-1 podman[218680]: 2026-01-21 23:55:30.605969674 +0000 UTC m=+0.079874121 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:55:30 compute-1 podman[218679]: 2026-01-21 23:55:30.633546443 +0000 UTC m=+0.119945685 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 23:55:30 compute-1 nova_compute[182713]: 2026-01-21 23:55:30.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:30 compute-1 nova_compute[182713]: 2026-01-21 23:55:30.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:55:30 compute-1 nova_compute[182713]: 2026-01-21 23:55:30.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:55:31 compute-1 nova_compute[182713]: 2026-01-21 23:55:31.072 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:31 compute-1 nova_compute[182713]: 2026-01-21 23:55:31.294 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-7821b939-505a-45e3-a74b-dce7c6fdc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:31 compute-1 nova_compute[182713]: 2026-01-21 23:55:31.294 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-7821b939-505a-45e3-a74b-dce7c6fdc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:31 compute-1 nova_compute[182713]: 2026-01-21 23:55:31.294 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:55:31 compute-1 nova_compute[182713]: 2026-01-21 23:55:31.295 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7821b939-505a-45e3-a74b-dce7c6fdc856 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:31 compute-1 nova_compute[182713]: 2026-01-21 23:55:31.998 182717 DEBUG oslo_concurrency.lockutils [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "7821b939-505a-45e3-a74b-dce7c6fdc856" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:31 compute-1 nova_compute[182713]: 2026-01-21 23:55:31.998 182717 DEBUG oslo_concurrency.lockutils [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:31 compute-1 nova_compute[182713]: 2026-01-21 23:55:31.999 182717 DEBUG oslo_concurrency.lockutils [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:31.999 182717 DEBUG oslo_concurrency.lockutils [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.000 182717 DEBUG oslo_concurrency.lockutils [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.020 182717 INFO nova.compute.manager [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Terminating instance
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.034 182717 DEBUG nova.compute.manager [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.043 182717 INFO nova.virt.libvirt.driver [-] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Instance destroyed successfully.
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.044 182717 DEBUG nova.objects.instance [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'resources' on Instance uuid 7821b939-505a-45e3-a74b-dce7c6fdc856 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.065 182717 DEBUG nova.virt.libvirt.vif [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-201905165',display_name='tempest-ImagesTestJSON-server-201905165',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-201905165',id=56,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-yfkb4omo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:55:30Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=7821b939-505a-45e3-a74b-dce7c6fdc856,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "address": "fa:16:3e:9b:5f:9c", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72ef4c9f-6b", "ovs_interfaceid": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.066 182717 DEBUG nova.network.os_vif_util [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "address": "fa:16:3e:9b:5f:9c", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72ef4c9f-6b", "ovs_interfaceid": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.067 182717 DEBUG nova.network.os_vif_util [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:5f:9c,bridge_name='br-int',has_traffic_filtering=True,id=72ef4c9f-6bd3-45d3-9383-212dd9f03330,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72ef4c9f-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.068 182717 DEBUG os_vif [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:5f:9c,bridge_name='br-int',has_traffic_filtering=True,id=72ef4c9f-6bd3-45d3-9383-212dd9f03330,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72ef4c9f-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.071 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.071 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72ef4c9f-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.124 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.126 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.131 182717 INFO os_vif [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:5f:9c,bridge_name='br-int',has_traffic_filtering=True,id=72ef4c9f-6bd3-45d3-9383-212dd9f03330,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72ef4c9f-6b')
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.131 182717 INFO nova.virt.libvirt.driver [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Deleting instance files /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856_del
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.132 182717 INFO nova.virt.libvirt.driver [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Deletion of /var/lib/nova/instances/7821b939-505a-45e3-a74b-dce7c6fdc856_del complete
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.252 182717 INFO nova.compute.manager [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Took 0.22 seconds to destroy the instance on the hypervisor.
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.253 182717 DEBUG oslo.service.loopingcall [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.254 182717 DEBUG nova.compute.manager [-] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.254 182717 DEBUG nova.network.neutron [-] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.959 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Updating instance_info_cache with network_info: [{"id": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "address": "fa:16:3e:9b:5f:9c", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72ef4c9f-6b", "ovs_interfaceid": "72ef4c9f-6bd3-45d3-9383-212dd9f03330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.963 182717 DEBUG nova.network.neutron [-] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.991 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-7821b939-505a-45e3-a74b-dce7c6fdc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.991 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:55:32 compute-1 nova_compute[182713]: 2026-01-21 23:55:32.996 182717 INFO nova.compute.manager [-] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Took 0.74 seconds to deallocate network for instance.
Jan 21 23:55:33 compute-1 nova_compute[182713]: 2026-01-21 23:55:33.083 182717 DEBUG oslo_concurrency.lockutils [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:33 compute-1 nova_compute[182713]: 2026-01-21 23:55:33.084 182717 DEBUG oslo_concurrency.lockutils [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:33 compute-1 nova_compute[182713]: 2026-01-21 23:55:33.088 182717 DEBUG nova.compute.manager [req-574bb0dc-9363-458c-819d-c8a6d6820824 req-d375a40a-ac22-4321-8ce0-645501e96d4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Received event network-vif-deleted-72ef4c9f-6bd3-45d3-9383-212dd9f03330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:33 compute-1 nova_compute[182713]: 2026-01-21 23:55:33.150 182717 DEBUG nova.compute.provider_tree [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:55:33 compute-1 nova_compute[182713]: 2026-01-21 23:55:33.167 182717 DEBUG nova.scheduler.client.report [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:55:33 compute-1 nova_compute[182713]: 2026-01-21 23:55:33.198 182717 DEBUG oslo_concurrency.lockutils [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:33 compute-1 nova_compute[182713]: 2026-01-21 23:55:33.232 182717 INFO nova.scheduler.client.report [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Deleted allocations for instance 7821b939-505a-45e3-a74b-dce7c6fdc856
Jan 21 23:55:33 compute-1 nova_compute[182713]: 2026-01-21 23:55:33.319 182717 DEBUG oslo_concurrency.lockutils [None req-4d3da152-49ec-4806-a396-8a1bbf34f401 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "7821b939-505a-45e3-a74b-dce7c6fdc856" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.027 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.028 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.056 182717 DEBUG nova.compute.manager [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.110 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Acquiring lock "cdb8738b-801f-4bd8-a93c-3553748dedd7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.110 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.134 182717 DEBUG nova.compute.manager [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.167 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.168 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.175 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.175 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.184 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.185 182717 INFO nova.compute.claims [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.218 182717 DEBUG nova.compute.manager [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.323 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.374 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.414 182717 DEBUG nova.compute.provider_tree [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.433 182717 DEBUG nova.scheduler.client.report [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.464 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.465 182717 DEBUG nova.compute.manager [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.470 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.478 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.479 182717 INFO nova.compute.claims [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.584 182717 DEBUG nova.compute.manager [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.585 182717 DEBUG nova.network.neutron [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.617 182717 INFO nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.644 182717 DEBUG nova.compute.manager [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.736 182717 DEBUG nova.compute.provider_tree [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.763 182717 DEBUG nova.scheduler.client.report [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.819 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.819 182717 DEBUG nova.compute.manager [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.823 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.449s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.824 182717 DEBUG nova.compute.manager [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.825 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.825 182717 INFO nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Creating image(s)
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.826 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.826 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.827 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.843 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.871 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.872 182717 INFO nova.compute.claims [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.904 182717 DEBUG nova.compute.manager [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.904 182717 DEBUG nova.network.neutron [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.910 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.911 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.912 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.935 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.963 182717 INFO nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:55:35 compute-1 nova_compute[182713]: 2026-01-21 23:55:35.972 182717 DEBUG nova.policy [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.001 182717 DEBUG nova.compute.manager [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.008 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.009 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.047 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.048 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.049 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.083 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.126 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.127 182717 DEBUG nova.virt.disk.api [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Checking if we can resize image /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.127 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.159 182717 DEBUG nova.compute.manager [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.162 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.162 182717 INFO nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Creating image(s)
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.163 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Acquiring lock "/var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.164 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "/var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.164 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "/var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.180 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.212 182717 DEBUG nova.compute.provider_tree [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.215 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.216 182717 DEBUG nova.virt.disk.api [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Cannot resize image /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.217 182717 DEBUG nova.objects.instance [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'migration_context' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.238 182717 DEBUG nova.scheduler.client.report [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.242 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.242 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Ensure instance console log exists: /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.242 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.243 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.243 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.246 182717 DEBUG nova.policy [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2acb5062ef0a46b3b37336c6e856e999', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c99aa787fccd4f1d9553df1471383775', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.270 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.271 182717 DEBUG nova.compute.manager [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.276 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.277 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.278 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.295 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.343 182717 DEBUG nova.compute.manager [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.344 182717 DEBUG nova.network.neutron [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.350 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.350 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.372 182717 INFO nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.387 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.388 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.388 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.412 182717 DEBUG nova.compute.manager [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.452 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.454 182717 DEBUG nova.virt.disk.api [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Checking if we can resize image /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.455 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.545 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.546 182717 DEBUG nova.virt.disk.api [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Cannot resize image /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.547 182717 DEBUG nova.objects.instance [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lazy-loading 'migration_context' on Instance uuid cdb8738b-801f-4bd8-a93c-3553748dedd7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.584 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.585 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Ensure instance console log exists: /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.586 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.587 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.587 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.609 182717 DEBUG nova.compute.manager [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.612 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.613 182717 INFO nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Creating image(s)
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.615 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "/var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.616 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.617 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.645 182717 DEBUG nova.policy [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.650 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.713 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.714 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.715 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.733 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.796 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.797 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.845 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.846 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.846 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.936 182717 DEBUG nova.network.neutron [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Successfully created port: b767fb64-f4a0-49cc-85c0-21b059344b3d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.943 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.944 182717 DEBUG nova.virt.disk.api [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Checking if we can resize image /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:55:36 compute-1 nova_compute[182713]: 2026-01-21 23:55:36.945 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:37 compute-1 nova_compute[182713]: 2026-01-21 23:55:37.034 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:37 compute-1 nova_compute[182713]: 2026-01-21 23:55:37.035 182717 DEBUG nova.virt.disk.api [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Cannot resize image /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:55:37 compute-1 nova_compute[182713]: 2026-01-21 23:55:37.036 182717 DEBUG nova.objects.instance [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'migration_context' on Instance uuid 2a5e9a44-f095-4122-8db9-4918b6ba22b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:37 compute-1 nova_compute[182713]: 2026-01-21 23:55:37.071 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:55:37 compute-1 nova_compute[182713]: 2026-01-21 23:55:37.072 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Ensure instance console log exists: /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:55:37 compute-1 nova_compute[182713]: 2026-01-21 23:55:37.073 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:37 compute-1 nova_compute[182713]: 2026-01-21 23:55:37.073 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:37 compute-1 nova_compute[182713]: 2026-01-21 23:55:37.073 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:37 compute-1 nova_compute[182713]: 2026-01-21 23:55:37.124 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:37 compute-1 nova_compute[182713]: 2026-01-21 23:55:37.550 182717 DEBUG nova.network.neutron [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Successfully created port: 7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:55:37 compute-1 nova_compute[182713]: 2026-01-21 23:55:37.558 182717 DEBUG nova.network.neutron [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Successfully created port: ce72a712-5acd-45cf-9d5d-66fb0d28ce4b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.513 182717 DEBUG nova.network.neutron [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Successfully updated port: b767fb64-f4a0-49cc-85c0-21b059344b3d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.532 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "refresh_cache-22742d9b-a6a8-4f10-a17f-a9704a1f8f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.533 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquired lock "refresh_cache-22742d9b-a6a8-4f10-a17f-a9704a1f8f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.533 182717 DEBUG nova.network.neutron [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:55:38 compute-1 podman[218773]: 2026-01-21 23:55:38.597887958 +0000 UTC m=+0.076130885 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 21 23:55:38 compute-1 podman[218774]: 2026-01-21 23:55:38.617834002 +0000 UTC m=+0.087847565 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.786 182717 DEBUG nova.network.neutron [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.830 182717 DEBUG nova.network.neutron [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Successfully updated port: ce72a712-5acd-45cf-9d5d-66fb0d28ce4b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.850 182717 DEBUG nova.network.neutron [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Successfully updated port: 7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.853 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "refresh_cache-2a5e9a44-f095-4122-8db9-4918b6ba22b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.854 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquired lock "refresh_cache-2a5e9a44-f095-4122-8db9-4918b6ba22b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.854 182717 DEBUG nova.network.neutron [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.877 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Acquiring lock "refresh_cache-cdb8738b-801f-4bd8-a93c-3553748dedd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.877 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Acquired lock "refresh_cache-cdb8738b-801f-4bd8-a93c-3553748dedd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.877 182717 DEBUG nova.network.neutron [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.893 182717 DEBUG nova.compute.manager [req-306717a3-c08a-4167-94c3-89c982faa494 req-ed709abd-26eb-4ae1-9317-21ce16440171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received event network-changed-b767fb64-f4a0-49cc-85c0-21b059344b3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.893 182717 DEBUG nova.compute.manager [req-306717a3-c08a-4167-94c3-89c982faa494 req-ed709abd-26eb-4ae1-9317-21ce16440171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Refreshing instance network info cache due to event network-changed-b767fb64-f4a0-49cc-85c0-21b059344b3d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.894 182717 DEBUG oslo_concurrency.lockutils [req-306717a3-c08a-4167-94c3-89c982faa494 req-ed709abd-26eb-4ae1-9317-21ce16440171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-22742d9b-a6a8-4f10-a17f-a9704a1f8f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.963 182717 DEBUG nova.compute.manager [req-4f5ed248-a508-48f6-b9b7-a9b10e08f927 req-f3797714-92e3-424a-ae3d-45ad192b9661 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received event network-changed-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.964 182717 DEBUG nova.compute.manager [req-4f5ed248-a508-48f6-b9b7-a9b10e08f927 req-f3797714-92e3-424a-ae3d-45ad192b9661 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Refreshing instance network info cache due to event network-changed-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:55:38 compute-1 nova_compute[182713]: 2026-01-21 23:55:38.964 182717 DEBUG oslo_concurrency.lockutils [req-4f5ed248-a508-48f6-b9b7-a9b10e08f927 req-f3797714-92e3-424a-ae3d-45ad192b9661 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2a5e9a44-f095-4122-8db9-4918b6ba22b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:39 compute-1 nova_compute[182713]: 2026-01-21 23:55:39.088 182717 DEBUG nova.network.neutron [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:55:39 compute-1 nova_compute[182713]: 2026-01-21 23:55:39.152 182717 DEBUG nova.network.neutron [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:55:39 compute-1 nova_compute[182713]: 2026-01-21 23:55:39.389 182717 DEBUG nova.compute.manager [req-3259802a-6f49-4910-9fa7-bd0e21c0c8df req-9742506c-d0ca-4ce8-a261-67a5e9285d54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Received event network-changed-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:39 compute-1 nova_compute[182713]: 2026-01-21 23:55:39.390 182717 DEBUG nova.compute.manager [req-3259802a-6f49-4910-9fa7-bd0e21c0c8df req-9742506c-d0ca-4ce8-a261-67a5e9285d54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Refreshing instance network info cache due to event network-changed-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:55:39 compute-1 nova_compute[182713]: 2026-01-21 23:55:39.390 182717 DEBUG oslo_concurrency.lockutils [req-3259802a-6f49-4910-9fa7-bd0e21c0c8df req-9742506c-d0ca-4ce8-a261-67a5e9285d54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-cdb8738b-801f-4bd8-a93c-3553748dedd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:40 compute-1 nova_compute[182713]: 2026-01-21 23:55:40.086 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039725.0826309, 7821b939-505a-45e3-a74b-dce7c6fdc856 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:40 compute-1 nova_compute[182713]: 2026-01-21 23:55:40.087 182717 INFO nova.compute.manager [-] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] VM Stopped (Lifecycle Event)
Jan 21 23:55:40 compute-1 nova_compute[182713]: 2026-01-21 23:55:40.131 182717 DEBUG nova.compute.manager [None req-c6902d76-ad90-4501-bb8b-f8a16f7ee4e3 - - - - - -] [instance: 7821b939-505a-45e3-a74b-dce7c6fdc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:41 compute-1 nova_compute[182713]: 2026-01-21 23:55:41.078 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:42 compute-1 nova_compute[182713]: 2026-01-21 23:55:42.127 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.367 182717 DEBUG nova.network.neutron [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Updating instance_info_cache with network_info: [{"id": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "address": "fa:16:3e:02:3d:57", "network": {"id": "261c8c44-5c0a-4f69-8e63-c90dfc4facd7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1124446518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c99aa787fccd4f1d9553df1471383775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ff4f083-3e", "ovs_interfaceid": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.394 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Releasing lock "refresh_cache-cdb8738b-801f-4bd8-a93c-3553748dedd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.395 182717 DEBUG nova.compute.manager [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Instance network_info: |[{"id": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "address": "fa:16:3e:02:3d:57", "network": {"id": "261c8c44-5c0a-4f69-8e63-c90dfc4facd7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1124446518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c99aa787fccd4f1d9553df1471383775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ff4f083-3e", "ovs_interfaceid": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.395 182717 DEBUG oslo_concurrency.lockutils [req-3259802a-6f49-4910-9fa7-bd0e21c0c8df req-9742506c-d0ca-4ce8-a261-67a5e9285d54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-cdb8738b-801f-4bd8-a93c-3553748dedd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.396 182717 DEBUG nova.network.neutron [req-3259802a-6f49-4910-9fa7-bd0e21c0c8df req-9742506c-d0ca-4ce8-a261-67a5e9285d54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Refreshing network info cache for port 7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.400 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Start _get_guest_xml network_info=[{"id": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "address": "fa:16:3e:02:3d:57", "network": {"id": "261c8c44-5c0a-4f69-8e63-c90dfc4facd7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1124446518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c99aa787fccd4f1d9553df1471383775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ff4f083-3e", "ovs_interfaceid": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.405 182717 WARNING nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.412 182717 DEBUG nova.virt.libvirt.host [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.413 182717 DEBUG nova.virt.libvirt.host [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.425 182717 DEBUG nova.virt.libvirt.host [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.426 182717 DEBUG nova.virt.libvirt.host [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.429 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.429 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.430 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.431 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.431 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.432 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.432 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.433 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.434 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.434 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.435 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.435 182717 DEBUG nova.virt.hardware [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.443 182717 DEBUG nova.virt.libvirt.vif [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1039580542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1039580542',id=59,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c99aa787fccd4f1d9553df1471383775',ramdisk_id='',reservation_id='r-5rgack7f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-1186443402',owner_user_name='tempest-InstanceActionsV221TestJSON-1186443402-project-me
mber'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:36Z,user_data=None,user_id='2acb5062ef0a46b3b37336c6e856e999',uuid=cdb8738b-801f-4bd8-a93c-3553748dedd7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "address": "fa:16:3e:02:3d:57", "network": {"id": "261c8c44-5c0a-4f69-8e63-c90dfc4facd7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1124446518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c99aa787fccd4f1d9553df1471383775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ff4f083-3e", "ovs_interfaceid": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.443 182717 DEBUG nova.network.os_vif_util [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Converting VIF {"id": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "address": "fa:16:3e:02:3d:57", "network": {"id": "261c8c44-5c0a-4f69-8e63-c90dfc4facd7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1124446518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c99aa787fccd4f1d9553df1471383775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ff4f083-3e", "ovs_interfaceid": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.445 182717 DEBUG nova.network.os_vif_util [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:3d:57,bridge_name='br-int',has_traffic_filtering=True,id=7ff4f083-3e07-4de8-8f0c-bd99fc57ead6,network=Network(261c8c44-5c0a-4f69-8e63-c90dfc4facd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ff4f083-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.447 182717 DEBUG nova.objects.instance [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lazy-loading 'pci_devices' on Instance uuid cdb8738b-801f-4bd8-a93c-3553748dedd7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.474 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <uuid>cdb8738b-801f-4bd8-a93c-3553748dedd7</uuid>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <name>instance-0000003b</name>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:name>tempest-InstanceActionsV221TestJSON-server-1039580542</nova:name>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:55:43</nova:creationTime>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:user uuid="2acb5062ef0a46b3b37336c6e856e999">tempest-InstanceActionsV221TestJSON-1186443402-project-member</nova:user>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:project uuid="c99aa787fccd4f1d9553df1471383775">tempest-InstanceActionsV221TestJSON-1186443402</nova:project>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:port uuid="7ff4f083-3e07-4de8-8f0c-bd99fc57ead6">
Jan 21 23:55:43 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <system>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="serial">cdb8738b-801f-4bd8-a93c-3553748dedd7</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="uuid">cdb8738b-801f-4bd8-a93c-3553748dedd7</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </system>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <os>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </os>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <features>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </features>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk.config"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:02:3d:57"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <target dev="tap7ff4f083-3e"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/console.log" append="off"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <video>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </video>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:55:43 compute-1 nova_compute[182713]: </domain>
Jan 21 23:55:43 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.476 182717 DEBUG nova.compute.manager [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Preparing to wait for external event network-vif-plugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.476 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Acquiring lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.477 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.477 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.478 182717 DEBUG nova.virt.libvirt.vif [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1039580542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1039580542',id=59,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c99aa787fccd4f1d9553df1471383775',ramdisk_id='',reservation_id='r-5rgack7f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-1186443402',owner_user_name='tempest-InstanceActionsV221TestJSON-1186443402-
project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:36Z,user_data=None,user_id='2acb5062ef0a46b3b37336c6e856e999',uuid=cdb8738b-801f-4bd8-a93c-3553748dedd7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "address": "fa:16:3e:02:3d:57", "network": {"id": "261c8c44-5c0a-4f69-8e63-c90dfc4facd7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1124446518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c99aa787fccd4f1d9553df1471383775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ff4f083-3e", "ovs_interfaceid": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.479 182717 DEBUG nova.network.os_vif_util [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Converting VIF {"id": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "address": "fa:16:3e:02:3d:57", "network": {"id": "261c8c44-5c0a-4f69-8e63-c90dfc4facd7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1124446518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c99aa787fccd4f1d9553df1471383775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ff4f083-3e", "ovs_interfaceid": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.480 182717 DEBUG nova.network.os_vif_util [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:3d:57,bridge_name='br-int',has_traffic_filtering=True,id=7ff4f083-3e07-4de8-8f0c-bd99fc57ead6,network=Network(261c8c44-5c0a-4f69-8e63-c90dfc4facd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ff4f083-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.480 182717 DEBUG os_vif [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:3d:57,bridge_name='br-int',has_traffic_filtering=True,id=7ff4f083-3e07-4de8-8f0c-bd99fc57ead6,network=Network(261c8c44-5c0a-4f69-8e63-c90dfc4facd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ff4f083-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.482 182717 DEBUG nova.network.neutron [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Updating instance_info_cache with network_info: [{"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.484 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.484 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.485 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.492 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.492 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ff4f083-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.493 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7ff4f083-3e, col_values=(('external_ids', {'iface-id': '7ff4f083-3e07-4de8-8f0c-bd99fc57ead6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:3d:57', 'vm-uuid': 'cdb8738b-801f-4bd8-a93c-3553748dedd7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.495 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 NetworkManager[54952]: <info>  [1769039743.4969] manager: (tap7ff4f083-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.498 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.503 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.505 182717 INFO os_vif [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:3d:57,bridge_name='br-int',has_traffic_filtering=True,id=7ff4f083-3e07-4de8-8f0c-bd99fc57ead6,network=Network(261c8c44-5c0a-4f69-8e63-c90dfc4facd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ff4f083-3e')
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.508 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Releasing lock "refresh_cache-22742d9b-a6a8-4f10-a17f-a9704a1f8f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.509 182717 DEBUG nova.compute.manager [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Instance network_info: |[{"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.510 182717 DEBUG oslo_concurrency.lockutils [req-306717a3-c08a-4167-94c3-89c982faa494 req-ed709abd-26eb-4ae1-9317-21ce16440171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-22742d9b-a6a8-4f10-a17f-a9704a1f8f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.510 182717 DEBUG nova.network.neutron [req-306717a3-c08a-4167-94c3-89c982faa494 req-ed709abd-26eb-4ae1-9317-21ce16440171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Refreshing network info cache for port b767fb64-f4a0-49cc-85c0-21b059344b3d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.518 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Start _get_guest_xml network_info=[{"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.527 182717 WARNING nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.532 182717 DEBUG nova.virt.libvirt.host [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.532 182717 DEBUG nova.virt.libvirt.host [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.540 182717 DEBUG nova.virt.libvirt.host [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.541 182717 DEBUG nova.virt.libvirt.host [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.543 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.543 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.544 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.544 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.545 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.545 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.545 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.546 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.546 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.546 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.547 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.547 182717 DEBUG nova.virt.hardware [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.553 182717 DEBUG nova.virt.libvirt.vif [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1374146863',display_name='tempest-ListServerFiltersTestJSON-instance-1374146863',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1374146863',id=58,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-3b2x1al6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_name='tempest-ListServerFiltersTestJSON-1547380946-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:35Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=22742d9b-a6a8-4f10-a17f-a9704a1f8f43,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.553 182717 DEBUG nova.network.os_vif_util [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.554 182717 DEBUG nova.network.os_vif_util [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.556 182717 DEBUG nova.objects.instance [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'pci_devices' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.601 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <uuid>22742d9b-a6a8-4f10-a17f-a9704a1f8f43</uuid>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <name>instance-0000003a</name>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-1374146863</nova:name>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:55:43</nova:creationTime>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:user uuid="7e79b904cb8a49f990b05eb0ed72fdf4">tempest-ListServerFiltersTestJSON-1547380946-project-member</nova:user>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:project uuid="70b1c9f8be0042aa8de9841a26729700">tempest-ListServerFiltersTestJSON-1547380946</nova:project>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:port uuid="b767fb64-f4a0-49cc-85c0-21b059344b3d">
Jan 21 23:55:43 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <system>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="serial">22742d9b-a6a8-4f10-a17f-a9704a1f8f43</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="uuid">22742d9b-a6a8-4f10-a17f-a9704a1f8f43</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </system>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <os>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </os>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <features>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </features>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.config"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:4e:99:ca"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <target dev="tapb767fb64-f4"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/console.log" append="off"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <video>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </video>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:55:43 compute-1 nova_compute[182713]: </domain>
Jan 21 23:55:43 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.602 182717 DEBUG nova.compute.manager [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Preparing to wait for external event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.602 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.603 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.603 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.604 182717 DEBUG nova.virt.libvirt.vif [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1374146863',display_name='tempest-ListServerFiltersTestJSON-instance-1374146863',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1374146863',id=58,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-3b2x1al6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_name='tempest-ListServerFiltersTestJSON-1547380946-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:35Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=22742d9b-a6a8-4f10-a17f-a9704a1f8f43,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.605 182717 DEBUG nova.network.os_vif_util [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.606 182717 DEBUG nova.network.os_vif_util [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.606 182717 DEBUG os_vif [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.607 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.608 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.608 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.612 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.612 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb767fb64-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.613 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb767fb64-f4, col_values=(('external_ids', {'iface-id': 'b767fb64-f4a0-49cc-85c0-21b059344b3d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:99:ca', 'vm-uuid': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.615 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 NetworkManager[54952]: <info>  [1769039743.6173] manager: (tapb767fb64-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.618 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.629 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.630 182717 INFO os_vif [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4')
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.634 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.635 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.635 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] No VIF found with MAC fa:16:3e:02:3d:57, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.636 182717 INFO nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Using config drive
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.682 182717 DEBUG nova.network.neutron [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Updating instance_info_cache with network_info: [{"id": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "address": "fa:16:3e:4f:57:00", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce72a712-5a", "ovs_interfaceid": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.689 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.689 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.689 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] No VIF found with MAC fa:16:3e:4e:99:ca, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.690 182717 INFO nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Using config drive
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.716 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Releasing lock "refresh_cache-2a5e9a44-f095-4122-8db9-4918b6ba22b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.717 182717 DEBUG nova.compute.manager [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Instance network_info: |[{"id": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "address": "fa:16:3e:4f:57:00", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce72a712-5a", "ovs_interfaceid": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.718 182717 DEBUG oslo_concurrency.lockutils [req-4f5ed248-a508-48f6-b9b7-a9b10e08f927 req-f3797714-92e3-424a-ae3d-45ad192b9661 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2a5e9a44-f095-4122-8db9-4918b6ba22b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.718 182717 DEBUG nova.network.neutron [req-4f5ed248-a508-48f6-b9b7-a9b10e08f927 req-f3797714-92e3-424a-ae3d-45ad192b9661 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Refreshing network info cache for port ce72a712-5acd-45cf-9d5d-66fb0d28ce4b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.723 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Start _get_guest_xml network_info=[{"id": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "address": "fa:16:3e:4f:57:00", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce72a712-5a", "ovs_interfaceid": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.731 182717 WARNING nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.736 182717 DEBUG nova.virt.libvirt.host [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.737 182717 DEBUG nova.virt.libvirt.host [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.747 182717 DEBUG nova.virt.libvirt.host [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.748 182717 DEBUG nova.virt.libvirt.host [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.749 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.749 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.750 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.750 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.750 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.750 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.751 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.751 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.751 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.751 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.752 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.752 182717 DEBUG nova.virt.hardware [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.756 182717 DEBUG nova.virt.libvirt.vif [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-812838625',display_name='tempest-ImagesTestJSON-server-812838625',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-812838625',id=60,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-p2ism8a8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=TagList
,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:36Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=2a5e9a44-f095-4122-8db9-4918b6ba22b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "address": "fa:16:3e:4f:57:00", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce72a712-5a", "ovs_interfaceid": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.756 182717 DEBUG nova.network.os_vif_util [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "address": "fa:16:3e:4f:57:00", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce72a712-5a", "ovs_interfaceid": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.757 182717 DEBUG nova.network.os_vif_util [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:57:00,bridge_name='br-int',has_traffic_filtering=True,id=ce72a712-5acd-45cf-9d5d-66fb0d28ce4b,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce72a712-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.758 182717 DEBUG nova.objects.instance [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'pci_devices' on Instance uuid 2a5e9a44-f095-4122-8db9-4918b6ba22b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.780 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <uuid>2a5e9a44-f095-4122-8db9-4918b6ba22b7</uuid>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <name>instance-0000003c</name>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:name>tempest-ImagesTestJSON-server-812838625</nova:name>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:55:43</nova:creationTime>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:user uuid="6eb1bcf645844eaca088761a04e59542">tempest-ImagesTestJSON-126431515-project-member</nova:user>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:project uuid="63e5713bcd4c429796b251487b6136bc">tempest-ImagesTestJSON-126431515</nova:project>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         <nova:port uuid="ce72a712-5acd-45cf-9d5d-66fb0d28ce4b">
Jan 21 23:55:43 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <system>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="serial">2a5e9a44-f095-4122-8db9-4918b6ba22b7</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="uuid">2a5e9a44-f095-4122-8db9-4918b6ba22b7</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </system>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <os>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </os>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <features>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </features>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk.config"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:4f:57:00"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <target dev="tapce72a712-5a"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/console.log" append="off"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <video>
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </video>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:55:43 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:55:43 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:55:43 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:55:43 compute-1 nova_compute[182713]: </domain>
Jan 21 23:55:43 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.781 182717 DEBUG nova.compute.manager [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Preparing to wait for external event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.781 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.782 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.782 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.783 182717 DEBUG nova.virt.libvirt.vif [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-812838625',display_name='tempest-ImagesTestJSON-server-812838625',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-812838625',id=60,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-p2ism8a8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:36Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=2a5e9a44-f095-4122-8db9-4918b6ba22b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "address": "fa:16:3e:4f:57:00", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce72a712-5a", "ovs_interfaceid": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.783 182717 DEBUG nova.network.os_vif_util [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "address": "fa:16:3e:4f:57:00", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce72a712-5a", "ovs_interfaceid": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.784 182717 DEBUG nova.network.os_vif_util [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:57:00,bridge_name='br-int',has_traffic_filtering=True,id=ce72a712-5acd-45cf-9d5d-66fb0d28ce4b,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce72a712-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.784 182717 DEBUG os_vif [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:57:00,bridge_name='br-int',has_traffic_filtering=True,id=ce72a712-5acd-45cf-9d5d-66fb0d28ce4b,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce72a712-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.785 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.785 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.785 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.789 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.789 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce72a712-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.789 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapce72a712-5a, col_values=(('external_ids', {'iface-id': 'ce72a712-5acd-45cf-9d5d-66fb0d28ce4b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:57:00', 'vm-uuid': '2a5e9a44-f095-4122-8db9-4918b6ba22b7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.791 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 NetworkManager[54952]: <info>  [1769039743.7925] manager: (tapce72a712-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.799 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.804 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.806 182717 INFO os_vif [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:57:00,bridge_name='br-int',has_traffic_filtering=True,id=ce72a712-5acd-45cf-9d5d-66fb0d28ce4b,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce72a712-5a')
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.872 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.873 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.873 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No VIF found with MAC fa:16:3e:4f:57:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:55:43 compute-1 nova_compute[182713]: 2026-01-21 23:55:43.873 182717 INFO nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Using config drive
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.143 182717 INFO nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Creating config drive at /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.config
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.149 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbh6f_0it execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.286 182717 DEBUG oslo_concurrency.processutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbh6f_0it" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:44 compute-1 NetworkManager[54952]: <info>  [1769039744.3604] manager: (tapb767fb64-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Jan 21 23:55:44 compute-1 kernel: tapb767fb64-f4: entered promiscuous mode
Jan 21 23:55:44 compute-1 ovn_controller[94841]: 2026-01-21T23:55:44Z|00152|binding|INFO|Claiming lport b767fb64-f4a0-49cc-85c0-21b059344b3d for this chassis.
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.372 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:44 compute-1 ovn_controller[94841]: 2026-01-21T23:55:44Z|00153|binding|INFO|b767fb64-f4a0-49cc-85c0-21b059344b3d: Claiming fa:16:3e:4e:99:ca 10.100.0.10
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.384 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:99:ca 10.100.0.10'], port_security=['fa:16:3e:4e:99:ca 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70b1c9f8be0042aa8de9841a26729700', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5943869c-ade1-4cd3-81a5-29e65236fb49', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88d3d39a-f56f-4f3b-95e9-79768ac7b596, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=b767fb64-f4a0-49cc-85c0-21b059344b3d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.385 104184 INFO neutron.agent.ovn.metadata.agent [-] Port b767fb64-f4a0-49cc-85c0-21b059344b3d in datapath a78bfb22-a192-4dbe-a117-9f8a59130e27 bound to our chassis
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.387 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a78bfb22-a192-4dbe-a117-9f8a59130e27
Jan 21 23:55:44 compute-1 systemd-udevd[218845]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.400 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[efca044f-3308-4d29-8c0f-344090634dfc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.401 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa78bfb22-a1 in ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.404 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa78bfb22-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.404 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[87013470-1180-487e-ab4a-c8833fe9eadd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.405 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1c5598-9d09-4d1a-a2b6-5c1f7ae94ac7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.416 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[26ec5d08-e7d0-4068-89e1-987f92d7c4c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 NetworkManager[54952]: <info>  [1769039744.4207] device (tapb767fb64-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:55:44 compute-1 NetworkManager[54952]: <info>  [1769039744.4216] device (tapb767fb64-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:55:44 compute-1 systemd-machined[153970]: New machine qemu-26-instance-0000003a.
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.439 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:44 compute-1 systemd[1]: Started Virtual Machine qemu-26-instance-0000003a.
Jan 21 23:55:44 compute-1 ovn_controller[94841]: 2026-01-21T23:55:44Z|00154|binding|INFO|Setting lport b767fb64-f4a0-49cc-85c0-21b059344b3d ovn-installed in OVS
Jan 21 23:55:44 compute-1 ovn_controller[94841]: 2026-01-21T23:55:44Z|00155|binding|INFO|Setting lport b767fb64-f4a0-49cc-85c0-21b059344b3d up in Southbound
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.442 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.448 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[60153593-c21d-4c9f-8ac0-b724a81ccf90]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.473 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c12ebfe4-3e22-41bb-9f6a-a3d3f186cbfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.480 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ee0ec1b5-68f9-4970-b9ad-34309850b5b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 NetworkManager[54952]: <info>  [1769039744.4814] manager: (tapa78bfb22-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Jan 21 23:55:44 compute-1 systemd-udevd[218849]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.515 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[4e215865-5e83-4378-baaa-0ac72662a836]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.517 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[fd027a23-2711-4c58-8e30-029ddfcc5ed1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 NetworkManager[54952]: <info>  [1769039744.5455] device (tapa78bfb22-a0): carrier: link connected
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.550 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[21fa22f1-b84d-4462-8cff-4a0c00309aa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.573 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[412df9ce-f815-4677-a26c-cc164b6b22f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa78bfb22-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:41:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431719, 'reachable_time': 18566, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218884, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.597 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4d8e6a-7870-44e0-8ee9-2bf022ca6a73]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:4194'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431719, 'tstamp': 431719}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218885, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.620 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d8aab610-a81c-4046-9050-11272688fe24]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa78bfb22-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:41:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431719, 'reachable_time': 18566, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218886, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.664 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a6957909-0cf3-4e1f-8a4c-31e2a5c7dcca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.743 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3eda58d0-be60-4fe8-945d-6c9a7126df2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.744 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa78bfb22-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.744 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.744 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa78bfb22-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.746 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:44 compute-1 NetworkManager[54952]: <info>  [1769039744.7478] manager: (tapa78bfb22-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.748 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039744.7482905, 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.749 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] VM Started (Lifecycle Event)
Jan 21 23:55:44 compute-1 kernel: tapa78bfb22-a0: entered promiscuous mode
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.753 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.755 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa78bfb22-a0, col_values=(('external_ids', {'iface-id': 'bb8c3f45-55b8-4c8e-8a31-26c5ecb4fb32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.757 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:44 compute-1 ovn_controller[94841]: 2026-01-21T23:55:44Z|00156|binding|INFO|Releasing lport bb8c3f45-55b8-4c8e-8a31-26c5ecb4fb32 from this chassis (sb_readonly=0)
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.767 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.771 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039744.748663, 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.771 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] VM Paused (Lifecycle Event)
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.777 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.785 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.786 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.787 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.787 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[524af724-4cc5-4a80-ab9e-853bfce3a00e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.789 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-a78bfb22-a192-4dbe-a117-9f8a59130e27
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID a78bfb22-a192-4dbe-a117-9f8a59130e27
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.789 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:55:44 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:44.790 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'env', 'PROCESS_TAG=haproxy-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a78bfb22-a192-4dbe-a117-9f8a59130e27.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:55:44 compute-1 nova_compute[182713]: 2026-01-21 23:55:44.823 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:55:45 compute-1 podman[218935]: 2026-01-21 23:55:45.223465857 +0000 UTC m=+0.072208115 container create 9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 23:55:45 compute-1 systemd[1]: Started libpod-conmon-9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637.scope.
Jan 21 23:55:45 compute-1 podman[218935]: 2026-01-21 23:55:45.191251994 +0000 UTC m=+0.039994302 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:55:45 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:55:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa7ade534e76ab0ebf2b7ad96afaf902c7b1bc431792cde7f318b7e298e25a54/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:55:45 compute-1 podman[218935]: 2026-01-21 23:55:45.317402609 +0000 UTC m=+0.166144897 container init 9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 23:55:45 compute-1 podman[218935]: 2026-01-21 23:55:45.330155102 +0000 UTC m=+0.178897370 container start 9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 23:55:45 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[218950]: [NOTICE]   (218954) : New worker (218956) forked
Jan 21 23:55:45 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[218950]: [NOTICE]   (218954) : Loading success.
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.380 182717 INFO nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Creating config drive at /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk.config
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.390 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm9trm_gy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.434 182717 INFO nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Creating config drive at /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk.config
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.441 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvklw9lds execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.526 182717 DEBUG oslo_concurrency.processutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm9trm_gy" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:45 compute-1 NetworkManager[54952]: <info>  [1769039745.6106] manager: (tapce72a712-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Jan 21 23:55:45 compute-1 systemd-udevd[218864]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:55:45 compute-1 kernel: tapce72a712-5a: entered promiscuous mode
Jan 21 23:55:45 compute-1 ovn_controller[94841]: 2026-01-21T23:55:45Z|00157|binding|INFO|Claiming lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b for this chassis.
Jan 21 23:55:45 compute-1 ovn_controller[94841]: 2026-01-21T23:55:45Z|00158|binding|INFO|ce72a712-5acd-45cf-9d5d-66fb0d28ce4b: Claiming fa:16:3e:4f:57:00 10.100.0.7
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.613 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.625 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:57:00 10.100.0.7'], port_security=['fa:16:3e:4f:57:00 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2a5e9a44-f095-4122-8db9-4918b6ba22b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ce72a712-5acd-45cf-9d5d-66fb0d28ce4b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.627 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ce72a712-5acd-45cf-9d5d-66fb0d28ce4b in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c bound to our chassis
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.628 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:55:45 compute-1 NetworkManager[54952]: <info>  [1769039745.6295] device (tapce72a712-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:55:45 compute-1 ovn_controller[94841]: 2026-01-21T23:55:45Z|00159|binding|INFO|Setting lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b ovn-installed in OVS
Jan 21 23:55:45 compute-1 ovn_controller[94841]: 2026-01-21T23:55:45Z|00160|binding|INFO|Setting lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b up in Southbound
Jan 21 23:55:45 compute-1 NetworkManager[54952]: <info>  [1769039745.6306] device (tapce72a712-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.631 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.634 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.639 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a720dfae-49ac-4cf5-b118-1999029b22ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.640 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap74e2da48-41 in ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.642 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap74e2da48-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.642 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e401f125-3e62-473b-8230-c976aa61402f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.643 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[82c46054-9013-41ba-8325-35e7dd76986c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 systemd-machined[153970]: New machine qemu-27-instance-0000003c.
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.652 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[c25343ab-a09c-47d8-b2f2-23a8f80a331f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.666 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[41397fdb-bb4e-4a7d-9188-d89f0c271cbb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 systemd[1]: Started Virtual Machine qemu-27-instance-0000003c.
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.696 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ebf78c-613c-477d-aabb-048fe25df4a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.704 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[18c970b8-25b3-4a5e-9302-f1e7b0ab2fa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 NetworkManager[54952]: <info>  [1769039745.7051] manager: (tap74e2da48-40): new Veth device (/org/freedesktop/NetworkManager/Devices/86)
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.740 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[3339a273-175c-4e65-ac38-c4ff26e0e7e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.744 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[587e88d6-0539-47b0-937e-125946afc89e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 NetworkManager[54952]: <info>  [1769039745.7643] device (tap74e2da48-40): carrier: link connected
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.772 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e6405c3e-5f0d-4d93-a421-7bebe6655725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.802 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3dcce729-a186-45e4-a5c0-132be74b2baa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431841, 'reachable_time': 42329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219009, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.817 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[28165d09-ed00-4d7d-a582-2a5ab1a1e414]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:7549'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431841, 'tstamp': 431841}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219010, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.837 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0c0b8577-f43d-4321-9dee-f9c2aff4a21f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431841, 'reachable_time': 42329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219011, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.873 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a35f78c1-9194-4f5b-8eeb-d0dd88beb000]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.956 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a696e3fd-b3a1-42ab-8173-495004c3acfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.958 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.958 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.959 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74e2da48-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.961 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:45 compute-1 NetworkManager[54952]: <info>  [1769039745.9619] manager: (tap74e2da48-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 21 23:55:45 compute-1 kernel: tap74e2da48-40: entered promiscuous mode
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.967 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:45 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:45.968 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74e2da48-40, col_values=(('external_ids', {'iface-id': '5f8f321e-2942-4700-a50e-4b0628052c1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.970 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:45 compute-1 ovn_controller[94841]: 2026-01-21T23:55:45Z|00161|binding|INFO|Releasing lport 5f8f321e-2942-4700-a50e-4b0628052c1b from this chassis (sb_readonly=0)
Jan 21 23:55:45 compute-1 nova_compute[182713]: 2026-01-21 23:55:45.998 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.006 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.008 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.010 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[71ef97a9-b4e6-454b-96aa-a6f2d76d4928]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.011 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.012 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'env', 'PROCESS_TAG=haproxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/74e2da48-44c2-4c6d-9597-6c47d6247f9c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.081 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.118 182717 DEBUG nova.compute.manager [req-578718d9-6a86-418c-9e46-5f165f3544d4 req-b2ef04b2-da94-4eb2-909f-b99493ddf3a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.118 182717 DEBUG oslo_concurrency.lockutils [req-578718d9-6a86-418c-9e46-5f165f3544d4 req-b2ef04b2-da94-4eb2-909f-b99493ddf3a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.119 182717 DEBUG oslo_concurrency.lockutils [req-578718d9-6a86-418c-9e46-5f165f3544d4 req-b2ef04b2-da94-4eb2-909f-b99493ddf3a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.119 182717 DEBUG oslo_concurrency.lockutils [req-578718d9-6a86-418c-9e46-5f165f3544d4 req-b2ef04b2-da94-4eb2-909f-b99493ddf3a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.119 182717 DEBUG nova.compute.manager [req-578718d9-6a86-418c-9e46-5f165f3544d4 req-b2ef04b2-da94-4eb2-909f-b99493ddf3a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Processing event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.121 182717 DEBUG nova.compute.manager [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.128 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039746.1266842, 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.129 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] VM Resumed (Lifecycle Event)
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.132 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.147 182717 INFO nova.virt.libvirt.driver [-] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Instance spawned successfully.
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.147 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.162 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.165 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.374 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.375 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.376 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.376 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.376 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.377 182717 DEBUG nova.virt.libvirt.driver [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.384 182717 DEBUG oslo_concurrency.processutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvklw9lds" returned: 0 in 0.943s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.418 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.419 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039746.3554623, 2a5e9a44-f095-4122-8db9-4918b6ba22b7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.419 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] VM Started (Lifecycle Event)
Jan 21 23:55:46 compute-1 NetworkManager[54952]: <info>  [1769039746.4607] manager: (tap7ff4f083-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Jan 21 23:55:46 compute-1 kernel: tap7ff4f083-3e: entered promiscuous mode
Jan 21 23:55:46 compute-1 ovn_controller[94841]: 2026-01-21T23:55:46Z|00162|binding|INFO|Claiming lport 7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 for this chassis.
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.463 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:46 compute-1 ovn_controller[94841]: 2026-01-21T23:55:46Z|00163|binding|INFO|7ff4f083-3e07-4de8-8f0c-bd99fc57ead6: Claiming fa:16:3e:02:3d:57 10.100.0.3
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.464 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.471 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039746.3632586, 2a5e9a44-f095-4122-8db9-4918b6ba22b7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.471 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] VM Paused (Lifecycle Event)
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.481 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:3d:57 10.100.0.3'], port_security=['fa:16:3e:02:3d:57 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'cdb8738b-801f-4bd8-a93c-3553748dedd7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-261c8c44-5c0a-4f69-8e63-c90dfc4facd7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c99aa787fccd4f1d9553df1471383775', 'neutron:revision_number': '2', 'neutron:security_group_ids': '19c64d08-9fe5-4e54-b72d-53061e1c93c3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92ae6344-2d0b-4688-9fb2-71325c1bc562, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=7ff4f083-3e07-4de8-8f0c-bd99fc57ead6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:55:46 compute-1 NetworkManager[54952]: <info>  [1769039746.4843] device (tap7ff4f083-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:55:46 compute-1 NetworkManager[54952]: <info>  [1769039746.4860] device (tap7ff4f083-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:55:46 compute-1 podman[219057]: 2026-01-21 23:55:46.491773085 +0000 UTC m=+0.067085997 container create a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.493 182717 INFO nova.compute.manager [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Took 10.67 seconds to spawn the instance on the hypervisor.
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.493 182717 DEBUG nova.compute.manager [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.502 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.505 182717 DEBUG nova.compute.manager [req-50ed94a5-73cd-4ead-8870-cada458b6ae6 req-1ed7dfcf-8843-4adb-8f66-cf1738f0f532 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.506 182717 DEBUG oslo_concurrency.lockutils [req-50ed94a5-73cd-4ead-8870-cada458b6ae6 req-1ed7dfcf-8843-4adb-8f66-cf1738f0f532 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.506 182717 DEBUG oslo_concurrency.lockutils [req-50ed94a5-73cd-4ead-8870-cada458b6ae6 req-1ed7dfcf-8843-4adb-8f66-cf1738f0f532 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.506 182717 DEBUG oslo_concurrency.lockutils [req-50ed94a5-73cd-4ead-8870-cada458b6ae6 req-1ed7dfcf-8843-4adb-8f66-cf1738f0f532 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.506 182717 DEBUG nova.compute.manager [req-50ed94a5-73cd-4ead-8870-cada458b6ae6 req-1ed7dfcf-8843-4adb-8f66-cf1738f0f532 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Processing event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.507 182717 DEBUG nova.compute.manager [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.513 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039746.51102, 2a5e9a44-f095-4122-8db9-4918b6ba22b7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.513 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] VM Resumed (Lifecycle Event)
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.515 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.538 182717 INFO nova.virt.libvirt.driver [-] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Instance spawned successfully.
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.539 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:55:46 compute-1 systemd[1]: Started libpod-conmon-a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7.scope.
Jan 21 23:55:46 compute-1 systemd-machined[153970]: New machine qemu-28-instance-0000003b.
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.545 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:46 compute-1 systemd[1]: Started Virtual Machine qemu-28-instance-0000003b.
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.548 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:46 compute-1 ovn_controller[94841]: 2026-01-21T23:55:46Z|00164|binding|INFO|Setting lport 7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 ovn-installed in OVS
Jan 21 23:55:46 compute-1 ovn_controller[94841]: 2026-01-21T23:55:46Z|00165|binding|INFO|Setting lport 7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 up in Southbound
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.550 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:46 compute-1 podman[219057]: 2026-01-21 23:55:46.457406456 +0000 UTC m=+0.032719398 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.555 182717 DEBUG nova.network.neutron [req-3259802a-6f49-4910-9fa7-bd0e21c0c8df req-9742506c-d0ca-4ce8-a261-67a5e9285d54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Updated VIF entry in instance network info cache for port 7ff4f083-3e07-4de8-8f0c-bd99fc57ead6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.555 182717 DEBUG nova.network.neutron [req-3259802a-6f49-4910-9fa7-bd0e21c0c8df req-9742506c-d0ca-4ce8-a261-67a5e9285d54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Updating instance_info_cache with network_info: [{"id": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "address": "fa:16:3e:02:3d:57", "network": {"id": "261c8c44-5c0a-4f69-8e63-c90dfc4facd7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1124446518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c99aa787fccd4f1d9553df1471383775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ff4f083-3e", "ovs_interfaceid": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.558 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:55:46 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:55:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d737babeb3960a5e4dea217b0e260a200cf9365146f5fa7e496fe1302ac01acf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.581 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.582 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.582 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.583 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.583 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.583 182717 DEBUG nova.virt.libvirt.driver [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.586 182717 DEBUG nova.network.neutron [req-4f5ed248-a508-48f6-b9b7-a9b10e08f927 req-f3797714-92e3-424a-ae3d-45ad192b9661 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Updated VIF entry in instance network info cache for port ce72a712-5acd-45cf-9d5d-66fb0d28ce4b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.587 182717 DEBUG nova.network.neutron [req-4f5ed248-a508-48f6-b9b7-a9b10e08f927 req-f3797714-92e3-424a-ae3d-45ad192b9661 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Updating instance_info_cache with network_info: [{"id": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "address": "fa:16:3e:4f:57:00", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce72a712-5a", "ovs_interfaceid": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:46 compute-1 podman[219057]: 2026-01-21 23:55:46.590515426 +0000 UTC m=+0.165828368 container init a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.593 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:55:46 compute-1 podman[219057]: 2026-01-21 23:55:46.596090707 +0000 UTC m=+0.171403629 container start a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.610 182717 DEBUG oslo_concurrency.lockutils [req-3259802a-6f49-4910-9fa7-bd0e21c0c8df req-9742506c-d0ca-4ce8-a261-67a5e9285d54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-cdb8738b-801f-4bd8-a93c-3553748dedd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.613 182717 INFO nova.compute.manager [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Took 11.48 seconds to build instance.
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.617 182717 DEBUG oslo_concurrency.lockutils [req-4f5ed248-a508-48f6-b9b7-a9b10e08f927 req-f3797714-92e3-424a-ae3d-45ad192b9661 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2a5e9a44-f095-4122-8db9-4918b6ba22b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:46 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219086]: [NOTICE]   (219093) : New worker (219097) forked
Jan 21 23:55:46 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219086]: [NOTICE]   (219093) : Loading success.
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.643 182717 DEBUG oslo_concurrency.lockutils [None req-88c78b83-62b4-4740-abaa-caed5e17bd63 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.653 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 in datapath 261c8c44-5c0a-4f69-8e63-c90dfc4facd7 unbound from our chassis
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.655 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 261c8c44-5c0a-4f69-8e63-c90dfc4facd7
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.667 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f202645f-8f66-48d8-9d86-508d4ef6ac7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.668 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap261c8c44-51 in ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.671 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap261c8c44-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.671 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[968ce5cf-c6a2-4591-b96c-9e3908348c95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.672 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9246900c-5c68-48e6-8afd-b08dbeeb063e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.678 182717 INFO nova.compute.manager [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Took 10.07 seconds to spawn the instance on the hypervisor.
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.679 182717 DEBUG nova.compute.manager [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.688 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[35e29b2c-42f7-4652-9be3-7b25e8f750fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.714 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[aba9b0d1-00af-4fc6-ab0b-e2f83e4fe023]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.744 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7d63dc18-1188-4318-a3b5-9ebf82f11c96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 NetworkManager[54952]: <info>  [1769039746.7547] manager: (tap261c8c44-50): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.754 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b6644cbf-f023-4676-8090-d2312ea4449b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.785 182717 INFO nova.compute.manager [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Took 11.47 seconds to build instance.
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.796 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[427442de-e219-4adc-80cb-c1fcdb57141a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.799 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae71f25-c37b-4a79-9eee-d2b22659456f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.805 182717 DEBUG oslo_concurrency.lockutils [None req-e8e55543-5f47-4415-ba22-55d1bfa56be9 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:46 compute-1 NetworkManager[54952]: <info>  [1769039746.8236] device (tap261c8c44-50): carrier: link connected
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.832 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6903cc-c974-42af-b417-f1c062360f49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.855 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd1eec4-6adc-4f3b-9e74-bbee099dc92c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap261c8c44-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:12:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431947, 'reachable_time': 23333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219117, 'error': None, 'target': 'ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.871 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[44d16308-655c-4bc1-86c3-ebb1014124db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe11:1217'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431947, 'tstamp': 431947}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219118, 'error': None, 'target': 'ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.883 182717 DEBUG nova.network.neutron [req-306717a3-c08a-4167-94c3-89c982faa494 req-ed709abd-26eb-4ae1-9317-21ce16440171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Updated VIF entry in instance network info cache for port b767fb64-f4a0-49cc-85c0-21b059344b3d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.883 182717 DEBUG nova.network.neutron [req-306717a3-c08a-4167-94c3-89c982faa494 req-ed709abd-26eb-4ae1-9317-21ce16440171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Updating instance_info_cache with network_info: [{"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.895 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2708c45b-2747-4490-9e61-b3ef614ea497]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap261c8c44-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:12:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431947, 'reachable_time': 23333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219119, 'error': None, 'target': 'ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.906 182717 DEBUG oslo_concurrency.lockutils [req-306717a3-c08a-4167-94c3-89c982faa494 req-ed709abd-26eb-4ae1-9317-21ce16440171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-22742d9b-a6a8-4f10-a17f-a9704a1f8f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.925 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1cde13e9-e7f7-475c-806b-9b4e0e900516]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.986 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ae9c9e96-fc2d-456f-9a25-7bca417221a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.988 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap261c8c44-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.989 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:55:46 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:46.990 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap261c8c44-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.993 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:46 compute-1 kernel: tap261c8c44-50: entered promiscuous mode
Jan 21 23:55:46 compute-1 NetworkManager[54952]: <info>  [1769039746.9939] manager: (tap261c8c44-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 21 23:55:46 compute-1 nova_compute[182713]: 2026-01-21 23:55:46.995 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:47.002 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap261c8c44-50, col_values=(('external_ids', {'iface-id': 'd51e7b80-ebe6-4f24-aaba-33926301c47b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.003 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:47 compute-1 ovn_controller[94841]: 2026-01-21T23:55:47Z|00166|binding|INFO|Releasing lport d51e7b80-ebe6-4f24-aaba-33926301c47b from this chassis (sb_readonly=0)
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:47.009 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/261c8c44-5c0a-4f69-8e63-c90dfc4facd7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/261c8c44-5c0a-4f69-8e63-c90dfc4facd7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:47.010 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[af173ede-9e1a-4f00-ae7c-a684beb5d659]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:47.013 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-261c8c44-5c0a-4f69-8e63-c90dfc4facd7
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/261c8c44-5c0a-4f69-8e63-c90dfc4facd7.pid.haproxy
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 261c8c44-5c0a-4f69-8e63-c90dfc4facd7
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.017 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:47 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:47.019 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7', 'env', 'PROCESS_TAG=haproxy-261c8c44-5c0a-4f69-8e63-c90dfc4facd7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/261c8c44-5c0a-4f69-8e63-c90dfc4facd7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.074 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039747.0734646, cdb8738b-801f-4bd8-a93c-3553748dedd7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.075 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] VM Started (Lifecycle Event)
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.105 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.109 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039747.0735657, cdb8738b-801f-4bd8-a93c-3553748dedd7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.109 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] VM Paused (Lifecycle Event)
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.129 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.132 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.159 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.303 182717 DEBUG nova.compute.manager [req-dcbad326-a391-4e87-954f-c4935b98be12 req-fb30b56a-da07-4a73-a5e1-3594578ef5be 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Received event network-vif-plugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.303 182717 DEBUG oslo_concurrency.lockutils [req-dcbad326-a391-4e87-954f-c4935b98be12 req-fb30b56a-da07-4a73-a5e1-3594578ef5be 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.303 182717 DEBUG oslo_concurrency.lockutils [req-dcbad326-a391-4e87-954f-c4935b98be12 req-fb30b56a-da07-4a73-a5e1-3594578ef5be 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.304 182717 DEBUG oslo_concurrency.lockutils [req-dcbad326-a391-4e87-954f-c4935b98be12 req-fb30b56a-da07-4a73-a5e1-3594578ef5be 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.304 182717 DEBUG nova.compute.manager [req-dcbad326-a391-4e87-954f-c4935b98be12 req-fb30b56a-da07-4a73-a5e1-3594578ef5be 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Processing event network-vif-plugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.305 182717 DEBUG nova.compute.manager [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.308 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039747.3085535, cdb8738b-801f-4bd8-a93c-3553748dedd7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.309 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] VM Resumed (Lifecycle Event)
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.311 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.315 182717 INFO nova.virt.libvirt.driver [-] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Instance spawned successfully.
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.315 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.334 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.341 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.345 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.346 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.347 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.347 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.348 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.349 182717 DEBUG nova.virt.libvirt.driver [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.387 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.456 182717 INFO nova.compute.manager [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Took 11.30 seconds to spawn the instance on the hypervisor.
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.456 182717 DEBUG nova.compute.manager [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:47 compute-1 podman[219159]: 2026-01-21 23:55:47.463126128 +0000 UTC m=+0.062540687 container create 1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:55:47 compute-1 systemd[1]: Started libpod-conmon-1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e.scope.
Jan 21 23:55:47 compute-1 podman[219159]: 2026-01-21 23:55:47.434423194 +0000 UTC m=+0.033837733 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:55:47 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:55:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f12a86a0ad749ed2348f735721656cc5b19c909f9198d8d17120c63dac7b122/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.558 182717 INFO nova.compute.manager [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Took 12.31 seconds to build instance.
Jan 21 23:55:47 compute-1 podman[219159]: 2026-01-21 23:55:47.570309929 +0000 UTC m=+0.169724548 container init 1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:55:47 compute-1 podman[219159]: 2026-01-21 23:55:47.581833383 +0000 UTC m=+0.181247942 container start 1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.586 182717 DEBUG oslo_concurrency.lockutils [None req-bac2316a-8912-4826-8cd6-e33f10e4683b 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:47 compute-1 neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7[219173]: [NOTICE]   (219177) : New worker (219179) forked
Jan 21 23:55:47 compute-1 neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7[219173]: [NOTICE]   (219177) : Loading success.
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.837 182717 DEBUG nova.compute.manager [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:47 compute-1 nova_compute[182713]: 2026-01-21 23:55:47.924 182717 INFO nova.compute.manager [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] instance snapshotting
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.241 182717 INFO nova.virt.libvirt.driver [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Beginning live snapshot process
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.246 182717 DEBUG nova.compute.manager [req-97620a22-16de-4832-acba-86553aa4b9f1 req-f1bdb826-57fd-4c8e-a359-f184e2f78f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.246 182717 DEBUG oslo_concurrency.lockutils [req-97620a22-16de-4832-acba-86553aa4b9f1 req-f1bdb826-57fd-4c8e-a359-f184e2f78f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.246 182717 DEBUG oslo_concurrency.lockutils [req-97620a22-16de-4832-acba-86553aa4b9f1 req-f1bdb826-57fd-4c8e-a359-f184e2f78f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.246 182717 DEBUG oslo_concurrency.lockutils [req-97620a22-16de-4832-acba-86553aa4b9f1 req-f1bdb826-57fd-4c8e-a359-f184e2f78f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.247 182717 DEBUG nova.compute.manager [req-97620a22-16de-4832-acba-86553aa4b9f1 req-f1bdb826-57fd-4c8e-a359-f184e2f78f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] No waiting events found dispatching network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.247 182717 WARNING nova.compute.manager [req-97620a22-16de-4832-acba-86553aa4b9f1 req-f1bdb826-57fd-4c8e-a359-f184e2f78f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received unexpected event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d for instance with vm_state active and task_state None.
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.438 182717 DEBUG oslo_concurrency.lockutils [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Acquiring lock "cdb8738b-801f-4bd8-a93c-3553748dedd7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.439 182717 DEBUG oslo_concurrency.lockutils [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.439 182717 DEBUG oslo_concurrency.lockutils [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Acquiring lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.440 182717 DEBUG oslo_concurrency.lockutils [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.440 182717 DEBUG oslo_concurrency.lockutils [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.455 182717 INFO nova.compute.manager [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Terminating instance
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.468 182717 DEBUG nova.compute.manager [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:55:48 compute-1 kernel: tap7ff4f083-3e (unregistering): left promiscuous mode
Jan 21 23:55:48 compute-1 NetworkManager[54952]: <info>  [1769039748.5036] device (tap7ff4f083-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:55:48 compute-1 virtqemud[182235]: invalid argument: disk vda does not have an active block job
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.514 182717 DEBUG oslo_concurrency.processutils [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:48 compute-1 ovn_controller[94841]: 2026-01-21T23:55:48Z|00167|binding|INFO|Releasing lport 7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 from this chassis (sb_readonly=0)
Jan 21 23:55:48 compute-1 ovn_controller[94841]: 2026-01-21T23:55:48Z|00168|binding|INFO|Setting lport 7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 down in Southbound
Jan 21 23:55:48 compute-1 ovn_controller[94841]: 2026-01-21T23:55:48Z|00169|binding|INFO|Removing iface tap7ff4f083-3e ovn-installed in OVS
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.523 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:3d:57 10.100.0.3'], port_security=['fa:16:3e:02:3d:57 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'cdb8738b-801f-4bd8-a93c-3553748dedd7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-261c8c44-5c0a-4f69-8e63-c90dfc4facd7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c99aa787fccd4f1d9553df1471383775', 'neutron:revision_number': '4', 'neutron:security_group_ids': '19c64d08-9fe5-4e54-b72d-53061e1c93c3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92ae6344-2d0b-4688-9fb2-71325c1bc562, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=7ff4f083-3e07-4de8-8f0c-bd99fc57ead6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.527 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 in datapath 261c8c44-5c0a-4f69-8e63-c90dfc4facd7 unbound from our chassis
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.532 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 261c8c44-5c0a-4f69-8e63-c90dfc4facd7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.535 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[32b7ea01-eb63-4bc4-affe-c0c946866df3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.536 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7 namespace which is not needed anymore
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.537 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.547 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:48 compute-1 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Jan 21 23:55:48 compute-1 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000003b.scope: Consumed 1.639s CPU time.
Jan 21 23:55:48 compute-1 systemd-machined[153970]: Machine qemu-28-instance-0000003b terminated.
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.576 182717 DEBUG oslo_concurrency.processutils [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.577 182717 DEBUG oslo_concurrency.processutils [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.604 182717 DEBUG nova.compute.manager [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.605 182717 DEBUG oslo_concurrency.lockutils [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.605 182717 DEBUG oslo_concurrency.lockutils [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.605 182717 DEBUG oslo_concurrency.lockutils [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.605 182717 DEBUG nova.compute.manager [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] No waiting events found dispatching network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.605 182717 WARNING nova.compute.manager [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received unexpected event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b for instance with vm_state active and task_state image_pending_upload.
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.638 182717 DEBUG oslo_concurrency.processutils [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7/disk --force-share --output=json -f qcow2" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.652 182717 DEBUG oslo_concurrency.processutils [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:48 compute-1 neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7[219173]: [NOTICE]   (219177) : haproxy version is 2.8.14-c23fe91
Jan 21 23:55:48 compute-1 neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7[219173]: [NOTICE]   (219177) : path to executable is /usr/sbin/haproxy
Jan 21 23:55:48 compute-1 neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7[219173]: [WARNING]  (219177) : Exiting Master process...
Jan 21 23:55:48 compute-1 neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7[219173]: [WARNING]  (219177) : Exiting Master process...
Jan 21 23:55:48 compute-1 neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7[219173]: [ALERT]    (219177) : Current worker (219179) exited with code 143 (Terminated)
Jan 21 23:55:48 compute-1 neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7[219173]: [WARNING]  (219177) : All workers exited. Exiting... (0)
Jan 21 23:55:48 compute-1 systemd[1]: libpod-1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e.scope: Deactivated successfully.
Jan 21 23:55:48 compute-1 conmon[219173]: conmon 1a55c17407edd0da71ca <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e.scope/container/memory.events
Jan 21 23:55:48 compute-1 podman[219214]: 2026-01-21 23:55:48.67829789 +0000 UTC m=+0.053364015 container died 1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 21 23:55:48 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e-userdata-shm.mount: Deactivated successfully.
Jan 21 23:55:48 compute-1 systemd[1]: var-lib-containers-storage-overlay-8f12a86a0ad749ed2348f735721656cc5b19c909f9198d8d17120c63dac7b122-merged.mount: Deactivated successfully.
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.730 182717 DEBUG oslo_concurrency.processutils [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.731 182717 DEBUG oslo_concurrency.processutils [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpaqm6l9dv/420465db03b34769b9dfcefa61a7f5d7.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:48 compute-1 podman[219214]: 2026-01-21 23:55:48.743833657 +0000 UTC m=+0.118899762 container cleanup 1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 23:55:48 compute-1 systemd[1]: libpod-conmon-1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e.scope: Deactivated successfully.
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.760 182717 INFO nova.virt.libvirt.driver [-] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Instance destroyed successfully.
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.761 182717 DEBUG nova.objects.instance [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lazy-loading 'resources' on Instance uuid cdb8738b-801f-4bd8-a93c-3553748dedd7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.774 182717 DEBUG oslo_concurrency.processutils [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpaqm6l9dv/420465db03b34769b9dfcefa61a7f5d7.delta 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.775 182717 INFO nova.virt.libvirt.driver [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.780 182717 DEBUG nova.virt.libvirt.vif [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1039580542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1039580542',id=59,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c99aa787fccd4f1d9553df1471383775',ramdisk_id='',reservation_id='r-5rgack7f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner
_project_name='tempest-InstanceActionsV221TestJSON-1186443402',owner_user_name='tempest-InstanceActionsV221TestJSON-1186443402-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:55:47Z,user_data=None,user_id='2acb5062ef0a46b3b37336c6e856e999',uuid=cdb8738b-801f-4bd8-a93c-3553748dedd7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "address": "fa:16:3e:02:3d:57", "network": {"id": "261c8c44-5c0a-4f69-8e63-c90dfc4facd7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1124446518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c99aa787fccd4f1d9553df1471383775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ff4f083-3e", "ovs_interfaceid": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.780 182717 DEBUG nova.network.os_vif_util [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Converting VIF {"id": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "address": "fa:16:3e:02:3d:57", "network": {"id": "261c8c44-5c0a-4f69-8e63-c90dfc4facd7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1124446518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c99aa787fccd4f1d9553df1471383775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ff4f083-3e", "ovs_interfaceid": "7ff4f083-3e07-4de8-8f0c-bd99fc57ead6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.781 182717 DEBUG nova.network.os_vif_util [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:3d:57,bridge_name='br-int',has_traffic_filtering=True,id=7ff4f083-3e07-4de8-8f0c-bd99fc57ead6,network=Network(261c8c44-5c0a-4f69-8e63-c90dfc4facd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ff4f083-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.782 182717 DEBUG os_vif [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:3d:57,bridge_name='br-int',has_traffic_filtering=True,id=7ff4f083-3e07-4de8-8f0c-bd99fc57ead6,network=Network(261c8c44-5c0a-4f69-8e63-c90dfc4facd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ff4f083-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.784 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.784 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ff4f083-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.785 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.788 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.790 182717 INFO os_vif [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:3d:57,bridge_name='br-int',has_traffic_filtering=True,id=7ff4f083-3e07-4de8-8f0c-bd99fc57ead6,network=Network(261c8c44-5c0a-4f69-8e63-c90dfc4facd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ff4f083-3e')
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.791 182717 INFO nova.virt.libvirt.driver [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Deleting instance files /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7_del
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.792 182717 INFO nova.virt.libvirt.driver [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Deletion of /var/lib/nova/instances/cdb8738b-801f-4bd8-a93c-3553748dedd7_del complete
Jan 21 23:55:48 compute-1 podman[219263]: 2026-01-21 23:55:48.832740006 +0000 UTC m=+0.058986508 container remove 1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.835 182717 DEBUG nova.virt.libvirt.guest [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.839 182717 INFO nova.virt.libvirt.driver [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.839 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[117eeaab-b945-4883-97cf-80c5e5610e6a]: (4, ('Wed Jan 21 11:55:48 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7 (1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e)\n1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e\nWed Jan 21 11:55:48 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7 (1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e)\n1a55c17407edd0da71ca662cd64fb90bd9b29111da88f5e0c4ce86c68c599d7e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.841 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[defdf65e-ea18-47b7-a97d-a9fc4ce38e1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.843 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap261c8c44-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:48 compute-1 kernel: tap261c8c44-50: left promiscuous mode
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.850 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.852 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb762f6-aa0f-4de7-b1bb-7247f1bdc627]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.862 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.867 182717 INFO nova.compute.manager [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.867 182717 DEBUG oslo.service.loopingcall [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.867 182717 DEBUG nova.compute.manager [-] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.868 182717 DEBUG nova.network.neutron [-] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.870 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d1ecac-d5f2-4074-adcb-cb1ee0167529]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.871 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[90069bac-4b7c-4500-b578-1da3fb22cabf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.891 182717 DEBUG nova.privsep.utils [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:55:48 compute-1 nova_compute[182713]: 2026-01-21 23:55:48.891 182717 DEBUG oslo_concurrency.processutils [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpaqm6l9dv/420465db03b34769b9dfcefa61a7f5d7.delta /var/lib/nova/instances/snapshots/tmpaqm6l9dv/420465db03b34769b9dfcefa61a7f5d7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.893 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1c3cbd7e-277b-4b01-b1ab-46b0a8902cb2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431939, 'reachable_time': 30940, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219288, 'error': None, 'target': 'ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.895 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-261c8c44-5c0a-4f69-8e63-c90dfc4facd7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:55:48 compute-1 systemd[1]: run-netns-ovnmeta\x2d261c8c44\x2d5c0a\x2d4f69\x2d8e63\x2dc90dfc4facd7.mount: Deactivated successfully.
Jan 21 23:55:48 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:55:48.896 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[8768d67a-457b-4958-8195-f046c7a6a169]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:49 compute-1 nova_compute[182713]: 2026-01-21 23:55:49.057 182717 DEBUG oslo_concurrency.processutils [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpaqm6l9dv/420465db03b34769b9dfcefa61a7f5d7.delta /var/lib/nova/instances/snapshots/tmpaqm6l9dv/420465db03b34769b9dfcefa61a7f5d7" returned: 0 in 0.165s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:49 compute-1 nova_compute[182713]: 2026-01-21 23:55:49.058 182717 INFO nova.virt.libvirt.driver [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Snapshot extracted, beginning image upload
Jan 21 23:55:49 compute-1 nova_compute[182713]: 2026-01-21 23:55:49.776 182717 DEBUG nova.compute.manager [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Received event network-vif-plugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:49 compute-1 nova_compute[182713]: 2026-01-21 23:55:49.777 182717 DEBUG oslo_concurrency.lockutils [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:49 compute-1 nova_compute[182713]: 2026-01-21 23:55:49.777 182717 DEBUG oslo_concurrency.lockutils [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:49 compute-1 nova_compute[182713]: 2026-01-21 23:55:49.778 182717 DEBUG oslo_concurrency.lockutils [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:49 compute-1 nova_compute[182713]: 2026-01-21 23:55:49.778 182717 DEBUG nova.compute.manager [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] No waiting events found dispatching network-vif-plugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:49 compute-1 nova_compute[182713]: 2026-01-21 23:55:49.778 182717 WARNING nova.compute.manager [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Received unexpected event network-vif-plugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 for instance with vm_state active and task_state deleting.
Jan 21 23:55:50 compute-1 nova_compute[182713]: 2026-01-21 23:55:50.445 182717 DEBUG nova.network.neutron [-] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:50 compute-1 nova_compute[182713]: 2026-01-21 23:55:50.487 182717 INFO nova.compute.manager [-] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Took 1.62 seconds to deallocate network for instance.
Jan 21 23:55:50 compute-1 nova_compute[182713]: 2026-01-21 23:55:50.601 182717 DEBUG oslo_concurrency.lockutils [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:50 compute-1 nova_compute[182713]: 2026-01-21 23:55:50.602 182717 DEBUG oslo_concurrency.lockutils [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:50 compute-1 podman[219299]: 2026-01-21 23:55:50.616362412 +0000 UTC m=+0.098086840 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Jan 21 23:55:50 compute-1 nova_compute[182713]: 2026-01-21 23:55:50.725 182717 DEBUG nova.compute.provider_tree [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:55:50 compute-1 nova_compute[182713]: 2026-01-21 23:55:50.734 182717 DEBUG nova.compute.manager [req-c2093a3e-c190-4f53-bd16-45e8e8f5421e req-e85d2fb5-e30b-422b-9998-c4a6e86091f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Received event network-vif-deleted-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:50 compute-1 nova_compute[182713]: 2026-01-21 23:55:50.750 182717 DEBUG nova.scheduler.client.report [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:55:50 compute-1 nova_compute[182713]: 2026-01-21 23:55:50.776 182717 DEBUG oslo_concurrency.lockutils [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:50 compute-1 nova_compute[182713]: 2026-01-21 23:55:50.819 182717 INFO nova.scheduler.client.report [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Deleted allocations for instance cdb8738b-801f-4bd8-a93c-3553748dedd7
Jan 21 23:55:50 compute-1 nova_compute[182713]: 2026-01-21 23:55:50.904 182717 DEBUG oslo_concurrency.lockutils [None req-281480fd-1e53-4688-9f90-3acfe26e7d1a 2acb5062ef0a46b3b37336c6e856e999 c99aa787fccd4f1d9553df1471383775 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.465s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.083 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.124 182717 INFO nova.virt.libvirt.driver [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Snapshot image upload complete
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.125 182717 INFO nova.compute.manager [None req-ef526cfe-4761-4d37-b23e-7555ca6b5ee5 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Took 3.18 seconds to snapshot the instance on the hypervisor.
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.864 182717 DEBUG nova.compute.manager [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Received event network-vif-unplugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.865 182717 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.866 182717 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.866 182717 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.867 182717 DEBUG nova.compute.manager [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] No waiting events found dispatching network-vif-unplugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.867 182717 WARNING nova.compute.manager [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Received unexpected event network-vif-unplugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 for instance with vm_state deleted and task_state None.
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.868 182717 DEBUG nova.compute.manager [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Received event network-vif-plugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.868 182717 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.869 182717 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.869 182717 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdb8738b-801f-4bd8-a93c-3553748dedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.870 182717 DEBUG nova.compute.manager [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] No waiting events found dispatching network-vif-plugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:51 compute-1 nova_compute[182713]: 2026-01-21 23:55:51.870 182717 WARNING nova.compute.manager [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Received unexpected event network-vif-plugged-7ff4f083-3e07-4de8-8f0c-bd99fc57ead6 for instance with vm_state deleted and task_state None.
Jan 21 23:55:53 compute-1 nova_compute[182713]: 2026-01-21 23:55:53.819 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:53 compute-1 podman[219319]: 2026-01-21 23:55:53.922416184 +0000 UTC m=+0.076687333 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 21 23:55:55 compute-1 ovn_controller[94841]: 2026-01-21T23:55:55Z|00170|binding|INFO|Releasing lport 5f8f321e-2942-4700-a50e-4b0628052c1b from this chassis (sb_readonly=0)
Jan 21 23:55:55 compute-1 ovn_controller[94841]: 2026-01-21T23:55:55Z|00171|binding|INFO|Releasing lport bb8c3f45-55b8-4c8e-8a31-26c5ecb4fb32 from this chassis (sb_readonly=0)
Jan 21 23:55:55 compute-1 nova_compute[182713]: 2026-01-21 23:55:55.745 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:56 compute-1 nova_compute[182713]: 2026-01-21 23:55:56.086 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:58 compute-1 ovn_controller[94841]: 2026-01-21T23:55:58Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4f:57:00 10.100.0.7
Jan 21 23:55:58 compute-1 ovn_controller[94841]: 2026-01-21T23:55:58Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4f:57:00 10.100.0.7
Jan 21 23:55:58 compute-1 nova_compute[182713]: 2026-01-21 23:55:58.821 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:59 compute-1 ovn_controller[94841]: 2026-01-21T23:55:59Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:99:ca 10.100.0.10
Jan 21 23:55:59 compute-1 ovn_controller[94841]: 2026-01-21T23:55:59Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:99:ca 10.100.0.10
Jan 21 23:56:01 compute-1 nova_compute[182713]: 2026-01-21 23:56:01.091 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:01 compute-1 anacron[30859]: Job `cron.weekly' started
Jan 21 23:56:01 compute-1 anacron[30859]: Job `cron.weekly' terminated
Jan 21 23:56:01 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:01.374 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:01 compute-1 nova_compute[182713]: 2026-01-21 23:56:01.375 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:01 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:01.377 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:56:01 compute-1 podman[219373]: 2026-01-21 23:56:01.643997412 +0000 UTC m=+0.115359602 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:56:01 compute-1 podman[219372]: 2026-01-21 23:56:01.668107185 +0000 UTC m=+0.141664934 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 21 23:56:02 compute-1 nova_compute[182713]: 2026-01-21 23:56:02.200 182717 DEBUG oslo_concurrency.lockutils [None req-e0e52053-bd0b-4d19-af25-9c1f8423b3b0 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:02 compute-1 nova_compute[182713]: 2026-01-21 23:56:02.201 182717 DEBUG oslo_concurrency.lockutils [None req-e0e52053-bd0b-4d19-af25-9c1f8423b3b0 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:02 compute-1 nova_compute[182713]: 2026-01-21 23:56:02.201 182717 DEBUG nova.compute.manager [None req-e0e52053-bd0b-4d19-af25-9c1f8423b3b0 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:02 compute-1 nova_compute[182713]: 2026-01-21 23:56:02.208 182717 DEBUG nova.compute.manager [None req-e0e52053-bd0b-4d19-af25-9c1f8423b3b0 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 21 23:56:02 compute-1 nova_compute[182713]: 2026-01-21 23:56:02.210 182717 DEBUG nova.objects.instance [None req-e0e52053-bd0b-4d19-af25-9c1f8423b3b0 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'flavor' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:02 compute-1 nova_compute[182713]: 2026-01-21 23:56:02.239 182717 DEBUG nova.objects.instance [None req-e0e52053-bd0b-4d19-af25-9c1f8423b3b0 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'info_cache' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:02 compute-1 nova_compute[182713]: 2026-01-21 23:56:02.266 182717 DEBUG nova.virt.libvirt.driver [None req-e0e52053-bd0b-4d19-af25-9c1f8423b3b0 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:56:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:03.003 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:03.004 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:03.005 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:03.380 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:03 compute-1 nova_compute[182713]: 2026-01-21 23:56:03.784 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039748.7320383, cdb8738b-801f-4bd8-a93c-3553748dedd7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:03 compute-1 nova_compute[182713]: 2026-01-21 23:56:03.784 182717 INFO nova.compute.manager [-] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] VM Stopped (Lifecycle Event)
Jan 21 23:56:03 compute-1 nova_compute[182713]: 2026-01-21 23:56:03.805 182717 DEBUG nova.compute.manager [None req-6c15a902-4733-4dcb-8d32-5ade4851052f - - - - - -] [instance: cdb8738b-801f-4bd8-a93c-3553748dedd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:03 compute-1 nova_compute[182713]: 2026-01-21 23:56:03.824 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:04 compute-1 kernel: tapb767fb64-f4 (unregistering): left promiscuous mode
Jan 21 23:56:04 compute-1 NetworkManager[54952]: <info>  [1769039764.4812] device (tapb767fb64-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:56:04 compute-1 ovn_controller[94841]: 2026-01-21T23:56:04Z|00172|binding|INFO|Releasing lport b767fb64-f4a0-49cc-85c0-21b059344b3d from this chassis (sb_readonly=0)
Jan 21 23:56:04 compute-1 ovn_controller[94841]: 2026-01-21T23:56:04Z|00173|binding|INFO|Setting lport b767fb64-f4a0-49cc-85c0-21b059344b3d down in Southbound
Jan 21 23:56:04 compute-1 nova_compute[182713]: 2026-01-21 23:56:04.487 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:04 compute-1 ovn_controller[94841]: 2026-01-21T23:56:04Z|00174|binding|INFO|Removing iface tapb767fb64-f4 ovn-installed in OVS
Jan 21 23:56:04 compute-1 nova_compute[182713]: 2026-01-21 23:56:04.500 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.507 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:99:ca 10.100.0.10'], port_security=['fa:16:3e:4e:99:ca 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70b1c9f8be0042aa8de9841a26729700', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5943869c-ade1-4cd3-81a5-29e65236fb49', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88d3d39a-f56f-4f3b-95e9-79768ac7b596, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=b767fb64-f4a0-49cc-85c0-21b059344b3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.509 104184 INFO neutron.agent.ovn.metadata.agent [-] Port b767fb64-f4a0-49cc-85c0-21b059344b3d in datapath a78bfb22-a192-4dbe-a117-9f8a59130e27 unbound from our chassis
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.511 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a78bfb22-a192-4dbe-a117-9f8a59130e27, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.513 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6de80e46-cffb-4f7d-95ff-12d8e69a9096]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.514 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 namespace which is not needed anymore
Jan 21 23:56:04 compute-1 nova_compute[182713]: 2026-01-21 23:56:04.523 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:04 compute-1 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003a.scope: Deactivated successfully.
Jan 21 23:56:04 compute-1 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003a.scope: Consumed 12.848s CPU time.
Jan 21 23:56:04 compute-1 systemd-machined[153970]: Machine qemu-26-instance-0000003a terminated.
Jan 21 23:56:04 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[218950]: [NOTICE]   (218954) : haproxy version is 2.8.14-c23fe91
Jan 21 23:56:04 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[218950]: [NOTICE]   (218954) : path to executable is /usr/sbin/haproxy
Jan 21 23:56:04 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[218950]: [WARNING]  (218954) : Exiting Master process...
Jan 21 23:56:04 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[218950]: [ALERT]    (218954) : Current worker (218956) exited with code 143 (Terminated)
Jan 21 23:56:04 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[218950]: [WARNING]  (218954) : All workers exited. Exiting... (0)
Jan 21 23:56:04 compute-1 systemd[1]: libpod-9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637.scope: Deactivated successfully.
Jan 21 23:56:04 compute-1 conmon[218950]: conmon 9033d9d9a076c37515fc <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637.scope/container/memory.events
Jan 21 23:56:04 compute-1 podman[219445]: 2026-01-21 23:56:04.65081688 +0000 UTC m=+0.048830315 container died 9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:56:04 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637-userdata-shm.mount: Deactivated successfully.
Jan 21 23:56:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-aa7ade534e76ab0ebf2b7ad96afaf902c7b1bc431792cde7f318b7e298e25a54-merged.mount: Deactivated successfully.
Jan 21 23:56:04 compute-1 podman[219445]: 2026-01-21 23:56:04.686229281 +0000 UTC m=+0.084242746 container cleanup 9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 21 23:56:04 compute-1 systemd[1]: libpod-conmon-9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637.scope: Deactivated successfully.
Jan 21 23:56:04 compute-1 podman[219472]: 2026-01-21 23:56:04.768500974 +0000 UTC m=+0.054719466 container remove 9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.773 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e31f1cd4-e28a-4c1b-b3a5-1ab1f329a27e]: (4, ('Wed Jan 21 11:56:04 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 (9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637)\n9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637\nWed Jan 21 11:56:04 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 (9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637)\n9033d9d9a076c37515fccab8ce4648cff0eb61574a0829fc93fa553902417637\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.776 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[07b6bfe5-2f3c-4664-92f6-ab4045b98a5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.777 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa78bfb22-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:04 compute-1 nova_compute[182713]: 2026-01-21 23:56:04.778 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:04 compute-1 kernel: tapa78bfb22-a0: left promiscuous mode
Jan 21 23:56:04 compute-1 nova_compute[182713]: 2026-01-21 23:56:04.798 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.802 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[67ead1d9-5f87-4c30-a625-fde90eb5f0fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.817 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3af1357c-2106-4ade-9e84-77910dd887b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.819 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fca335d5-c3d9-4539-ae11-5e8494b4b4ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.846 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b03b87-3ec2-4ef2-b316-6c1152907b5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431712, 'reachable_time': 15949, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219506, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.849 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:56:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:04.849 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[4534258e-21f6-4ca2-839c-7b8804bb782b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:04 compute-1 systemd[1]: run-netns-ovnmeta\x2da78bfb22\x2da192\x2d4dbe\x2da117\x2d9f8a59130e27.mount: Deactivated successfully.
Jan 21 23:56:04 compute-1 nova_compute[182713]: 2026-01-21 23:56:04.874 182717 DEBUG nova.compute.manager [req-aa996632-225d-429d-b37b-0a31778d5462 req-6531dbbc-1f3a-4d9f-9dbf-266390ee90e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received event network-vif-unplugged-b767fb64-f4a0-49cc-85c0-21b059344b3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:04 compute-1 nova_compute[182713]: 2026-01-21 23:56:04.874 182717 DEBUG oslo_concurrency.lockutils [req-aa996632-225d-429d-b37b-0a31778d5462 req-6531dbbc-1f3a-4d9f-9dbf-266390ee90e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:04 compute-1 nova_compute[182713]: 2026-01-21 23:56:04.875 182717 DEBUG oslo_concurrency.lockutils [req-aa996632-225d-429d-b37b-0a31778d5462 req-6531dbbc-1f3a-4d9f-9dbf-266390ee90e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:04 compute-1 nova_compute[182713]: 2026-01-21 23:56:04.875 182717 DEBUG oslo_concurrency.lockutils [req-aa996632-225d-429d-b37b-0a31778d5462 req-6531dbbc-1f3a-4d9f-9dbf-266390ee90e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:04 compute-1 nova_compute[182713]: 2026-01-21 23:56:04.875 182717 DEBUG nova.compute.manager [req-aa996632-225d-429d-b37b-0a31778d5462 req-6531dbbc-1f3a-4d9f-9dbf-266390ee90e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] No waiting events found dispatching network-vif-unplugged-b767fb64-f4a0-49cc-85c0-21b059344b3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:04 compute-1 nova_compute[182713]: 2026-01-21 23:56:04.875 182717 WARNING nova.compute.manager [req-aa996632-225d-429d-b37b-0a31778d5462 req-6531dbbc-1f3a-4d9f-9dbf-266390ee90e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received unexpected event network-vif-unplugged-b767fb64-f4a0-49cc-85c0-21b059344b3d for instance with vm_state active and task_state powering-off.
Jan 21 23:56:05 compute-1 nova_compute[182713]: 2026-01-21 23:56:05.285 182717 INFO nova.virt.libvirt.driver [None req-e0e52053-bd0b-4d19-af25-9c1f8423b3b0 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Instance shutdown successfully after 3 seconds.
Jan 21 23:56:05 compute-1 nova_compute[182713]: 2026-01-21 23:56:05.292 182717 INFO nova.virt.libvirt.driver [-] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Instance destroyed successfully.
Jan 21 23:56:05 compute-1 nova_compute[182713]: 2026-01-21 23:56:05.292 182717 DEBUG nova.objects.instance [None req-e0e52053-bd0b-4d19-af25-9c1f8423b3b0 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'numa_topology' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:05 compute-1 nova_compute[182713]: 2026-01-21 23:56:05.322 182717 DEBUG nova.compute.manager [None req-e0e52053-bd0b-4d19-af25-9c1f8423b3b0 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:05 compute-1 nova_compute[182713]: 2026-01-21 23:56:05.409 182717 DEBUG oslo_concurrency.lockutils [None req-e0e52053-bd0b-4d19-af25-9c1f8423b3b0 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:06 compute-1 nova_compute[182713]: 2026-01-21 23:56:06.095 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:06 compute-1 nova_compute[182713]: 2026-01-21 23:56:06.788 182717 DEBUG nova.objects.instance [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'flavor' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:06 compute-1 nova_compute[182713]: 2026-01-21 23:56:06.825 182717 DEBUG nova.objects.instance [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'info_cache' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:06 compute-1 nova_compute[182713]: 2026-01-21 23:56:06.865 182717 DEBUG oslo_concurrency.lockutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "refresh_cache-22742d9b-a6a8-4f10-a17f-a9704a1f8f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:56:06 compute-1 nova_compute[182713]: 2026-01-21 23:56:06.866 182717 DEBUG oslo_concurrency.lockutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquired lock "refresh_cache-22742d9b-a6a8-4f10-a17f-a9704a1f8f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:56:06 compute-1 nova_compute[182713]: 2026-01-21 23:56:06.866 182717 DEBUG nova.network.neutron [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:56:07 compute-1 nova_compute[182713]: 2026-01-21 23:56:07.004 182717 DEBUG nova.compute.manager [req-5e23dce2-6ef5-4fd1-9059-551761e430f8 req-be97d572-7903-4699-b8ca-370f45e00e7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:07 compute-1 nova_compute[182713]: 2026-01-21 23:56:07.005 182717 DEBUG oslo_concurrency.lockutils [req-5e23dce2-6ef5-4fd1-9059-551761e430f8 req-be97d572-7903-4699-b8ca-370f45e00e7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:07 compute-1 nova_compute[182713]: 2026-01-21 23:56:07.006 182717 DEBUG oslo_concurrency.lockutils [req-5e23dce2-6ef5-4fd1-9059-551761e430f8 req-be97d572-7903-4699-b8ca-370f45e00e7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:07 compute-1 nova_compute[182713]: 2026-01-21 23:56:07.006 182717 DEBUG oslo_concurrency.lockutils [req-5e23dce2-6ef5-4fd1-9059-551761e430f8 req-be97d572-7903-4699-b8ca-370f45e00e7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:07 compute-1 nova_compute[182713]: 2026-01-21 23:56:07.006 182717 DEBUG nova.compute.manager [req-5e23dce2-6ef5-4fd1-9059-551761e430f8 req-be97d572-7903-4699-b8ca-370f45e00e7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] No waiting events found dispatching network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:07 compute-1 nova_compute[182713]: 2026-01-21 23:56:07.007 182717 WARNING nova.compute.manager [req-5e23dce2-6ef5-4fd1-9059-551761e430f8 req-be97d572-7903-4699-b8ca-370f45e00e7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received unexpected event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d for instance with vm_state stopped and task_state powering-on.
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.413 182717 DEBUG nova.network.neutron [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Updating instance_info_cache with network_info: [{"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.443 182717 DEBUG oslo_concurrency.lockutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Releasing lock "refresh_cache-22742d9b-a6a8-4f10-a17f-a9704a1f8f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.482 182717 INFO nova.virt.libvirt.driver [-] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Instance destroyed successfully.
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.482 182717 DEBUG nova.objects.instance [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'numa_topology' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.502 182717 DEBUG nova.objects.instance [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'resources' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.519 182717 DEBUG nova.virt.libvirt.vif [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1374146863',display_name='tempest-ListServerFiltersTestJSON-instance-1374146863',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1374146863',id=58,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-3b2x1al6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_name='tempest-ListServerFiltersTestJSON-1547380946-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:05Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=22742d9b-a6a8-4f10-a17f-a9704a1f8f43,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.519 182717 DEBUG nova.network.os_vif_util [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.520 182717 DEBUG nova.network.os_vif_util [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.521 182717 DEBUG os_vif [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.524 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.524 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb767fb64-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.526 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.528 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.534 182717 INFO os_vif [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4')
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.542 182717 DEBUG nova.virt.libvirt.driver [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Start _get_guest_xml network_info=[{"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.547 182717 WARNING nova.virt.libvirt.driver [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.553 182717 DEBUG nova.virt.libvirt.host [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.554 182717 DEBUG nova.virt.libvirt.host [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.561 182717 DEBUG nova.virt.libvirt.host [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.561 182717 DEBUG nova.virt.libvirt.host [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.563 182717 DEBUG nova.virt.libvirt.driver [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.563 182717 DEBUG nova.virt.hardware [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.563 182717 DEBUG nova.virt.hardware [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.564 182717 DEBUG nova.virt.hardware [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.564 182717 DEBUG nova.virt.hardware [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.564 182717 DEBUG nova.virt.hardware [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.565 182717 DEBUG nova.virt.hardware [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.565 182717 DEBUG nova.virt.hardware [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.565 182717 DEBUG nova.virt.hardware [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.566 182717 DEBUG nova.virt.hardware [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.566 182717 DEBUG nova.virt.hardware [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.566 182717 DEBUG nova.virt.hardware [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.567 182717 DEBUG nova.objects.instance [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.591 182717 DEBUG oslo_concurrency.processutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.671 182717 DEBUG oslo_concurrency.processutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.config --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.674 182717 DEBUG oslo_concurrency.lockutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.675 182717 DEBUG oslo_concurrency.lockutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.677 182717 DEBUG oslo_concurrency.lockutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.679 182717 DEBUG nova.virt.libvirt.vif [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1374146863',display_name='tempest-ListServerFiltersTestJSON-instance-1374146863',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1374146863',id=58,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-3b2x1al6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_name='tempest-ListServerFiltersTestJSON-1547380946-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:05Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=22742d9b-a6a8-4f10-a17f-a9704a1f8f43,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.680 182717 DEBUG nova.network.os_vif_util [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.682 182717 DEBUG nova.network.os_vif_util [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.684 182717 DEBUG nova.objects.instance [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'pci_devices' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.709 182717 DEBUG nova.virt.libvirt.driver [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:56:08 compute-1 nova_compute[182713]:   <uuid>22742d9b-a6a8-4f10-a17f-a9704a1f8f43</uuid>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   <name>instance-0000003a</name>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-1374146863</nova:name>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:56:08</nova:creationTime>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:56:08 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:56:08 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:56:08 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:56:08 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:56:08 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:56:08 compute-1 nova_compute[182713]:         <nova:user uuid="7e79b904cb8a49f990b05eb0ed72fdf4">tempest-ListServerFiltersTestJSON-1547380946-project-member</nova:user>
Jan 21 23:56:08 compute-1 nova_compute[182713]:         <nova:project uuid="70b1c9f8be0042aa8de9841a26729700">tempest-ListServerFiltersTestJSON-1547380946</nova:project>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:56:08 compute-1 nova_compute[182713]:         <nova:port uuid="b767fb64-f4a0-49cc-85c0-21b059344b3d">
Jan 21 23:56:08 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <system>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <entry name="serial">22742d9b-a6a8-4f10-a17f-a9704a1f8f43</entry>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <entry name="uuid">22742d9b-a6a8-4f10-a17f-a9704a1f8f43</entry>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     </system>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   <os>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   </os>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   <features>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   </features>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.config"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:4e:99:ca"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <target dev="tapb767fb64-f4"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/console.log" append="off"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <video>
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     </video>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <input type="keyboard" bus="usb"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:56:08 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:56:08 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:56:08 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:56:08 compute-1 nova_compute[182713]: </domain>
Jan 21 23:56:08 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.712 182717 DEBUG oslo_concurrency.processutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.762 182717 DEBUG oslo_concurrency.lockutils [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.763 182717 DEBUG oslo_concurrency.lockutils [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.763 182717 DEBUG oslo_concurrency.lockutils [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.764 182717 DEBUG oslo_concurrency.lockutils [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.764 182717 DEBUG oslo_concurrency.lockutils [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.780 182717 DEBUG oslo_concurrency.processutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.781 182717 DEBUG oslo_concurrency.processutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.810 182717 INFO nova.compute.manager [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Terminating instance
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.830 182717 DEBUG nova.compute.manager [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:56:08 compute-1 kernel: tapce72a712-5a (unregistering): left promiscuous mode
Jan 21 23:56:08 compute-1 NetworkManager[54952]: <info>  [1769039768.8630] device (tapce72a712-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:56:08 compute-1 ovn_controller[94841]: 2026-01-21T23:56:08Z|00175|binding|INFO|Releasing lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b from this chassis (sb_readonly=0)
Jan 21 23:56:08 compute-1 ovn_controller[94841]: 2026-01-21T23:56:08Z|00176|binding|INFO|Setting lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b down in Southbound
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.873 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:08 compute-1 ovn_controller[94841]: 2026-01-21T23:56:08Z|00177|binding|INFO|Removing iface tapce72a712-5a ovn-installed in OVS
Jan 21 23:56:08 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:08.885 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:57:00 10.100.0.7'], port_security=['fa:16:3e:4f:57:00 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2a5e9a44-f095-4122-8db9-4918b6ba22b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ce72a712-5acd-45cf-9d5d-66fb0d28ce4b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:08 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:08.886 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ce72a712-5acd-45cf-9d5d-66fb0d28ce4b in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c unbound from our chassis
Jan 21 23:56:08 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:08.888 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:08 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:08.889 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b8458522-80a1-41e1-a52f-b5d5d563411a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:08 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:08.890 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace which is not needed anymore
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.895 182717 DEBUG oslo_concurrency.processutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk --force-share --output=json" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.899 182717 DEBUG nova.objects.instance [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.901 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:08 compute-1 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000003c.scope: Deactivated successfully.
Jan 21 23:56:08 compute-1 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000003c.scope: Consumed 12.794s CPU time.
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.923 182717 DEBUG oslo_concurrency.processutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:08 compute-1 systemd-machined[153970]: Machine qemu-27-instance-0000003c terminated.
Jan 21 23:56:08 compute-1 podman[219515]: 2026-01-21 23:56:08.97545581 +0000 UTC m=+0.081313716 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.994 182717 DEBUG oslo_concurrency.processutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.995 182717 DEBUG nova.virt.disk.api [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Checking if we can resize image /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:56:08 compute-1 nova_compute[182713]: 2026-01-21 23:56:08.995 182717 DEBUG oslo_concurrency.processutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:09 compute-1 podman[219518]: 2026-01-21 23:56:09.015820992 +0000 UTC m=+0.112597589 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:56:09 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219086]: [NOTICE]   (219093) : haproxy version is 2.8.14-c23fe91
Jan 21 23:56:09 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219086]: [NOTICE]   (219093) : path to executable is /usr/sbin/haproxy
Jan 21 23:56:09 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219086]: [WARNING]  (219093) : Exiting Master process...
Jan 21 23:56:09 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219086]: [ALERT]    (219093) : Current worker (219097) exited with code 143 (Terminated)
Jan 21 23:56:09 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219086]: [WARNING]  (219093) : All workers exited. Exiting... (0)
Jan 21 23:56:09 compute-1 systemd[1]: libpod-a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7.scope: Deactivated successfully.
Jan 21 23:56:09 compute-1 kernel: tapce72a712-5a: entered promiscuous mode
Jan 21 23:56:09 compute-1 kernel: tapce72a712-5a (unregistering): left promiscuous mode
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00178|binding|INFO|Claiming lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b for this chassis.
Jan 21 23:56:09 compute-1 podman[219581]: 2026-01-21 23:56:09.056302219 +0000 UTC m=+0.048595168 container died a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:56:09 compute-1 NetworkManager[54952]: <info>  [1769039769.0566] manager: (tapce72a712-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.055 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00179|binding|INFO|ce72a712-5acd-45cf-9d5d-66fb0d28ce4b: Claiming fa:16:3e:4f:57:00 10.100.0.7
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.065 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:57:00 10.100.0.7'], port_security=['fa:16:3e:4f:57:00 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2a5e9a44-f095-4122-8db9-4918b6ba22b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ce72a712-5acd-45cf-9d5d-66fb0d28ce4b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.081 182717 DEBUG oslo_concurrency.processutils [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.083 182717 DEBUG nova.virt.disk.api [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Cannot resize image /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.083 182717 DEBUG nova.objects.instance [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'migration_context' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00180|binding|INFO|Setting lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b ovn-installed in OVS
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00181|binding|INFO|Setting lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b up in Southbound
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00182|binding|INFO|Releasing lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b from this chassis (sb_readonly=1)
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00183|if_status|INFO|Not setting lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b down as sb is readonly
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.091 182717 DEBUG nova.compute.manager [req-2cb393fa-168c-4cae-8554-992bc44bb89b req-75252fab-40c5-41ea-be50-db5d1f4282dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received event network-vif-unplugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00184|binding|INFO|Removing iface tapce72a712-5a ovn-installed in OVS
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.092 182717 DEBUG oslo_concurrency.lockutils [req-2cb393fa-168c-4cae-8554-992bc44bb89b req-75252fab-40c5-41ea-be50-db5d1f4282dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.092 182717 DEBUG oslo_concurrency.lockutils [req-2cb393fa-168c-4cae-8554-992bc44bb89b req-75252fab-40c5-41ea-be50-db5d1f4282dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.093 182717 DEBUG oslo_concurrency.lockutils [req-2cb393fa-168c-4cae-8554-992bc44bb89b req-75252fab-40c5-41ea-be50-db5d1f4282dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.093 182717 DEBUG nova.compute.manager [req-2cb393fa-168c-4cae-8554-992bc44bb89b req-75252fab-40c5-41ea-be50-db5d1f4282dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] No waiting events found dispatching network-vif-unplugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.093 182717 DEBUG nova.compute.manager [req-2cb393fa-168c-4cae-8554-992bc44bb89b req-75252fab-40c5-41ea-be50-db5d1f4282dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received event network-vif-unplugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00185|binding|INFO|Releasing lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b from this chassis (sb_readonly=0)
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.094 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00186|binding|INFO|Setting lport ce72a712-5acd-45cf-9d5d-66fb0d28ce4b down in Southbound
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.103 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:57:00 10.100.0.7'], port_security=['fa:16:3e:4f:57:00 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2a5e9a44-f095-4122-8db9-4918b6ba22b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ce72a712-5acd-45cf-9d5d-66fb0d28ce4b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:09 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7-userdata-shm.mount: Deactivated successfully.
Jan 21 23:56:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-d737babeb3960a5e4dea217b0e260a200cf9365146f5fa7e496fe1302ac01acf-merged.mount: Deactivated successfully.
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.107 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.113 182717 INFO nova.virt.libvirt.driver [-] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Instance destroyed successfully.
Jan 21 23:56:09 compute-1 podman[219581]: 2026-01-21 23:56:09.114528032 +0000 UTC m=+0.106821001 container cleanup a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.114 182717 DEBUG nova.objects.instance [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'resources' on Instance uuid 2a5e9a44-f095-4122-8db9-4918b6ba22b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.117 182717 DEBUG nova.virt.libvirt.vif [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1374146863',display_name='tempest-ListServerFiltersTestJSON-instance-1374146863',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1374146863',id=58,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-3b2x1al6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_name='tempest-ListServerFiltersTestJSON-1547380946-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:05Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=22742d9b-a6a8-4f10-a17f-a9704a1f8f43,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.117 182717 DEBUG nova.network.os_vif_util [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.118 182717 DEBUG nova.network.os_vif_util [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.118 182717 DEBUG os_vif [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.119 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.119 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.120 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:56:09 compute-1 systemd[1]: libpod-conmon-a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7.scope: Deactivated successfully.
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.123 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.123 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb767fb64-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.124 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb767fb64-f4, col_values=(('external_ids', {'iface-id': 'b767fb64-f4a0-49cc-85c0-21b059344b3d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:99:ca', 'vm-uuid': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.125 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 NetworkManager[54952]: <info>  [1769039769.1264] manager: (tapb767fb64-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.128 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.132 182717 DEBUG nova.virt.libvirt.vif [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-812838625',display_name='tempest-ImagesTestJSON-server-812838625',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-812838625',id=60,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-p2ism8a8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:55:51Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=2a5e9a44-f095-4122-8db9-4918b6ba22b7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "address": "fa:16:3e:4f:57:00", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce72a712-5a", "ovs_interfaceid": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.132 182717 DEBUG nova.network.os_vif_util [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "address": "fa:16:3e:4f:57:00", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce72a712-5a", "ovs_interfaceid": "ce72a712-5acd-45cf-9d5d-66fb0d28ce4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.133 182717 DEBUG nova.network.os_vif_util [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:57:00,bridge_name='br-int',has_traffic_filtering=True,id=ce72a712-5acd-45cf-9d5d-66fb0d28ce4b,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce72a712-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.133 182717 DEBUG os_vif [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:57:00,bridge_name='br-int',has_traffic_filtering=True,id=ce72a712-5acd-45cf-9d5d-66fb0d28ce4b,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce72a712-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.134 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.135 182717 INFO os_vif [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4')
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.136 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.137 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce72a712-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.138 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.139 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.142 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.143 182717 INFO os_vif [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:57:00,bridge_name='br-int',has_traffic_filtering=True,id=ce72a712-5acd-45cf-9d5d-66fb0d28ce4b,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce72a712-5a')
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.144 182717 INFO nova.virt.libvirt.driver [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Deleting instance files /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7_del
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.144 182717 INFO nova.virt.libvirt.driver [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Deletion of /var/lib/nova/instances/2a5e9a44-f095-4122-8db9-4918b6ba22b7_del complete
Jan 21 23:56:09 compute-1 podman[219625]: 2026-01-21 23:56:09.177715857 +0000 UTC m=+0.042002044 container remove a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.184 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d0fed360-2efb-4bc2-a73a-d0f8b061782b]: (4, ('Wed Jan 21 11:56:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7)\na46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7\nWed Jan 21 11:56:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (a46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7)\na46657691f0235cddd44eda9767cbec3452386d2ae0e6a95a644487c291a43b7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.187 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e477b42f-cb71-4a67-8ab2-424e7356f8b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.188 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:09 compute-1 kernel: tap74e2da48-40: left promiscuous mode
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.191 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.205 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.210 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6c542f77-9940-43fa-8c11-fbe38b68bc46]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 kernel: tapb767fb64-f4: entered promiscuous mode
Jan 21 23:56:09 compute-1 NetworkManager[54952]: <info>  [1769039769.2135] manager: (tapb767fb64-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00187|binding|INFO|Claiming lport b767fb64-f4a0-49cc-85c0-21b059344b3d for this chassis.
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00188|binding|INFO|b767fb64-f4a0-49cc-85c0-21b059344b3d: Claiming fa:16:3e:4e:99:ca 10.100.0.10
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.215 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 systemd-udevd[219540]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.223 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:99:ca 10.100.0.10'], port_security=['fa:16:3e:4e:99:ca 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70b1c9f8be0042aa8de9841a26729700', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5943869c-ade1-4cd3-81a5-29e65236fb49', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88d3d39a-f56f-4f3b-95e9-79768ac7b596, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=b767fb64-f4a0-49cc-85c0-21b059344b3d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00189|binding|INFO|Setting lport b767fb64-f4a0-49cc-85c0-21b059344b3d ovn-installed in OVS
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00190|binding|INFO|Setting lport b767fb64-f4a0-49cc-85c0-21b059344b3d up in Southbound
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.228 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.230 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.233 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 NetworkManager[54952]: <info>  [1769039769.2338] device (tapb767fb64-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:56:09 compute-1 NetworkManager[54952]: <info>  [1769039769.2356] device (tapb767fb64-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.241 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9c930d2a-30b6-4672-95b5-7505c8f28c81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.242 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fe482387-dfe0-4c29-be47-00e9a8feb3c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.247 182717 INFO nova.compute.manager [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.248 182717 DEBUG oslo.service.loopingcall [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.248 182717 DEBUG nova.compute.manager [-] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.248 182717 DEBUG nova.network.neutron [-] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.259 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7ba7a9e3-f72a-4666-b0b6-300cf5470ed1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431834, 'reachable_time': 17912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219657, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 systemd-machined[153970]: New machine qemu-29-instance-0000003a.
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.261 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.261 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ad817a-f8b8-4cc3-af78-e15b25eefb0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.262 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ce72a712-5acd-45cf-9d5d-66fb0d28ce4b in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c unbound from our chassis
Jan 21 23:56:09 compute-1 systemd[1]: run-netns-ovnmeta\x2d74e2da48\x2d44c2\x2d4c6d\x2d9597\x2d6c47d6247f9c.mount: Deactivated successfully.
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.264 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.265 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[629c8fca-ef35-4e95-bdcf-46e3bf7b48e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.266 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ce72a712-5acd-45cf-9d5d-66fb0d28ce4b in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c unbound from our chassis
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.268 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.268 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd9c621-66da-4986-be07-a6d74af701f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.269 104184 INFO neutron.agent.ovn.metadata.agent [-] Port b767fb64-f4a0-49cc-85c0-21b059344b3d in datapath a78bfb22-a192-4dbe-a117-9f8a59130e27 unbound from our chassis
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.271 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a78bfb22-a192-4dbe-a117-9f8a59130e27
Jan 21 23:56:09 compute-1 systemd[1]: Started Virtual Machine qemu-29-instance-0000003a.
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.284 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0e309852-d351-4758-8cc1-463a9f960ad6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.285 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa78bfb22-a1 in ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.287 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa78bfb22-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.287 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3394cbf1-6544-4a75-a886-8a7d303eb49f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.288 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[60440118-9e63-4a67-a863-64b267350ede]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.303 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b007da-65a9-41a6-8aa9-2b057f04af24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.330 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[727a6a0d-f597-4aa5-8da5-f320cb812118]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.369 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[6c660995-2b05-4c93-b1e3-7bc1d81f87a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.379 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9dc0a566-e15c-4e94-a402-ac9df143ec3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 NetworkManager[54952]: <info>  [1769039769.3813] manager: (tapa78bfb22-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.420 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e71592-9619-4c80-9359-29ff83a8d4ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.425 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[68f77695-9e51-4707-845a-ab77c490907e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 NetworkManager[54952]: <info>  [1769039769.4562] device (tapa78bfb22-a0): carrier: link connected
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.465 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[92d5e46c-cc4e-4b1d-8d97-852e1f96eda7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.498 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c6fbf9eb-cba1-4ac9-b040-3586723bca2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa78bfb22-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:41:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434210, 'reachable_time': 32681, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219690, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.527 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d533de85-889b-4a3f-988f-4be385338ce7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:4194'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434210, 'tstamp': 434210}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219691, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.563 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[36560773-8c75-45f2-ac95-670d74f9a25b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa78bfb22-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:41:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434210, 'reachable_time': 32681, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219692, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.573 182717 DEBUG nova.compute.manager [req-e5e9d2c2-64df-4432-ab3b-148d024953f0 req-ce76ddb1-531d-4109-9f38-11323f4fedb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.573 182717 DEBUG oslo_concurrency.lockutils [req-e5e9d2c2-64df-4432-ab3b-148d024953f0 req-ce76ddb1-531d-4109-9f38-11323f4fedb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.573 182717 DEBUG oslo_concurrency.lockutils [req-e5e9d2c2-64df-4432-ab3b-148d024953f0 req-ce76ddb1-531d-4109-9f38-11323f4fedb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.574 182717 DEBUG oslo_concurrency.lockutils [req-e5e9d2c2-64df-4432-ab3b-148d024953f0 req-ce76ddb1-531d-4109-9f38-11323f4fedb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.574 182717 DEBUG nova.compute.manager [req-e5e9d2c2-64df-4432-ab3b-148d024953f0 req-ce76ddb1-531d-4109-9f38-11323f4fedb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] No waiting events found dispatching network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.574 182717 WARNING nova.compute.manager [req-e5e9d2c2-64df-4432-ab3b-148d024953f0 req-ce76ddb1-531d-4109-9f38-11323f4fedb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received unexpected event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d for instance with vm_state stopped and task_state powering-on.
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.612 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ef62a7-c453-4676-af71-3d97a73c9e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.692 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c14730-cba7-4bfb-b9f8-e7082084cc2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.694 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa78bfb22-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.694 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.695 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa78bfb22-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:09 compute-1 NetworkManager[54952]: <info>  [1769039769.6977] manager: (tapa78bfb22-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Jan 21 23:56:09 compute-1 kernel: tapa78bfb22-a0: entered promiscuous mode
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.697 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.705 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa78bfb22-a0, col_values=(('external_ids', {'iface-id': 'bb8c3f45-55b8-4c8e-8a31-26c5ecb4fb32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:09 compute-1 ovn_controller[94841]: 2026-01-21T23:56:09Z|00191|binding|INFO|Releasing lport bb8c3f45-55b8-4c8e-8a31-26c5ecb4fb32 from this chassis (sb_readonly=0)
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.707 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.711 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.712 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e95e334e-aaf7-4a23-b8aa-985cb1de0697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.713 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-a78bfb22-a192-4dbe-a117-9f8a59130e27
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID a78bfb22-a192-4dbe-a117-9f8a59130e27
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:56:09 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:09.714 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'env', 'PROCESS_TAG=haproxy-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a78bfb22-a192-4dbe-a117-9f8a59130e27.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.731 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:09 compute-1 nova_compute[182713]: 2026-01-21 23:56:09.990 182717 DEBUG nova.network.neutron [-] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.007 182717 INFO nova.compute.manager [-] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Took 0.76 seconds to deallocate network for instance.
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.048 182717 DEBUG nova.compute.manager [req-3f889d6b-4046-407d-9969-f3e1f25082ff req-8be31d04-b3b5-4a9d-9460-e808cfd6b24c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received event network-vif-deleted-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.087 182717 DEBUG oslo_concurrency.lockutils [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.087 182717 DEBUG oslo_concurrency.lockutils [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:10 compute-1 podman[219724]: 2026-01-21 23:56:10.15844875 +0000 UTC m=+0.050657831 container create 7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.178 182717 DEBUG nova.compute.provider_tree [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:10 compute-1 systemd[1]: Started libpod-conmon-7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6.scope.
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.198 182717 DEBUG nova.scheduler.client.report [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:10 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:56:10 compute-1 podman[219724]: 2026-01-21 23:56:10.131356056 +0000 UTC m=+0.023565157 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.225 182717 DEBUG oslo_concurrency.lockutils [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8bb9dd63ecfc20576c37291513a15cfefcb9d8aa9870ce0b541988a2fa8b006/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:56:10 compute-1 podman[219724]: 2026-01-21 23:56:10.241771495 +0000 UTC m=+0.133980606 container init 7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:56:10 compute-1 podman[219724]: 2026-01-21 23:56:10.248334688 +0000 UTC m=+0.140543769 container start 7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.264 182717 INFO nova.scheduler.client.report [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Deleted allocations for instance 2a5e9a44-f095-4122-8db9-4918b6ba22b7
Jan 21 23:56:10 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219739]: [NOTICE]   (219743) : New worker (219745) forked
Jan 21 23:56:10 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219739]: [NOTICE]   (219743) : Loading success.
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.350 182717 DEBUG oslo_concurrency.lockutils [None req-9f2526cf-cb41-4c37-9673-dbe45c0ac68c 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.809 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Removed pending event for 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.810 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039770.8091977, 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.810 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] VM Resumed (Lifecycle Event)
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.813 182717 DEBUG nova.compute.manager [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.818 182717 INFO nova.virt.libvirt.driver [-] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Instance rebooted successfully.
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.819 182717 DEBUG nova.compute.manager [None req-4e65fe80-d9f3-4139-88c1-a7743e2856a7 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.832 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.837 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.881 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] During sync_power_state the instance has a pending task (powering-on). Skip.
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.882 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039770.8135102, 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.882 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] VM Started (Lifecycle Event)
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.918 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:10 compute-1 nova_compute[182713]: 2026-01-21 23:56:10.923 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.098 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.188 182717 DEBUG nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.188 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.189 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.189 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.189 182717 DEBUG nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] No waiting events found dispatching network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.190 182717 WARNING nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received unexpected event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b for instance with vm_state deleted and task_state None.
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.190 182717 DEBUG nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.190 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.190 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.191 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.191 182717 DEBUG nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] No waiting events found dispatching network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.191 182717 WARNING nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received unexpected event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b for instance with vm_state deleted and task_state None.
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.191 182717 DEBUG nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.192 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.192 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.192 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.192 182717 DEBUG nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] No waiting events found dispatching network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.193 182717 WARNING nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received unexpected event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b for instance with vm_state deleted and task_state None.
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.193 182717 DEBUG nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received event network-vif-unplugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.193 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.193 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.194 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.194 182717 DEBUG nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] No waiting events found dispatching network-vif-unplugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.194 182717 WARNING nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received unexpected event network-vif-unplugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b for instance with vm_state deleted and task_state None.
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.194 182717 DEBUG nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.195 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.195 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.195 182717 DEBUG oslo_concurrency.lockutils [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2a5e9a44-f095-4122-8db9-4918b6ba22b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.195 182717 DEBUG nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] No waiting events found dispatching network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.196 182717 WARNING nova.compute.manager [req-840d8525-cd0d-456f-8e38-2c5fc356a2b8 req-87c4e0ab-bfab-4b7d-acd9-55784dbf86e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Received unexpected event network-vif-plugged-ce72a712-5acd-45cf-9d5d-66fb0d28ce4b for instance with vm_state deleted and task_state None.
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.725 182717 DEBUG nova.compute.manager [req-7ccfe300-eabf-4461-8f05-b60643572fcc req-47c4654d-7fe3-42c7-ac29-d86851066a21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.725 182717 DEBUG oslo_concurrency.lockutils [req-7ccfe300-eabf-4461-8f05-b60643572fcc req-47c4654d-7fe3-42c7-ac29-d86851066a21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.726 182717 DEBUG oslo_concurrency.lockutils [req-7ccfe300-eabf-4461-8f05-b60643572fcc req-47c4654d-7fe3-42c7-ac29-d86851066a21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.726 182717 DEBUG oslo_concurrency.lockutils [req-7ccfe300-eabf-4461-8f05-b60643572fcc req-47c4654d-7fe3-42c7-ac29-d86851066a21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.726 182717 DEBUG nova.compute.manager [req-7ccfe300-eabf-4461-8f05-b60643572fcc req-47c4654d-7fe3-42c7-ac29-d86851066a21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] No waiting events found dispatching network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.727 182717 WARNING nova.compute.manager [req-7ccfe300-eabf-4461-8f05-b60643572fcc req-47c4654d-7fe3-42c7-ac29-d86851066a21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received unexpected event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d for instance with vm_state active and task_state None.
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.883 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "0be60507-2a72-40c2-8ec7-86c829eacd52" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.883 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:11 compute-1 nova_compute[182713]: 2026-01-21 23:56:11.907 182717 DEBUG nova.compute.manager [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.093 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.094 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.105 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.106 182717 INFO nova.compute.claims [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.288 182717 DEBUG nova.compute.provider_tree [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.309 182717 DEBUG nova.scheduler.client.report [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.334 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.335 182717 DEBUG nova.compute.manager [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.417 182717 DEBUG nova.compute.manager [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.418 182717 DEBUG nova.network.neutron [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.461 182717 INFO nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.502 182717 DEBUG nova.compute.manager [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.640 182717 DEBUG nova.compute.manager [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.642 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.643 182717 INFO nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Creating image(s)
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.644 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "/var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.645 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.646 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.672 182717 DEBUG nova.policy [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.677 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.749 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.751 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.752 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.775 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.850 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.852 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.885 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.887 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.888 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.948 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.950 182717 DEBUG nova.virt.disk.api [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Checking if we can resize image /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:56:12 compute-1 nova_compute[182713]: 2026-01-21 23:56:12.951 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:13 compute-1 nova_compute[182713]: 2026-01-21 23:56:13.007 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:13 compute-1 nova_compute[182713]: 2026-01-21 23:56:13.009 182717 DEBUG nova.virt.disk.api [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Cannot resize image /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:56:13 compute-1 nova_compute[182713]: 2026-01-21 23:56:13.010 182717 DEBUG nova.objects.instance [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'migration_context' on Instance uuid 0be60507-2a72-40c2-8ec7-86c829eacd52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:13 compute-1 nova_compute[182713]: 2026-01-21 23:56:13.035 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:56:13 compute-1 nova_compute[182713]: 2026-01-21 23:56:13.036 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Ensure instance console log exists: /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:56:13 compute-1 nova_compute[182713]: 2026-01-21 23:56:13.037 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:13 compute-1 nova_compute[182713]: 2026-01-21 23:56:13.037 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:13 compute-1 nova_compute[182713]: 2026-01-21 23:56:13.037 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:13 compute-1 nova_compute[182713]: 2026-01-21 23:56:13.422 182717 DEBUG nova.network.neutron [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Successfully created port: 8014260a-f495-40a2-81b9-2fa4e968f539 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.139 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.182 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "03db37ed-d870-40ec-86f5-db23a9180dc8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.183 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.216 182717 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.387 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.387 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.394 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.395 182717 INFO nova.compute.claims [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.592 182717 DEBUG nova.compute.provider_tree [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.611 182717 DEBUG nova.scheduler.client.report [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.646 182717 DEBUG nova.network.neutron [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Successfully updated port: 8014260a-f495-40a2-81b9-2fa4e968f539 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.656 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.657 182717 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.687 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "refresh_cache-0be60507-2a72-40c2-8ec7-86c829eacd52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.688 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquired lock "refresh_cache-0be60507-2a72-40c2-8ec7-86c829eacd52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.688 182717 DEBUG nova.network.neutron [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.746 182717 DEBUG nova.compute.manager [req-c9e8c54f-8249-4823-a5f6-6bf2fead6cfd req-8b375db3-5c61-4db7-b30c-265fae673541 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Received event network-changed-8014260a-f495-40a2-81b9-2fa4e968f539 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.747 182717 DEBUG nova.compute.manager [req-c9e8c54f-8249-4823-a5f6-6bf2fead6cfd req-8b375db3-5c61-4db7-b30c-265fae673541 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Refreshing instance network info cache due to event network-changed-8014260a-f495-40a2-81b9-2fa4e968f539. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.748 182717 DEBUG oslo_concurrency.lockutils [req-c9e8c54f-8249-4823-a5f6-6bf2fead6cfd req-8b375db3-5c61-4db7-b30c-265fae673541 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-0be60507-2a72-40c2-8ec7-86c829eacd52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.766 182717 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.767 182717 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.788 182717 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.820 182717 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.946 182717 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.948 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.949 182717 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Creating image(s)
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.949 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "/var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.949 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "/var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.951 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "/var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:14 compute-1 nova_compute[182713]: 2026-01-21 23:56:14.966 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.053 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.055 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.056 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.074 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.165 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.167 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.205 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.207 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.207 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.284 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.286 182717 DEBUG nova.virt.disk.api [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Checking if we can resize image /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.287 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.345 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.347 182717 DEBUG nova.virt.disk.api [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Cannot resize image /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.347 182717 DEBUG nova.objects.instance [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lazy-loading 'migration_context' on Instance uuid 03db37ed-d870-40ec-86f5-db23a9180dc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.365 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.365 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Ensure instance console log exists: /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.366 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.366 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.366 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.453 182717 DEBUG nova.network.neutron [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:56:15 compute-1 nova_compute[182713]: 2026-01-21 23:56:15.675 182717 DEBUG nova.policy [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:56:16 compute-1 nova_compute[182713]: 2026-01-21 23:56:16.102 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.338 182717 DEBUG nova.network.neutron [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Updating instance_info_cache with network_info: [{"id": "8014260a-f495-40a2-81b9-2fa4e968f539", "address": "fa:16:3e:50:87:d7", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8014260a-f4", "ovs_interfaceid": "8014260a-f495-40a2-81b9-2fa4e968f539", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.370 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Releasing lock "refresh_cache-0be60507-2a72-40c2-8ec7-86c829eacd52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.371 182717 DEBUG nova.compute.manager [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Instance network_info: |[{"id": "8014260a-f495-40a2-81b9-2fa4e968f539", "address": "fa:16:3e:50:87:d7", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8014260a-f4", "ovs_interfaceid": "8014260a-f495-40a2-81b9-2fa4e968f539", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.373 182717 DEBUG oslo_concurrency.lockutils [req-c9e8c54f-8249-4823-a5f6-6bf2fead6cfd req-8b375db3-5c61-4db7-b30c-265fae673541 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-0be60507-2a72-40c2-8ec7-86c829eacd52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.373 182717 DEBUG nova.network.neutron [req-c9e8c54f-8249-4823-a5f6-6bf2fead6cfd req-8b375db3-5c61-4db7-b30c-265fae673541 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Refreshing network info cache for port 8014260a-f495-40a2-81b9-2fa4e968f539 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.379 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Start _get_guest_xml network_info=[{"id": "8014260a-f495-40a2-81b9-2fa4e968f539", "address": "fa:16:3e:50:87:d7", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8014260a-f4", "ovs_interfaceid": "8014260a-f495-40a2-81b9-2fa4e968f539", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.386 182717 WARNING nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.393 182717 DEBUG nova.virt.libvirt.host [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.394 182717 DEBUG nova.virt.libvirt.host [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.401 182717 DEBUG nova.virt.libvirt.host [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.402 182717 DEBUG nova.virt.libvirt.host [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.404 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.405 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.406 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.406 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.407 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.407 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.408 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.408 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.409 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.410 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.410 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.411 182717 DEBUG nova.virt.hardware [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.420 182717 DEBUG nova.virt.libvirt.vif [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-560731900',display_name='tempest-ImagesTestJSON-server-560731900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-560731900',id=64,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-1ccb7p5b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=TagList
,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:12Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=0be60507-2a72-40c2-8ec7-86c829eacd52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8014260a-f495-40a2-81b9-2fa4e968f539", "address": "fa:16:3e:50:87:d7", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8014260a-f4", "ovs_interfaceid": "8014260a-f495-40a2-81b9-2fa4e968f539", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.420 182717 DEBUG nova.network.os_vif_util [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "8014260a-f495-40a2-81b9-2fa4e968f539", "address": "fa:16:3e:50:87:d7", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8014260a-f4", "ovs_interfaceid": "8014260a-f495-40a2-81b9-2fa4e968f539", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.423 182717 DEBUG nova.network.os_vif_util [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:87:d7,bridge_name='br-int',has_traffic_filtering=True,id=8014260a-f495-40a2-81b9-2fa4e968f539,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8014260a-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.425 182717 DEBUG nova.objects.instance [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'pci_devices' on Instance uuid 0be60507-2a72-40c2-8ec7-86c829eacd52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.445 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:56:17 compute-1 nova_compute[182713]:   <uuid>0be60507-2a72-40c2-8ec7-86c829eacd52</uuid>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   <name>instance-00000040</name>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <nova:name>tempest-ImagesTestJSON-server-560731900</nova:name>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:56:17</nova:creationTime>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:56:17 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:56:17 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:56:17 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:56:17 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:56:17 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:56:17 compute-1 nova_compute[182713]:         <nova:user uuid="6eb1bcf645844eaca088761a04e59542">tempest-ImagesTestJSON-126431515-project-member</nova:user>
Jan 21 23:56:17 compute-1 nova_compute[182713]:         <nova:project uuid="63e5713bcd4c429796b251487b6136bc">tempest-ImagesTestJSON-126431515</nova:project>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:56:17 compute-1 nova_compute[182713]:         <nova:port uuid="8014260a-f495-40a2-81b9-2fa4e968f539">
Jan 21 23:56:17 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <system>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <entry name="serial">0be60507-2a72-40c2-8ec7-86c829eacd52</entry>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <entry name="uuid">0be60507-2a72-40c2-8ec7-86c829eacd52</entry>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     </system>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   <os>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   </os>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   <features>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   </features>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk.config"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:50:87:d7"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <target dev="tap8014260a-f4"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/console.log" append="off"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <video>
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     </video>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:56:17 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:56:17 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:56:17 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:56:17 compute-1 nova_compute[182713]: </domain>
Jan 21 23:56:17 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.446 182717 DEBUG nova.compute.manager [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Preparing to wait for external event network-vif-plugged-8014260a-f495-40a2-81b9-2fa4e968f539 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.447 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.447 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.447 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.448 182717 DEBUG nova.virt.libvirt.vif [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-560731900',display_name='tempest-ImagesTestJSON-server-560731900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-560731900',id=64,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-1ccb7p5b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:12Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=0be60507-2a72-40c2-8ec7-86c829eacd52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8014260a-f495-40a2-81b9-2fa4e968f539", "address": "fa:16:3e:50:87:d7", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8014260a-f4", "ovs_interfaceid": "8014260a-f495-40a2-81b9-2fa4e968f539", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.449 182717 DEBUG nova.network.os_vif_util [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "8014260a-f495-40a2-81b9-2fa4e968f539", "address": "fa:16:3e:50:87:d7", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8014260a-f4", "ovs_interfaceid": "8014260a-f495-40a2-81b9-2fa4e968f539", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.450 182717 DEBUG nova.network.os_vif_util [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:87:d7,bridge_name='br-int',has_traffic_filtering=True,id=8014260a-f495-40a2-81b9-2fa4e968f539,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8014260a-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.450 182717 DEBUG os_vif [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:87:d7,bridge_name='br-int',has_traffic_filtering=True,id=8014260a-f495-40a2-81b9-2fa4e968f539,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8014260a-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.451 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.452 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.452 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.457 182717 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Successfully created port: 1f2706f6-320f-42cf-8e88-b7cb375b001a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.461 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.461 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8014260a-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.462 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8014260a-f4, col_values=(('external_ids', {'iface-id': '8014260a-f495-40a2-81b9-2fa4e968f539', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:87:d7', 'vm-uuid': '0be60507-2a72-40c2-8ec7-86c829eacd52'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.463 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:17 compute-1 NetworkManager[54952]: <info>  [1769039777.4652] manager: (tap8014260a-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.466 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.471 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.474 182717 INFO os_vif [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:87:d7,bridge_name='br-int',has_traffic_filtering=True,id=8014260a-f495-40a2-81b9-2fa4e968f539,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8014260a-f4')
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.553 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.554 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.554 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No VIF found with MAC fa:16:3e:50:87:d7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:56:17 compute-1 nova_compute[182713]: 2026-01-21 23:56:17.554 182717 INFO nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Using config drive
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.002 182717 INFO nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Creating config drive at /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk.config
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.011 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpedq2zmlp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.149 182717 DEBUG oslo_concurrency.processutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpedq2zmlp" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:18 compute-1 kernel: tap8014260a-f4: entered promiscuous mode
Jan 21 23:56:18 compute-1 NetworkManager[54952]: <info>  [1769039778.2121] manager: (tap8014260a-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Jan 21 23:56:18 compute-1 systemd-udevd[219809]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:56:18 compute-1 ovn_controller[94841]: 2026-01-21T23:56:18Z|00192|binding|INFO|Claiming lport 8014260a-f495-40a2-81b9-2fa4e968f539 for this chassis.
Jan 21 23:56:18 compute-1 ovn_controller[94841]: 2026-01-21T23:56:18Z|00193|binding|INFO|8014260a-f495-40a2-81b9-2fa4e968f539: Claiming fa:16:3e:50:87:d7 10.100.0.8
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.274 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:18 compute-1 NetworkManager[54952]: <info>  [1769039778.2841] device (tap8014260a-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:56:18 compute-1 NetworkManager[54952]: <info>  [1769039778.2852] device (tap8014260a-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.293 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:87:d7 10.100.0.8'], port_security=['fa:16:3e:50:87:d7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=8014260a-f495-40a2-81b9-2fa4e968f539) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.294 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 8014260a-f495-40a2-81b9-2fa4e968f539 in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c bound to our chassis
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.295 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:56:18 compute-1 ovn_controller[94841]: 2026-01-21T23:56:18Z|00194|binding|INFO|Setting lport 8014260a-f495-40a2-81b9-2fa4e968f539 up in Southbound
Jan 21 23:56:18 compute-1 ovn_controller[94841]: 2026-01-21T23:56:18Z|00195|binding|INFO|Setting lport 8014260a-f495-40a2-81b9-2fa4e968f539 ovn-installed in OVS
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.304 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.315 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1a7d0c27-91d9-4668-8f00-cab99dc3a333]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.316 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap74e2da48-41 in ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.318 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap74e2da48-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.318 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7f0e5e31-419f-4132-baf6-a6e74b6fef82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.319 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5663e24d-92db-4dfc-be6c-5f0f167d0f16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 systemd-machined[153970]: New machine qemu-30-instance-00000040.
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.328 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[ec7c55af-39f9-4c0d-9039-ae4527e6d569]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 systemd[1]: Started Virtual Machine qemu-30-instance-00000040.
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.361 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5f376114-35a7-4f81-8c8a-d8a9a4d23c08]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.392 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c27523-7cb0-4927-ba4b-0168cc051a5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.397 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8da72af6-a80d-40bf-81f9-9137ba29182d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 NetworkManager[54952]: <info>  [1769039778.3987] manager: (tap74e2da48-40): new Veth device (/org/freedesktop/NetworkManager/Devices/98)
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.436 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[def51057-294d-4121-885f-38da72414ccd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.440 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7755bb61-9590-4346-9c8a-b121dc6d387a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 NetworkManager[54952]: <info>  [1769039778.4757] device (tap74e2da48-40): carrier: link connected
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.483 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a95d0da1-231b-419b-b30d-4a9c5bb7c068]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.506 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5041852c-7864-4a73-942b-2033cc82b2cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435112, 'reachable_time': 29732, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219845, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.527 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9a38cfb6-f248-458d-a219-bd2c5c5282f5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:7549'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435112, 'tstamp': 435112}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219846, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.550 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[665faf4c-d7ce-45c5-99d6-e362eb5e38b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435112, 'reachable_time': 29732, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219847, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.590 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5ab70896-6632-4327-9d43-98fd135f133c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.681 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[37339487-72cf-4c8f-b54a-d26764296ab9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.682 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.683 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.683 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74e2da48-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:18 compute-1 NetworkManager[54952]: <info>  [1769039778.6857] manager: (tap74e2da48-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.685 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:18 compute-1 kernel: tap74e2da48-40: entered promiscuous mode
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.690 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74e2da48-40, col_values=(('external_ids', {'iface-id': '5f8f321e-2942-4700-a50e-4b0628052c1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.696 182717 DEBUG nova.compute.manager [req-c2183038-4275-457d-a1f9-10c272145c3d req-7a8cd0d4-0820-434c-b5d7-0e6bdfd4d727 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Received event network-vif-plugged-8014260a-f495-40a2-81b9-2fa4e968f539 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.697 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:56:18 compute-1 ovn_controller[94841]: 2026-01-21T23:56:18Z|00196|binding|INFO|Releasing lport 5f8f321e-2942-4700-a50e-4b0628052c1b from this chassis (sb_readonly=0)
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.698 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[42c601c0-0fda-4432-91d3-788416f465c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.699 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:56:18 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:18.699 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'env', 'PROCESS_TAG=haproxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/74e2da48-44c2-4c6d-9597-6c47d6247f9c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.699 182717 DEBUG oslo_concurrency.lockutils [req-c2183038-4275-457d-a1f9-10c272145c3d req-7a8cd0d4-0820-434c-b5d7-0e6bdfd4d727 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.701 182717 DEBUG oslo_concurrency.lockutils [req-c2183038-4275-457d-a1f9-10c272145c3d req-7a8cd0d4-0820-434c-b5d7-0e6bdfd4d727 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.702 182717 DEBUG oslo_concurrency.lockutils [req-c2183038-4275-457d-a1f9-10c272145c3d req-7a8cd0d4-0820-434c-b5d7-0e6bdfd4d727 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.702 182717 DEBUG nova.compute.manager [req-c2183038-4275-457d-a1f9-10c272145c3d req-7a8cd0d4-0820-434c-b5d7-0e6bdfd4d727 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Processing event network-vif-plugged-8014260a-f495-40a2-81b9-2fa4e968f539 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.703 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.723 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.802 182717 DEBUG nova.compute.manager [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.811 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039778.802915, 0be60507-2a72-40c2-8ec7-86c829eacd52 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.812 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] VM Started (Lifecycle Event)
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.826 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.830 182717 INFO nova.virt.libvirt.driver [-] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Instance spawned successfully.
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.830 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.839 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.843 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.874 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.874 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039778.8030634, 0be60507-2a72-40c2-8ec7-86c829eacd52 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.875 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] VM Paused (Lifecycle Event)
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.881 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.881 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.882 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.882 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.883 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.884 182717 DEBUG nova.virt.libvirt.driver [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.918 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.922 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039778.8170378, 0be60507-2a72-40c2-8ec7-86c829eacd52 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.923 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] VM Resumed (Lifecycle Event)
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.959 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.962 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.996 182717 INFO nova.compute.manager [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Took 6.35 seconds to spawn the instance on the hypervisor.
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.996 182717 DEBUG nova.compute.manager [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:18 compute-1 nova_compute[182713]: 2026-01-21 23:56:18.998 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:56:19 compute-1 nova_compute[182713]: 2026-01-21 23:56:19.115 182717 INFO nova.compute.manager [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Took 7.08 seconds to build instance.
Jan 21 23:56:19 compute-1 podman[219886]: 2026-01-21 23:56:19.163885225 +0000 UTC m=+0.082316205 container create 4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:56:19 compute-1 nova_compute[182713]: 2026-01-21 23:56:19.168 182717 DEBUG oslo_concurrency.lockutils [None req-6a023b4e-6f8f-4f6b-bf35-4658124c9d52 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:19 compute-1 podman[219886]: 2026-01-21 23:56:19.122122049 +0000 UTC m=+0.040553109 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:56:19 compute-1 systemd[1]: Started libpod-conmon-4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01.scope.
Jan 21 23:56:19 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:56:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72c1f367355eea11a663bee895b2efccc8b163be9b3ed0f08d872d937ac5042a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:56:19 compute-1 podman[219886]: 2026-01-21 23:56:19.290001559 +0000 UTC m=+0.208432539 container init 4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 23:56:19 compute-1 podman[219886]: 2026-01-21 23:56:19.300543715 +0000 UTC m=+0.218974695 container start 4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 23:56:19 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219901]: [NOTICE]   (219905) : New worker (219907) forked
Jan 21 23:56:19 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219901]: [NOTICE]   (219905) : Loading success.
Jan 21 23:56:19 compute-1 nova_compute[182713]: 2026-01-21 23:56:19.506 182717 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Successfully updated port: 1f2706f6-320f-42cf-8e88-b7cb375b001a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:56:19 compute-1 nova_compute[182713]: 2026-01-21 23:56:19.558 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "refresh_cache-03db37ed-d870-40ec-86f5-db23a9180dc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:56:19 compute-1 nova_compute[182713]: 2026-01-21 23:56:19.558 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquired lock "refresh_cache-03db37ed-d870-40ec-86f5-db23a9180dc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:56:19 compute-1 nova_compute[182713]: 2026-01-21 23:56:19.559 182717 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:56:19 compute-1 nova_compute[182713]: 2026-01-21 23:56:19.813 182717 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:56:19 compute-1 nova_compute[182713]: 2026-01-21 23:56:19.935 182717 DEBUG nova.network.neutron [req-c9e8c54f-8249-4823-a5f6-6bf2fead6cfd req-8b375db3-5c61-4db7-b30c-265fae673541 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Updated VIF entry in instance network info cache for port 8014260a-f495-40a2-81b9-2fa4e968f539. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:56:19 compute-1 nova_compute[182713]: 2026-01-21 23:56:19.936 182717 DEBUG nova.network.neutron [req-c9e8c54f-8249-4823-a5f6-6bf2fead6cfd req-8b375db3-5c61-4db7-b30c-265fae673541 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Updating instance_info_cache with network_info: [{"id": "8014260a-f495-40a2-81b9-2fa4e968f539", "address": "fa:16:3e:50:87:d7", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8014260a-f4", "ovs_interfaceid": "8014260a-f495-40a2-81b9-2fa4e968f539", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:19 compute-1 nova_compute[182713]: 2026-01-21 23:56:19.974 182717 DEBUG oslo_concurrency.lockutils [req-c9e8c54f-8249-4823-a5f6-6bf2fead6cfd req-8b375db3-5c61-4db7-b30c-265fae673541 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-0be60507-2a72-40c2-8ec7-86c829eacd52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:56:20 compute-1 nova_compute[182713]: 2026-01-21 23:56:20.937 182717 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Updating instance_info_cache with network_info: [{"id": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "address": "fa:16:3e:38:ee:b8", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f2706f6-32", "ovs_interfaceid": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:20 compute-1 nova_compute[182713]: 2026-01-21 23:56:20.980 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Releasing lock "refresh_cache-03db37ed-d870-40ec-86f5-db23a9180dc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:56:20 compute-1 nova_compute[182713]: 2026-01-21 23:56:20.981 182717 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Instance network_info: |[{"id": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "address": "fa:16:3e:38:ee:b8", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f2706f6-32", "ovs_interfaceid": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:56:20 compute-1 nova_compute[182713]: 2026-01-21 23:56:20.983 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Start _get_guest_xml network_info=[{"id": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "address": "fa:16:3e:38:ee:b8", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f2706f6-32", "ovs_interfaceid": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:56:20 compute-1 nova_compute[182713]: 2026-01-21 23:56:20.988 182717 WARNING nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:56:20 compute-1 nova_compute[182713]: 2026-01-21 23:56:20.994 182717 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:56:20 compute-1 nova_compute[182713]: 2026-01-21 23:56:20.994 182717 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:56:20 compute-1 nova_compute[182713]: 2026-01-21 23:56:20.998 182717 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:56:20 compute-1 nova_compute[182713]: 2026-01-21 23:56:20.999 182717 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.000 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.000 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.001 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.001 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.001 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.001 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.001 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.002 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.002 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.002 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.003 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.003 182717 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.006 182717 DEBUG nova.virt.libvirt.vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1677728672',display_name='tempest-ListServersNegativeTestJSON-server-1677728672-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1677728672-3',id=67,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='414437860afc460b9e86d674975e9d1f',ramdisk_id='',reservation_id='r-dznclk2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1787990789',owner_user_name='te
mpest-ListServersNegativeTestJSON-1787990789-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:14Z,user_data=None,user_id='9a4a4a5f3c9f4c5091261592272bcb81',uuid=03db37ed-d870-40ec-86f5-db23a9180dc8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "address": "fa:16:3e:38:ee:b8", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f2706f6-32", "ovs_interfaceid": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.007 182717 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converting VIF {"id": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "address": "fa:16:3e:38:ee:b8", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f2706f6-32", "ovs_interfaceid": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.008 182717 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:ee:b8,bridge_name='br-int',has_traffic_filtering=True,id=1f2706f6-320f-42cf-8e88-b7cb375b001a,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f2706f6-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.009 182717 DEBUG nova.objects.instance [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 03db37ed-d870-40ec-86f5-db23a9180dc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.067 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:56:21 compute-1 nova_compute[182713]:   <uuid>03db37ed-d870-40ec-86f5-db23a9180dc8</uuid>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   <name>instance-00000043</name>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <nova:name>tempest-ListServersNegativeTestJSON-server-1677728672-3</nova:name>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:56:20</nova:creationTime>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:56:21 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:56:21 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:56:21 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:56:21 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:56:21 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:56:21 compute-1 nova_compute[182713]:         <nova:user uuid="9a4a4a5f3c9f4c5091261592272bcb81">tempest-ListServersNegativeTestJSON-1787990789-project-member</nova:user>
Jan 21 23:56:21 compute-1 nova_compute[182713]:         <nova:project uuid="414437860afc460b9e86d674975e9d1f">tempest-ListServersNegativeTestJSON-1787990789</nova:project>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:56:21 compute-1 nova_compute[182713]:         <nova:port uuid="1f2706f6-320f-42cf-8e88-b7cb375b001a">
Jan 21 23:56:21 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <system>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <entry name="serial">03db37ed-d870-40ec-86f5-db23a9180dc8</entry>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <entry name="uuid">03db37ed-d870-40ec-86f5-db23a9180dc8</entry>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     </system>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   <os>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   </os>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   <features>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   </features>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk.config"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:38:ee:b8"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <target dev="tap1f2706f6-32"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/console.log" append="off"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <video>
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     </video>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:56:21 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:56:21 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:56:21 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:56:21 compute-1 nova_compute[182713]: </domain>
Jan 21 23:56:21 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.069 182717 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Preparing to wait for external event network-vif-plugged-1f2706f6-320f-42cf-8e88-b7cb375b001a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.069 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.069 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.070 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.070 182717 DEBUG nova.virt.libvirt.vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1677728672',display_name='tempest-ListServersNegativeTestJSON-server-1677728672-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1677728672-3',id=67,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='414437860afc460b9e86d674975e9d1f',ramdisk_id='',reservation_id='r-dznclk2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1787990789',owner_use
r_name='tempest-ListServersNegativeTestJSON-1787990789-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:14Z,user_data=None,user_id='9a4a4a5f3c9f4c5091261592272bcb81',uuid=03db37ed-d870-40ec-86f5-db23a9180dc8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "address": "fa:16:3e:38:ee:b8", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f2706f6-32", "ovs_interfaceid": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.070 182717 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converting VIF {"id": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "address": "fa:16:3e:38:ee:b8", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f2706f6-32", "ovs_interfaceid": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.071 182717 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:ee:b8,bridge_name='br-int',has_traffic_filtering=True,id=1f2706f6-320f-42cf-8e88-b7cb375b001a,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f2706f6-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.071 182717 DEBUG os_vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:ee:b8,bridge_name='br-int',has_traffic_filtering=True,id=1f2706f6-320f-42cf-8e88-b7cb375b001a,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f2706f6-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.072 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.072 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.073 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.083 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.083 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f2706f6-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.083 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1f2706f6-32, col_values=(('external_ids', {'iface-id': '1f2706f6-320f-42cf-8e88-b7cb375b001a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:ee:b8', 'vm-uuid': '03db37ed-d870-40ec-86f5-db23a9180dc8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.085 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-1 NetworkManager[54952]: <info>  [1769039781.0864] manager: (tap1f2706f6-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.087 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.093 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.094 182717 INFO os_vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:ee:b8,bridge_name='br-int',has_traffic_filtering=True,id=1f2706f6-320f-42cf-8e88-b7cb375b001a,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f2706f6-32')
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.104 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.168 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.169 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.169 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] No VIF found with MAC fa:16:3e:38:ee:b8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.169 182717 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Using config drive
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.198 182717 DEBUG nova.compute.manager [req-f7a59538-2406-4448-9d20-70f52326ed93 req-0fce4b0b-827a-46a9-8271-2ab484753684 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Received event network-vif-plugged-8014260a-f495-40a2-81b9-2fa4e968f539 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.199 182717 DEBUG oslo_concurrency.lockutils [req-f7a59538-2406-4448-9d20-70f52326ed93 req-0fce4b0b-827a-46a9-8271-2ab484753684 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.199 182717 DEBUG oslo_concurrency.lockutils [req-f7a59538-2406-4448-9d20-70f52326ed93 req-0fce4b0b-827a-46a9-8271-2ab484753684 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.199 182717 DEBUG oslo_concurrency.lockutils [req-f7a59538-2406-4448-9d20-70f52326ed93 req-0fce4b0b-827a-46a9-8271-2ab484753684 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.200 182717 DEBUG nova.compute.manager [req-f7a59538-2406-4448-9d20-70f52326ed93 req-0fce4b0b-827a-46a9-8271-2ab484753684 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] No waiting events found dispatching network-vif-plugged-8014260a-f495-40a2-81b9-2fa4e968f539 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.200 182717 WARNING nova.compute.manager [req-f7a59538-2406-4448-9d20-70f52326ed93 req-0fce4b0b-827a-46a9-8271-2ab484753684 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Received unexpected event network-vif-plugged-8014260a-f495-40a2-81b9-2fa4e968f539 for instance with vm_state active and task_state None.
Jan 21 23:56:21 compute-1 podman[219919]: 2026-01-21 23:56:21.209194942 +0000 UTC m=+0.077044324 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute)
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.511 182717 DEBUG nova.compute.manager [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.639 182717 INFO nova.compute.manager [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] instance snapshotting
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.724 182717 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Creating config drive at /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk.config
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.736 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp518vccex execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.885 182717 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp518vccex" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:21 compute-1 kernel: tap1f2706f6-32: entered promiscuous mode
Jan 21 23:56:21 compute-1 NetworkManager[54952]: <info>  [1769039781.9691] manager: (tap1f2706f6-32): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Jan 21 23:56:21 compute-1 ovn_controller[94841]: 2026-01-21T23:56:21Z|00197|binding|INFO|Claiming lport 1f2706f6-320f-42cf-8e88-b7cb375b001a for this chassis.
Jan 21 23:56:21 compute-1 ovn_controller[94841]: 2026-01-21T23:56:21Z|00198|binding|INFO|1f2706f6-320f-42cf-8e88-b7cb375b001a: Claiming fa:16:3e:38:ee:b8 10.100.0.11
Jan 21 23:56:21 compute-1 nova_compute[182713]: 2026-01-21 23:56:21.972 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:21.983 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:ee:b8 10.100.0.11'], port_security=['fa:16:3e:38:ee:b8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-835f4434-3fa6-458b-b79c-b27830f531cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '414437860afc460b9e86d674975e9d1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '30db0ce4-28a9-4add-b257-f90dc081c48d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2a61a9c-1832-4a5f-89c7-e09ac8a1046e, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=1f2706f6-320f-42cf-8e88-b7cb375b001a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:21.986 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 1f2706f6-320f-42cf-8e88-b7cb375b001a in datapath 835f4434-3fa6-458b-b79c-b27830f531cf bound to our chassis
Jan 21 23:56:21 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:21.990 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 835f4434-3fa6-458b-b79c-b27830f531cf
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.007 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[92d98f93-1f38-46da-9efb-059d6afc1ebf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.008 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap835f4434-31 in ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:56:22 compute-1 systemd-udevd[219961]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.013 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap835f4434-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.013 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7c14e57c-92b2-4002-8c6d-6c295d52306b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.014 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6eaca21b-8dd3-4548-8e80-b31ed951c627]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 NetworkManager[54952]: <info>  [1769039782.0234] device (tap1f2706f6-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:56:22 compute-1 NetworkManager[54952]: <info>  [1769039782.0244] device (tap1f2706f6-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:56:22 compute-1 systemd-machined[153970]: New machine qemu-31-instance-00000043.
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.034 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:22 compute-1 systemd[1]: Started Virtual Machine qemu-31-instance-00000043.
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.039 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[87e59a93-7908-4e23-a45b-f52638637877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_controller[94841]: 2026-01-21T23:56:22Z|00199|binding|INFO|Setting lport 1f2706f6-320f-42cf-8e88-b7cb375b001a ovn-installed in OVS
Jan 21 23:56:22 compute-1 ovn_controller[94841]: 2026-01-21T23:56:22Z|00200|binding|INFO|Setting lport 1f2706f6-320f-42cf-8e88-b7cb375b001a up in Southbound
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.044 182717 DEBUG nova.compute.manager [req-2356e513-1313-4610-8e0b-100dd4f88e54 req-6fbcaa77-9f6c-4453-8f92-7a330cf9fa4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Received event network-changed-1f2706f6-320f-42cf-8e88-b7cb375b001a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.045 182717 DEBUG nova.compute.manager [req-2356e513-1313-4610-8e0b-100dd4f88e54 req-6fbcaa77-9f6c-4453-8f92-7a330cf9fa4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Refreshing instance network info cache due to event network-changed-1f2706f6-320f-42cf-8e88-b7cb375b001a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.045 182717 DEBUG oslo_concurrency.lockutils [req-2356e513-1313-4610-8e0b-100dd4f88e54 req-6fbcaa77-9f6c-4453-8f92-7a330cf9fa4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-03db37ed-d870-40ec-86f5-db23a9180dc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.046 182717 DEBUG oslo_concurrency.lockutils [req-2356e513-1313-4610-8e0b-100dd4f88e54 req-6fbcaa77-9f6c-4453-8f92-7a330cf9fa4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-03db37ed-d870-40ec-86f5-db23a9180dc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.047 182717 DEBUG nova.network.neutron [req-2356e513-1313-4610-8e0b-100dd4f88e54 req-6fbcaa77-9f6c-4453-8f92-7a330cf9fa4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Refreshing network info cache for port 1f2706f6-320f-42cf-8e88-b7cb375b001a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.050 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.056 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6349ab55-d7b0-4460-aac4-b6d152b6ec5e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.061 182717 INFO nova.virt.libvirt.driver [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Beginning live snapshot process
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.083 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[05f8c449-aeb2-4ba9-8474-e3482430b2e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.089 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba34660-3077-47b1-b0c4-2f74db6cec16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 NetworkManager[54952]: <info>  [1769039782.0900] manager: (tap835f4434-30): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Jan 21 23:56:22 compute-1 systemd-udevd[219965]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.124 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e194f4eb-474b-4891-9a2f-3af7aed67543]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.127 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc3a843-d83f-48c9-8970-e14ce38f669d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 NetworkManager[54952]: <info>  [1769039782.1464] device (tap835f4434-30): carrier: link connected
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.152 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[9bce746d-593d-48c0-95c5-3569662b1879]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.170 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7d7121b8-ed34-4267-803d-f3d519b718b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap835f4434-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:51:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435479, 'reachable_time': 38478, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219994, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.191 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b61a180c-a7da-45ae-9e0e-1dce3024615d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3d:5107'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435479, 'tstamp': 435479}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219995, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.205 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d06d2d37-a132-44b3-a378-13509a478988]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap835f4434-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:51:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435479, 'reachable_time': 38478, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219996, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.235 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c25428-88e3-4562-bc85-b7459a23c2b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.310 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[59d44ae9-2ce1-48b2-ae37-c98c08dee376]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.312 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap835f4434-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.313 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.313 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap835f4434-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.316 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:22 compute-1 kernel: tap835f4434-30: entered promiscuous mode
Jan 21 23:56:22 compute-1 NetworkManager[54952]: <info>  [1769039782.3195] manager: (tap835f4434-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.319 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.321 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap835f4434-30, col_values=(('external_ids', {'iface-id': '8bc16eeb-6666-4300-9ce8-0a810442a173'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.323 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:22 compute-1 ovn_controller[94841]: 2026-01-21T23:56:22Z|00201|binding|INFO|Releasing lport 8bc16eeb-6666-4300-9ce8-0a810442a173 from this chassis (sb_readonly=0)
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.324 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.328 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/835f4434-3fa6-458b-b79c-b27830f531cf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/835f4434-3fa6-458b-b79c-b27830f531cf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.329 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d0f557-20ed-4911-a987-0b4b73f1d396]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.330 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-835f4434-3fa6-458b-b79c-b27830f531cf
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/835f4434-3fa6-458b-b79c-b27830f531cf.pid.haproxy
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 835f4434-3fa6-458b-b79c-b27830f531cf
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:56:22 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:22.331 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'env', 'PROCESS_TAG=haproxy-835f4434-3fa6-458b-b79c-b27830f531cf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/835f4434-3fa6-458b-b79c-b27830f531cf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.335 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:22 compute-1 virtqemud[182235]: invalid argument: disk vda does not have an active block job
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.355 182717 DEBUG oslo_concurrency.processutils [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.415 182717 DEBUG oslo_concurrency.processutils [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk --force-share --output=json -f qcow2" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.416 182717 DEBUG oslo_concurrency.processutils [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.516 182717 DEBUG oslo_concurrency.processutils [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52/disk --force-share --output=json -f qcow2" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.533 182717 DEBUG oslo_concurrency.processutils [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.576 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039782.575541, 03db37ed-d870-40ec-86f5-db23a9180dc8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.577 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] VM Started (Lifecycle Event)
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.610 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.620 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039782.5760524, 03db37ed-d870-40ec-86f5-db23a9180dc8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.621 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] VM Paused (Lifecycle Event)
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.624 182717 DEBUG oslo_concurrency.processutils [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.625 182717 DEBUG oslo_concurrency.processutils [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpqv11c2ya/d21b70ead57b4e748a98f0911e3fb375.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.649 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.655 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.665 182717 DEBUG oslo_concurrency.processutils [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpqv11c2ya/d21b70ead57b4e748a98f0911e3fb375.delta 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.666 182717 INFO nova.virt.libvirt.driver [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.712 182717 DEBUG nova.virt.libvirt.guest [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.715 182717 INFO nova.virt.libvirt.driver [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.744 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.761 182717 DEBUG nova.privsep.utils [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.762 182717 DEBUG oslo_concurrency.processutils [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpqv11c2ya/d21b70ead57b4e748a98f0911e3fb375.delta /var/lib/nova/instances/snapshots/tmpqv11c2ya/d21b70ead57b4e748a98f0911e3fb375 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:22 compute-1 podman[220052]: 2026-01-21 23:56:22.779392287 +0000 UTC m=+0.063252349 container create f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 23:56:22 compute-1 systemd[1]: Started libpod-conmon-f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4.scope.
Jan 21 23:56:22 compute-1 podman[220052]: 2026-01-21 23:56:22.747353709 +0000 UTC m=+0.031213811 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:56:22 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:56:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36a143b95759f027051f25d1ff22d50c7f9aa731774182c64315d1ff4d25030/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.868 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003a', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '70b1c9f8be0042aa8de9841a26729700', 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'hostId': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.871 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'name': 'tempest-ImagesTestJSON-server-560731900', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000040', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '63e5713bcd4c429796b251487b6136bc', 'user_id': '6eb1bcf645844eaca088761a04e59542', 'hostId': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.872 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000043', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '414437860afc460b9e86d674975e9d1f', 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'hostId': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.872 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 23:56:22 compute-1 podman[220052]: 2026-01-21 23:56:22.888320871 +0000 UTC m=+0.172180963 container init f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.890 182717 DEBUG oslo_concurrency.processutils [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpqv11c2ya/d21b70ead57b4e748a98f0911e3fb375.delta /var/lib/nova/instances/snapshots/tmpqv11c2ya/d21b70ead57b4e748a98f0911e3fb375" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:22 compute-1 nova_compute[182713]: 2026-01-21 23:56:22.891 182717 INFO nova.virt.libvirt.driver [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Snapshot extracted, beginning image upload
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.896 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.write.bytes volume: 135168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.897 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:22 compute-1 podman[220052]: 2026-01-21 23:56:22.899180365 +0000 UTC m=+0.183040427 container start f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.924 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.925 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:22 compute-1 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220076]: [NOTICE]   (220080) : New worker (220082) forked
Jan 21 23:56:22 compute-1 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220076]: [NOTICE]   (220080) : Loading success.
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.964 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.965 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '192d3c99-c754-465f-81ee-d17ed5f23840', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135168, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-vda', 'timestamp': '2026-01-21T23:56:22.873190', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca09380e-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': 'bdbac20e99c54f4e60fc7fd70256acca64f34d8aedb89a9930e66978a58b0142'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 
'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-sda', 'timestamp': '2026-01-21T23:56:22.873190', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca094bc8-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': 'ddb29597e8bd055b1adfd682a44e7aa8129163e35a435f4c1a3d4dabb8cb3a93'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-vda', 'timestamp': '2026-01-21T23:56:22.873190', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca0d81a2-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': '0dae25aa43cd3d2b86d05db792f8c9fc5edf58f8764ad7b2bc09b9a25da40a2c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-sda', 'timestamp': '2026-01-21T23:56:22.873190', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca0d8ba2-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': '6c12a462121b75728ddf6e8570a4a04b7d52711eb6dfd569f0bf2320398e9b49'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': 
'414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-vda', 'timestamp': '2026-01-21T23:56:22.873190', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca13948e-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': 'ef91081473c6519736db0be0b6888d9ff15916ed4f638db9940c71c0f981e9c2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-sda', 'timestamp': '2026-01-21T23:56:22.873190', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0,
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]:  'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca139fce-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': '4e8b14a93cb62caf2e5bad37fc3706feab4d03ac89dc687901353d091531bf32'}]}, 'timestamp': '2026-01-21 23:56:22.965435', '_unique_id': '2994f3844cbc49ffbd8a9d5e1dfbfb91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.968 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.972 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 / tapb767fb64-f4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.972 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.974 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0be60507-2a72-40c2-8ec7-86c829eacd52 / tap8014260a-f4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.975 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.977 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 03db37ed-d870-40ec-86f5-db23a9180dc8 / tap1f2706f6-32 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.977 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfebca34-bb38-45d4-8bf3-b2dc9cbea94d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003a-22742d9b-a6a8-4f10-a17f-a9704a1f8f43-tapb767fb64-f4', 'timestamp': '2026-01-21T23:56:22.969011', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'tapb767fb64-f4', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:99:ca', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb767fb64-f4'}, 'message_id': 'ca14c4c6-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.676249775, 'message_signature': 'e0d847cb17fae074e3fccf271799111791978b7a6f27d62f43841d1d9d5115a9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000040-0be60507-2a72-40c2-8ec7-86c829eacd52-tap8014260a-f4', 'timestamp': '2026-01-21T23:56:22.969011', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'tap8014260a-f4', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:87:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8014260a-f4'}, 'message_id': 'ca152984-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.680236019, 'message_signature': '0db7627e370d92a8dfd2d442a5ce69c9c94273a59363493a035141f180ac86dd'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000043-03db37ed-d870-40ec-86f5-db23a9180dc8-tap1f2706f6-32', 'timestamp': '2026-01-21T23:56:22.969011', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'tap1f2706f6-32', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 
'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:ee:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f2706f6-32'}, 'message_id': 'ca159018-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.682720395, 'message_signature': 'dae06adabb7088fccd54bfab42f6a6b9d8eee9ac388056385460da60062af162'}]}, 'timestamp': '2026-01-21 23:56:22.978149', '_unique_id': 'ef808cc99e8e43a18fc177de78c29961'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.979 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.980 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 23:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:22.997 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/memory.usage volume: 40.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.015 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.015 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 0be60507-2a72-40c2-8ec7-86c829eacd52: ceilometer.compute.pollsters.NoVolumeException
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.042 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.042 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 03db37ed-d870-40ec-86f5-db23a9180dc8: ceilometer.compute.pollsters.NoVolumeException
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '338a96b6-055c-428d-bef7-552d2ad341b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.38671875, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'timestamp': '2026-01-21T23:56:22.980712', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ca1882c8-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.704111053, 'message_signature': '265d9cec871132db97701ccb37e15e092818787567584b9b36f7a9ba0c7863a5'}]}, 'timestamp': '2026-01-21 23:56:23.043099', '_unique_id': '7e5501c7fb5f4643a0a93680c5a479e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.044 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.055 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.055 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.068 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.069 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.080 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.081 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8a72c3f-93db-468f-8558-7afbd3a78cbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-vda', 'timestamp': '2026-01-21T23:56:23.045130', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca216ed8-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.752354569, 'message_signature': '1087b59cb3c7554271558db91783e0710169457eb2dc340c1a8f62b45ce9fe36'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 
'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-sda', 'timestamp': '2026-01-21T23:56:23.045130', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2187a6-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.752354569, 'message_signature': '1548d8dac340e82780587f73262fc9f6af4c2120a6847ad1f1c18c3e66048656'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-vda', 'timestamp': '2026-01-21T23:56:23.045130', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2381e6-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.76373929, 'message_signature': '17cc18a6cf477bbbe98e2851fc3965f303992b29be79e7e64516d50d864e8c8d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-sda', 'timestamp': '2026-01-21T23:56:23.045130', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca239064-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.76373929, 'message_signature': 'c352d2d1c70c505188c8bc9237c29db1f9724c410300497e3187c9b4ef617d50'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': 
'414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-vda', 'timestamp': '2026-01-21T23:56:23.045130', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca25467a-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.777088431, 'message_signature': '3ab6b1b26f05061b4b14d0e7a708ba29d391053f6c862c72fb20dd08b49c123e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-sda', 'timestamp': '2026-01-21T23:56:23.045130', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 
'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca255d04-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.777088431, 'message_signature': '585f6a0baecc004b2c3d1543e5aac11a6b293f62806603531d7fde5db8742d67'}]}, 'timestamp': '2026-01-21 23:56:23.081671', '_unique_id': '621672b6224d4e2383f9d0e058f8f950'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.085 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.085 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.085 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bce646e-c5c6-4943-8bf3-fb280b53f0f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003a-22742d9b-a6a8-4f10-a17f-a9704a1f8f43-tapb767fb64-f4', 'timestamp': '2026-01-21T23:56:23.084969', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'tapb767fb64-f4', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:99:ca', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb767fb64-f4'}, 'message_id': 'ca25ecf6-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.676249775, 'message_signature': 'c41a30b2ea065979f8d866f6f93a8968b44249c95c9452b89ebe89a6e870b066'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 
'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000040-0be60507-2a72-40c2-8ec7-86c829eacd52-tap8014260a-f4', 'timestamp': '2026-01-21T23:56:23.084969', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'tap8014260a-f4', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:87:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8014260a-f4'}, 'message_id': 'ca25fe26-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.680236019, 'message_signature': 'e827ba2713799bd1e2944dcc4a9a242d56612c22eaca40a6092fe5b7f26aca23'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000043-03db37ed-d870-40ec-86f5-db23a9180dc8-tap1f2706f6-32', 'timestamp': '2026-01-21T23:56:23.084969', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'tap1f2706f6-32', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 
'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:ee:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f2706f6-32'}, 'message_id': 'ca260d30-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.682720395, 'message_signature': '29b8e46710859efe50c96b42829e66d6b3a0aca346cb20ca22464854cbec40ce'}]}, 'timestamp': '2026-01-21 23:56:23.086170', '_unique_id': 'c6469cb0897f4f4cabd33c1bc53d3361'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.087 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.088 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.088 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.089 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.089 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.089 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.090 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67433dc4-c2fd-4aa0-8abf-90988e3052e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-vda', 'timestamp': '2026-01-21T23:56:23.088392', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca26743c-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.752354569, 'message_signature': '93731a2eb7c767b080558e6f283a90962e3ccb61fc9b9c03df0ecbafd75e554d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 
'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-sda', 'timestamp': '2026-01-21T23:56:23.088392', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca26827e-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.752354569, 'message_signature': '72484b48f0022a55af03bff4caa14780aa67016fe19d765f5acd6bbd0646ddc2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-vda', 'timestamp': '2026-01-21T23:56:23.088392', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca268f3a-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.76373929, 'message_signature': '9fe9594d9890561422da0a4a9b3fa20525965f2104bff667c33757a9cb08f075'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-sda', 'timestamp': '2026-01-21T23:56:23.088392', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca269ebc-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.76373929, 'message_signature': '634595b48e612dbf6134409dfcc8617cf23460b8d60be0cc765e86195771c46c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': 
'414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-vda', 'timestamp': '2026-01-21T23:56:23.088392', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca26ac04-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.777088431, 'message_signature': 'c13d679c90ecbe92c893ba6d2f4fddd2274b0ff8459de305638784c7decaad65'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-sda', 'timestamp': '2026-01-21T23:56:23.088392', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb':
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]:  1, 'disk_name': 'sda'}, 'message_id': 'ca26b712-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.777088431, 'message_signature': '444608b4d41bb00d8b9dc17996889000132141bb3d60d825f283f476004d4a1a'}]}, 'timestamp': '2026-01-21 23:56:23.090496', '_unique_id': 'd24a51ece5e743a398f4cf01c8e62b27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.101 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.101 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.read.bytes volume: 32057344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.102 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.102 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.103 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.103 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.104 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '746b1de4-8062-405f-a5ac-567936b8ee53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32057344, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-vda', 'timestamp': '2026-01-21T23:56:23.101608', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca287b38-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': 'fcbd744ab58cec757b7b8be886f1ee9f7a3c53c54dbc049841bce7a08a8caea0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 
'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-sda', 'timestamp': '2026-01-21T23:56:23.101608', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2895be-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': 'af96c772477c23c161fa9713f99363eb41a7c92e0a8e5889584b0b54bfde1d55'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-vda', 'timestamp': '2026-01-21T23:56:23.101608', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca28a8a6-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': '9a7b3bd677efaab549df1592740ff8e404e7fd254e185a03956a045737d71cda'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-sda', 'timestamp': '2026-01-21T23:56:23.101608', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca28bb7a-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': 'd68f09ad148b229c67b7e378309495dbe1237b492e923ca3df1e44b2ba8425a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': 
'414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-vda', 'timestamp': '2026-01-21T23:56:23.101608', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca28ccc8-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': 'c20451aaf741be9716b6e4af7f591f19dc4de0b23d805cc6bca4db8eb4a793b8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-sda', 'timestamp': '2026-01-21T23:56:23.101608', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'epheme
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: ral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca28dd8a-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': 'd774b6bdd4c641abe41a4f792b3375313bbb02d5890bc5ed6549dae1e9fb31db'}]}, 'timestamp': '2026-01-21 23:56:23.104741', '_unique_id': '4e84a83890734e24a315e5d6c02e32a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.107 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.108 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.108 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.109 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.109 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.110 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2e836c5-f790-4d2d-a7bf-d1a9f51f45d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-vda', 'timestamp': '2026-01-21T23:56:23.107943', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca297038-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.752354569, 'message_signature': '1316cf8498a1d11a70bb65ad44670cf9ac093b879ecfd3864358aba9a0c4da23'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 
'22742d9b-a6a8-4f10-a17f-a9704a1f8f43-sda', 'timestamp': '2026-01-21T23:56:23.107943', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca29806e-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.752354569, 'message_signature': 'c0863c87a120f4e2f179ec36c2b5d2b835566cbaa83127e429d3f4f0f2b35b61'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-vda', 'timestamp': '2026-01-21T23:56:23.107943', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca29950e-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.76373929, 'message_signature': '88d7c83739465446ecaa4ef986fc44954309d87beae3f79ba91174714b3ac38b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-sda', 'timestamp': '2026-01-21T23:56:23.107943', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca29a486-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.76373929, 'message_signature': '4fd8e47a1e80ef697c497d432e90c293f1212652128882d5efbda3b1bde56748'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': 
'414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-vda', 'timestamp': '2026-01-21T23:56:23.107943', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca29b8a4-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.777088431, 'message_signature': 'a536324b2eef0015516eecfe1d8843a7aa9eb5fda17fdd63824d39ce48d3b8dd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-sda', 'timestamp': '2026-01-21T23:56:23.107943', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 
'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'mess
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: age_id': 'ca29c826-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.777088431, 'message_signature': '751ad471a9d6a68ed896c140db50574a50f4175113e75b17f75d6a27ff0191a4'}]}, 'timestamp': '2026-01-21 23:56:23.110657', '_unique_id': '517c1f89b0ba4c8dba04360f9fc4dd14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.113 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.113 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1374146863>, <NovaLikeServer: tempest-ImagesTestJSON-server-560731900>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-3>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1374146863>, <NovaLikeServer: tempest-ImagesTestJSON-server-560731900>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-3>]
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.114 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.114 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/cpu volume: 10970000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.114 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/cpu volume: 3840000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.115 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cc5e216-4391-4487-8aa9-5161a19d68a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10970000000, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'timestamp': '2026-01-21T23:56:23.114335', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ca2a6b46-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.704111053, 'message_signature': '4a9f45009105a0ed279051869a354db1a176f615396ea24c0d4bdcb0bac1a3a7'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3840000000, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 
'0be60507-2a72-40c2-8ec7-86c829eacd52', 'timestamp': '2026-01-21T23:56:23.114335', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ca2a8176-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.722137829, 'message_signature': '89f97a28f0b65768ea6bc1fd15093962114bb7acd7826914b61dfb37b7b7ae00'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'timestamp': '2026-01-21T23:56:23.114335', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': 
'9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ca2a91b6-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.749589904, 'message_signature': 'a5c89e8265abc7a7791abf5a0ecafdc4bb90ed6abfa962850446613ef7dc324f'}]}, 'timestamp': '2026-01-21 23:56:23.115820', '_unique_id': 'ce1896b4fe4c491bb707233dd7e82db2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.116 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.118 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.118 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.119 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd4c3142-cffd-46b4-8407-f8317a92c06f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003a-22742d9b-a6a8-4f10-a17f-a9704a1f8f43-tapb767fb64-f4', 'timestamp': '2026-01-21T23:56:23.118400', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'tapb767fb64-f4', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:99:ca', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb767fb64-f4'}, 'message_id': 'ca2b07a4-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.676249775, 'message_signature': 'fb310888899aee7a3ca0d4f0a8d33fdeb1418eee45bd7d417c5e8d3f90af0e62'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000040-0be60507-2a72-40c2-8ec7-86c829eacd52-tap8014260a-f4', 'timestamp': '2026-01-21T23:56:23.118400', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'tap8014260a-f4', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:87:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8014260a-f4'}, 'message_id': 'ca2b1d16-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.680236019, 'message_signature': 'dadf202a06d37415e8330d9e91bb90b217b709491a9df66ed662a1e46ec2e13a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000043-03db37ed-d870-40ec-86f5-db23a9180dc8-tap1f2706f6-32', 'timestamp': '2026-01-21T23:56:23.118400', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'tap1f2706f6-32', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 
'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:ee:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f2706f6-32'}, 'message_id': 'ca2b2e82-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.682720395, 'message_signature': 'e744b14d1a33519f0db76e0d826c70d8470941a1a98e8c653c33f08b8b245296'}]}, 'timestamp': '2026-01-21 23:56:23.119839', '_unique_id': 'f908a3f5f9c44d7ba1c09185b13c05b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.120 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.121 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.write.latency volume: 9131160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.122 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.122 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.122 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.123 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.123 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3eff4d68-559b-450c-9dd1-5c699345c747', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9131160, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-vda', 'timestamp': '2026-01-21T23:56:23.121779', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2b89c2-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': '0e8927a8bd5470a63140c43e55cd4a8141958dbdc0e9cff1812ef95f2d06cce8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': 
None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-sda', 'timestamp': '2026-01-21T23:56:23.121779', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2b96ce-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': '64ea65e5194b39637e03a79cc7ada1fefe039e87bf8172e3a13831f5c41e8973'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-vda', 'timestamp': '2026-01-21T23:56:23.121779', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2ba1f0-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': '4a0bdc0a8e0cda1aa301bbae97a059be6d59a72afcbb429fcd7b7be253217bd4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-sda', 'timestamp': '2026-01-21T23:56:23.121779', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2bad12-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': '84af98d7148fe69321c47b005f1ca6b7aef16399178f7b14611b58e7eee7d4e2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 
'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-vda', 'timestamp': '2026-01-21T23:56:23.121779', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2bb7e4-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': '8c645b51f83814bb0a1f7561bcc7a6df8af210376a78529522151e1b900d7566'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-sda', 'timestamp': '2026-01-21T23:56:23.121779', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]:  'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2bc536-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': 'aed9d5e318cd263147183f2f976a78273427ece2077c6509aab21ae158079b83'}]}, 'timestamp': '2026-01-21 23:56:23.123625', '_unique_id': '01b39caaa5c74c4b8707301f1d7dc66c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.125 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.125 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1374146863>, <NovaLikeServer: tempest-ImagesTestJSON-server-560731900>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-3>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1374146863>, <NovaLikeServer: tempest-ImagesTestJSON-server-560731900>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-3>]
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.125 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.126 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.126 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f51b00a-5404-4e77-bea8-73e00c592513', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003a-22742d9b-a6a8-4f10-a17f-a9704a1f8f43-tapb767fb64-f4', 'timestamp': '2026-01-21T23:56:23.125875', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'tapb767fb64-f4', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:99:ca', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb767fb64-f4'}, 'message_id': 'ca2c292c-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.676249775, 'message_signature': '166d1c9794d8fe53453d31dafcbdf4863d36ebe932e9993e0b9d3add421d677b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000040-0be60507-2a72-40c2-8ec7-86c829eacd52-tap8014260a-f4', 'timestamp': '2026-01-21T23:56:23.125875', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'tap8014260a-f4', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:87:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8014260a-f4'}, 'message_id': 'ca2c36d8-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.680236019, 'message_signature': 'f90536499e775fe2844726fb6ac3d0eff13b4da21f27d9558f067a1836757a50'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000043-03db37ed-d870-40ec-86f5-db23a9180dc8-tap1f2706f6-32', 'timestamp': '2026-01-21T23:56:23.125875', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'tap1f2706f6-32', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 
'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:ee:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f2706f6-32'}, 'message_id': 'ca2c431c-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.682720395, 'message_signature': 'd75090d286724b2e9479bce6d802e278599573083bb0a9f63fcafd644ffb8c8b'}]}, 'timestamp': '2026-01-21 23:56:23.126873', '_unique_id': '02d83cca5b654a05a9fa6a670c3a5319'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.127 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.128 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.128 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.129 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8fd5312-67f8-46f0-a6bb-f212b1eeb58a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003a-22742d9b-a6a8-4f10-a17f-a9704a1f8f43-tapb767fb64-f4', 'timestamp': '2026-01-21T23:56:23.128549', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'tapb767fb64-f4', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:99:ca', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb767fb64-f4'}, 'message_id': 'ca2c916e-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.676249775, 'message_signature': '766c3cf95e9e3fbef6d254177fd2a2603f2c99c434b88eb51a096a9fd933d657'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000040-0be60507-2a72-40c2-8ec7-86c829eacd52-tap8014260a-f4', 'timestamp': '2026-01-21T23:56:23.128549', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'tap8014260a-f4', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:87:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8014260a-f4'}, 'message_id': 'ca2c9e48-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.680236019, 'message_signature': '044d823c3d91581923bc1823e66ca1009339b1f6a6ded285ebce7a1cfec341ca'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000043-03db37ed-d870-40ec-86f5-db23a9180dc8-tap1f2706f6-32', 'timestamp': '2026-01-21T23:56:23.128549', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'tap1f2706f6-32', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 
'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:ee:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f2706f6-32'}, 'message_id': 'ca2ca9ec-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.682720395, 'message_signature': '6897faaf94d1db1d260ec90d6320be6dbe88a973e9c8a554d74dbfc835caa40b'}]}, 'timestamp': '2026-01-21 23:56:23.129594', '_unique_id': '4d37d2f3c7994c6f9b735f94fce5b518'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.130 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.131 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.131 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.132 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f75f5dea-494e-46ac-8580-a0b806fae2a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003a-22742d9b-a6a8-4f10-a17f-a9704a1f8f43-tapb767fb64-f4', 'timestamp': '2026-01-21T23:56:23.131285', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'tapb767fb64-f4', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:99:ca', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb767fb64-f4'}, 'message_id': 'ca2cfcee-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.676249775, 'message_signature': '60fbce2e56cca9f5f113671657faf1167f27199012a6d1b8b351a6300335f668'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000040-0be60507-2a72-40c2-8ec7-86c829eacd52-tap8014260a-f4', 'timestamp': '2026-01-21T23:56:23.131285', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'tap8014260a-f4', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:87:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8014260a-f4'}, 'message_id': 'ca2d0cca-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.680236019, 'message_signature': '64fe4abc0c84a8e39b298ae5c7873ad869f39bda98c3c091f15ec3023b646f49'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000043-03db37ed-d870-40ec-86f5-db23a9180dc8-tap1f2706f6-32', 'timestamp': '2026-01-21T23:56:23.131285', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'tap1f2706f6-32', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 
'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:ee:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f2706f6-32'}, 'message_id': 'ca2d1b02-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.682720395, 'message_signature': 'ea705cccf75e98e0bb1227dc11cd51e0b321584731824c47657762d2ee0acc0f'}]}, 'timestamp': '2026-01-21 23:56:23.132354', '_unique_id': '32cab626fc36422ab76b8bb79d2692b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.133 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.read.latency volume: 331104628 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.134 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.read.latency volume: 49993543 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.134 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.read.latency volume: 167366186 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.134 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.read.latency volume: 703542 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.134 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.134 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45878ffd-ce13-4241-83a8-a7b45a843c87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 331104628, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-vda', 'timestamp': '2026-01-21T23:56:23.133774', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2d5d88-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': '974f567a905a3a78039d19061f452466c46706cf8e99ab4df53957dd8c97b895'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49993543, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 
'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-sda', 'timestamp': '2026-01-21T23:56:23.133774', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2d6616-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': 'c9e4adfeeb9bd0cddd205f2b61772b5a4d4ab802d60feade52c97e9d611adbf2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 167366186, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-vda', 'timestamp': '2026-01-21T23:56:23.133774', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2d6e0e-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': '6d07fcab040e725c1905e45163b4ca4e10621e1a3f1c0e92b21630695555bbe3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 703542, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-sda', 'timestamp': '2026-01-21T23:56:23.133774', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2d766a-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': '862352577b083aff7b97e9ab045a6caad70b7b7d32807a96effae9d15e5cb803'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 
'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-vda', 'timestamp': '2026-01-21T23:56:23.133774', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2d8204-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': 'b00f99533bb6ff33ea9f639ec574ad88adc2dda6f005fb6e5b433e7fbccb5295'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-sda', 'timestamp': '2026-01-21T23:56:23.133774', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2d8a1a-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': '916de1422c1cce77d967fcbbea3704d14a0087a373b56a95f20dd7b23aec58d9'}]}, 'timestamp': '2026-01-21 23:56:23.135177', '_unique_id': 'ff9aef75fe11436d864bcfeee66fbe03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.136 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.136 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/network.incoming.packets.drop volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '233a7f5e-61eb-40fb-84ff-0f5b41b589a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003a-22742d9b-a6a8-4f10-a17f-a9704a1f8f43-tapb767fb64-f4', 'timestamp': '2026-01-21T23:56:23.136548', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'tapb767fb64-f4', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:99:ca', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb767fb64-f4'}, 'message_id': 'ca2dca48-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.676249775, 'message_signature': '47138b7cb6aa6e57dc9bd8f428191d4359a0c7111b63c2c4a80ac8d04650ee0f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 1, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000040-0be60507-2a72-40c2-8ec7-86c829eacd52-tap8014260a-f4', 'timestamp': '2026-01-21T23:56:23.136548', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'tap8014260a-f4', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:87:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8014260a-f4'}, 'message_id': 'ca2dd5c4-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.680236019, 'message_signature': '919f6a9d1eeeb2fa8372dbd81911d30d2b11207af8e41abe23af5150155d351f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000043-03db37ed-d870-40ec-86f5-db23a9180dc8-tap1f2706f6-32', 'timestamp': '2026-01-21T23:56:23.136548', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'tap1f2706f6-32', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 
'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:ee:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f2706f6-32'}, 'message_id': 'ca2dde8e-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.682720395, 'message_signature': '77bb5a8d563d82f9a820e40c1f11617c7c774e1bbbb27c83f864c7887e65946d'}]}, 'timestamp': '2026-01-21 23:56:23.137347', '_unique_id': '18b2a918772742aaaa3136f0fad3ebe5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.137 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.138 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.138 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1374146863>, <NovaLikeServer: tempest-ImagesTestJSON-server-560731900>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-3>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1374146863>, <NovaLikeServer: tempest-ImagesTestJSON-server-560731900>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-3>]
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.139 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.139 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1374146863>, <NovaLikeServer: tempest-ImagesTestJSON-server-560731900>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-3>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1374146863>, <NovaLikeServer: tempest-ImagesTestJSON-server-560731900>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-3>]
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.139 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.139 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.139 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a622574b-dc63-484b-8d1f-8465ee2196ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003a-22742d9b-a6a8-4f10-a17f-a9704a1f8f43-tapb767fb64-f4', 'timestamp': '2026-01-21T23:56:23.139313', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'tapb767fb64-f4', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:99:ca', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb767fb64-f4'}, 'message_id': 'ca2e3442-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.676249775, 'message_signature': '352399d819d8d3c05103f41218419ea0e30a035b9c9421f05e031bf6fcb01d80'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000040-0be60507-2a72-40c2-8ec7-86c829eacd52-tap8014260a-f4', 'timestamp': '2026-01-21T23:56:23.139313', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'tap8014260a-f4', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:87:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8014260a-f4'}, 'message_id': 'ca2e3cd0-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.680236019, 'message_signature': 'e77045e7b8b0a07d7bea9edba585eee63ac20b7e26e27fbe1cc07211966f52b9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000043-03db37ed-d870-40ec-86f5-db23a9180dc8-tap1f2706f6-32', 'timestamp': '2026-01-21T23:56:23.139313', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'tap1f2706f6-32', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 
'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:ee:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f2706f6-32'}, 'message_id': 'ca2e48ce-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.682720395, 'message_signature': 'e0c42318331c54b6c00bebd7a139ca70436a947867cb0d4a8489d8e31a41f188'}]}, 'timestamp': '2026-01-21 23:56:23.140100', '_unique_id': 'f394372b0e834c44be5f50933cbe9b3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.140 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.141 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/network.incoming.bytes volume: 748 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.141 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.141 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48c921da-8014-4a88-a064-72b1cb69ddca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 748, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003a-22742d9b-a6a8-4f10-a17f-a9704a1f8f43-tapb767fb64-f4', 'timestamp': '2026-01-21T23:56:23.141325', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'tapb767fb64-f4', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:99:ca', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb767fb64-f4'}, 'message_id': 'ca2e82a8-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.676249775, 'message_signature': '87581899138304bbba3df1cce4ab033da42b4be930d528ebadbad4c153363a62'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000040-0be60507-2a72-40c2-8ec7-86c829eacd52-tap8014260a-f4', 'timestamp': '2026-01-21T23:56:23.141325', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'tap8014260a-f4', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:87:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8014260a-f4'}, 'message_id': 'ca2e8b04-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.680236019, 'message_signature': '360462b3991d897953ccba393e7b390e4a2cbdafb91a11d377f0b14df65b299a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000043-03db37ed-d870-40ec-86f5-db23a9180dc8-tap1f2706f6-32', 'timestamp': '2026-01-21T23:56:23.141325', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'tap1f2706f6-32', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 
'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:ee:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f2706f6-32'}, 'message_id': 'ca2e9590-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.682720395, 'message_signature': 'e50461b6fcaef6d23f7b60f94251c20f907764dd3e67d561b054d384712b7b82'}]}, 'timestamp': '2026-01-21 23:56:23.142031', '_unique_id': 'dde4995a15c943938a742fbce047ab0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.142 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.143 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.read.requests volume: 1213 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.143 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.143 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.144 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.144 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.144 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22f95709-1b15-40d2-a946-8b8d8db96f04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1213, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-vda', 'timestamp': '2026-01-21T23:56:23.143392', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2ed474-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': 'd517a11d3db5220ba62066849090da32662cf5414f55bd020c5d68b76a67a42e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 
'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-sda', 'timestamp': '2026-01-21T23:56:23.143392', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2edc9e-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': 'f76b264c68f3554e7b1a577e533e14eceef8ef640732cbd7eda3acb79ad89a76'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-vda', 'timestamp': '2026-01-21T23:56:23.143392', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2ee752-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': 'a46822abb8899eb6af923b01d6206301b2be567ff7f47b31571ad17ebd59a15c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-sda', 'timestamp': '2026-01-21T23:56:23.143392', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2ef012-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': '858a0e6039f2b863c9dd1955e042d1ffe8c9abac3f440535bb6b18ce79b33781'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 
'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-vda', 'timestamp': '2026-01-21T23:56:23.143392', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2ef792-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': '5d81d38ac071f58600cd7bdd1ae4196c2b6af006e6b9adb73017a587e4a18bf7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-sda', 'timestamp': '2026-01-21T23:56:23.143392', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2efefe-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': '1293f48c8900c7ee305b9dfb6753d0ac565deb8a8b5334f0d556175a58c54f88'}]}, 'timestamp': '2026-01-21 23:56:23.144721', '_unique_id': 'f4e41f40409f4ee09244161a63b294aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.146 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.146 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.146 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87202b52-7ddd-4707-b87d-f1a4688f0929', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003a-22742d9b-a6a8-4f10-a17f-a9704a1f8f43-tapb767fb64-f4', 'timestamp': '2026-01-21T23:56:23.146095', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'tapb767fb64-f4', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:99:ca', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb767fb64-f4'}, 'message_id': 'ca2f3d1a-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.676249775, 'message_signature': 'c4593bf60aa712d054c324e0c8b1b1a206d1405524877927857210556ab5029a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': 'instance-00000040-0be60507-2a72-40c2-8ec7-86c829eacd52-tap8014260a-f4', 'timestamp': '2026-01-21T23:56:23.146095', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'tap8014260a-f4', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:87:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8014260a-f4'}, 'message_id': 'ca2f4544-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.680236019, 'message_signature': '0640c0f9a594b22ecbb17d5f864c48c6a097e5b201b6b123b3f8f14f3a563b5b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000043-03db37ed-d870-40ec-86f5-db23a9180dc8-tap1f2706f6-32', 'timestamp': '2026-01-21T23:56:23.146095', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'tap1f2706f6-32', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 
'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:ee:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f2706f6-32'}, 'message_id': 'ca2f4d28-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.682720395, 'message_signature': 'abda9768c3e8bb5225e86466833ff08d91533caab65b61949b38f52d9648dac9'}]}, 'timestamp': '2026-01-21 23:56:23.146728', '_unique_id': '463f4cb281e94bd289ddf86f1022e9fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.147 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.write.requests volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.148 12 DEBUG ceilometer.compute.pollsters [-] 22742d9b-a6a8-4f10-a17f-a9704a1f8f43/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.148 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.148 12 DEBUG ceilometer.compute.pollsters [-] 0be60507-2a72-40c2-8ec7-86c829eacd52/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.148 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 DEBUG ceilometer.compute.pollsters [-] 03db37ed-d870-40ec-86f5-db23a9180dc8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d6882ef-5cb9-47cc-9541-4f29c3370908', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 20, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-vda', 'timestamp': '2026-01-21T23:56:23.147922', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2f8784-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': 'd8d3e70f1b7173ed9af264362d7c37781b9b655acf4575b750d0701b6ae73505'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 
'project_name': None, 'resource_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43-sda', 'timestamp': '2026-01-21T23:56:23.147922', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1374146863', 'name': 'instance-0000003a', 'instance_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'instance_type': 'm1.nano', 'host': 'fa7884d5cd98c8b7c256bd4fdc70b5077636284be54ca29d74feaff4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2f92c4-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.580448455, 'message_signature': '747bdb8a909677f120b764259ce3a14e6951e6e2768c271820f8bc42da6f887e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-vda', 'timestamp': '2026-01-21T23:56:23.147922', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2f9b0c-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': '4cb402fcc602289a717a2d787e57172ddd4447fc9883303dd9f20cbfc21e469e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_name': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_name': None, 'resource_id': '0be60507-2a72-40c2-8ec7-86c829eacd52-sda', 'timestamp': '2026-01-21T23:56:23.147922', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-560731900', 'name': 'instance-00000040', 'instance_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'instance_type': 'm1.nano', 'host': 'd077fa559ec5930f9f2ebe56ec582b45241b18010f40f326cd90587e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2fa322-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.604927809, 'message_signature': '66c9c24a8d8b5c610d027b4d0aea3ac1b4d3ccaa0b6a2560acd29f29a3b7ad11'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 
'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-vda', 'timestamp': '2026-01-21T23:56:23.147922', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca2fae80-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': 'd82e4c050d16e8221cb3a16127b3ed4e7341e692647403f318f5ed4effc464a7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '03db37ed-d870-40ec-86f5-db23a9180dc8-sda', 'timestamp': '2026-01-21T23:56:23.147922', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-3', 'name': 'instance-00000043', 'instance_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'instance_type': 'm1.nano', 'host': 'a517d86b3070decca8e41497c8a7fe9ab28da082b264f32b2ddc7c04', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca2fb61e-f724-11f0-a0a4-fa163e934844', 'monotonic_time': 4355.632702695, 'message_signature': '92ecc282cb5278c7951ec7a65d8b47b4073822e755b9b1b02c486711284cd497'}]}, 'timestamp': '2026-01-21 23:56:23.149409', '_unique_id': 'b3c805148b4c43f790df1dda560c4ec7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-1 rsyslogd[1003]: message too long (8192) with configured size 8096, begin of message is: 2026-01-21 23:56:22.967 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 21 23:56:23 compute-1 rsyslogd[1003]: message too long (8192) with configured size 8096, begin of message is: 2026-01-21 23:56:23.083 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 21 23:56:23 compute-1 rsyslogd[1003]: message too long (8192) with configured size 8096, begin of message is: 2026-01-21 23:56:23.099 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 21 23:56:23 compute-1 rsyslogd[1003]: message too long (8192) with configured size 8096, begin of message is: 2026-01-21 23:56:23.106 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 21 23:56:23 compute-1 rsyslogd[1003]: message too long (8192) with configured size 8096, begin of message is: 2026-01-21 23:56:23.111 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 21 23:56:23 compute-1 rsyslogd[1003]: message too long (8192) with configured size 8096, begin of message is: 2026-01-21 23:56:23.124 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 21 23:56:23 compute-1 rsyslogd[1003]: message too long (8192) with configured size 8096, begin of message is: 2026-01-21 23:56:23.135 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 21 23:56:23 compute-1 rsyslogd[1003]: message too long (8192) with configured size 8096, begin of message is: 2026-01-21 23:56:23.145 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 21 23:56:23 compute-1 rsyslogd[1003]: message too long (8192) with configured size 8096, begin of message is: 2026-01-21 23:56:23.149 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 21 23:56:23 compute-1 nova_compute[182713]: 2026-01-21 23:56:23.588 182717 WARNING nova.compute.manager [None req-655dd1ba-8eed-4a57-b8b3-c8fc4849ece0 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Image not found during snapshot: nova.exception.ImageNotFound: Image 4ef4e650-c909-46a9-90d2-b3adaedd0b3d could not be found.
Jan 21 23:56:23 compute-1 ovn_controller[94841]: 2026-01-21T23:56:23Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:99:ca 10.100.0.10
Jan 21 23:56:23 compute-1 ovn_controller[94841]: 2026-01-21T23:56:23Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:99:ca 10.100.0.10
Jan 21 23:56:23 compute-1 nova_compute[182713]: 2026-01-21 23:56:23.798 182717 DEBUG nova.network.neutron [req-2356e513-1313-4610-8e0b-100dd4f88e54 req-6fbcaa77-9f6c-4453-8f92-7a330cf9fa4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Updated VIF entry in instance network info cache for port 1f2706f6-320f-42cf-8e88-b7cb375b001a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:56:23 compute-1 nova_compute[182713]: 2026-01-21 23:56:23.799 182717 DEBUG nova.network.neutron [req-2356e513-1313-4610-8e0b-100dd4f88e54 req-6fbcaa77-9f6c-4453-8f92-7a330cf9fa4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Updating instance_info_cache with network_info: [{"id": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "address": "fa:16:3e:38:ee:b8", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f2706f6-32", "ovs_interfaceid": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:23 compute-1 nova_compute[182713]: 2026-01-21 23:56:23.842 182717 DEBUG oslo_concurrency.lockutils [req-2356e513-1313-4610-8e0b-100dd4f88e54 req-6fbcaa77-9f6c-4453-8f92-7a330cf9fa4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-03db37ed-d870-40ec-86f5-db23a9180dc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.113 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039769.1115942, 2a5e9a44-f095-4122-8db9-4918b6ba22b7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.113 182717 INFO nova.compute.manager [-] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] VM Stopped (Lifecycle Event)
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.128 182717 DEBUG nova.compute.manager [req-4121ac80-3da6-42e2-ba5e-eb8064072abd req-028fa5b5-a387-459b-ad5c-81c4cbaab913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Received event network-vif-plugged-1f2706f6-320f-42cf-8e88-b7cb375b001a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.128 182717 DEBUG oslo_concurrency.lockutils [req-4121ac80-3da6-42e2-ba5e-eb8064072abd req-028fa5b5-a387-459b-ad5c-81c4cbaab913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.129 182717 DEBUG oslo_concurrency.lockutils [req-4121ac80-3da6-42e2-ba5e-eb8064072abd req-028fa5b5-a387-459b-ad5c-81c4cbaab913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.129 182717 DEBUG oslo_concurrency.lockutils [req-4121ac80-3da6-42e2-ba5e-eb8064072abd req-028fa5b5-a387-459b-ad5c-81c4cbaab913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.130 182717 DEBUG nova.compute.manager [req-4121ac80-3da6-42e2-ba5e-eb8064072abd req-028fa5b5-a387-459b-ad5c-81c4cbaab913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Processing event network-vif-plugged-1f2706f6-320f-42cf-8e88-b7cb375b001a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.130 182717 DEBUG nova.compute.manager [req-4121ac80-3da6-42e2-ba5e-eb8064072abd req-028fa5b5-a387-459b-ad5c-81c4cbaab913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Received event network-vif-plugged-1f2706f6-320f-42cf-8e88-b7cb375b001a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.130 182717 DEBUG oslo_concurrency.lockutils [req-4121ac80-3da6-42e2-ba5e-eb8064072abd req-028fa5b5-a387-459b-ad5c-81c4cbaab913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.131 182717 DEBUG oslo_concurrency.lockutils [req-4121ac80-3da6-42e2-ba5e-eb8064072abd req-028fa5b5-a387-459b-ad5c-81c4cbaab913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.131 182717 DEBUG oslo_concurrency.lockutils [req-4121ac80-3da6-42e2-ba5e-eb8064072abd req-028fa5b5-a387-459b-ad5c-81c4cbaab913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.131 182717 DEBUG nova.compute.manager [req-4121ac80-3da6-42e2-ba5e-eb8064072abd req-028fa5b5-a387-459b-ad5c-81c4cbaab913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] No waiting events found dispatching network-vif-plugged-1f2706f6-320f-42cf-8e88-b7cb375b001a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.131 182717 WARNING nova.compute.manager [req-4121ac80-3da6-42e2-ba5e-eb8064072abd req-028fa5b5-a387-459b-ad5c-81c4cbaab913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Received unexpected event network-vif-plugged-1f2706f6-320f-42cf-8e88-b7cb375b001a for instance with vm_state building and task_state spawning.
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.132 182717 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.148 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039784.1482868, 03db37ed-d870-40ec-86f5-db23a9180dc8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.149 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] VM Resumed (Lifecycle Event)
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.152 182717 DEBUG nova.compute.manager [None req-d6e7dee1-7ce2-448d-a000-aa46d296c3a2 - - - - - -] [instance: 2a5e9a44-f095-4122-8db9-4918b6ba22b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.153 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.174 182717 INFO nova.virt.libvirt.driver [-] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Instance spawned successfully.
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.174 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.184 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.188 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.220 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.224 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.225 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.226 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.227 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.228 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.229 182717 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.330 182717 INFO nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Took 9.38 seconds to spawn the instance on the hypervisor.
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.330 182717 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.438 182717 INFO nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Took 10.10 seconds to build instance.
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.461 182717 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:24 compute-1 podman[220091]: 2026-01-21 23:56:24.619356159 +0000 UTC m=+0.098232206 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, 
config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:24 compute-1 nova_compute[182713]: 2026-01-21 23:56:24.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.392 182717 DEBUG oslo_concurrency.lockutils [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "0be60507-2a72-40c2-8ec7-86c829eacd52" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.392 182717 DEBUG oslo_concurrency.lockutils [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.393 182717 DEBUG oslo_concurrency.lockutils [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.393 182717 DEBUG oslo_concurrency.lockutils [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.393 182717 DEBUG oslo_concurrency.lockutils [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.412 182717 INFO nova.compute.manager [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Terminating instance
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.424 182717 DEBUG nova.compute.manager [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:56:25 compute-1 kernel: tap8014260a-f4 (unregistering): left promiscuous mode
Jan 21 23:56:25 compute-1 NetworkManager[54952]: <info>  [1769039785.4463] device (tap8014260a-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:56:25 compute-1 ovn_controller[94841]: 2026-01-21T23:56:25Z|00202|binding|INFO|Releasing lport 8014260a-f495-40a2-81b9-2fa4e968f539 from this chassis (sb_readonly=0)
Jan 21 23:56:25 compute-1 ovn_controller[94841]: 2026-01-21T23:56:25Z|00203|binding|INFO|Setting lport 8014260a-f495-40a2-81b9-2fa4e968f539 down in Southbound
Jan 21 23:56:25 compute-1 ovn_controller[94841]: 2026-01-21T23:56:25Z|00204|binding|INFO|Removing iface tap8014260a-f4 ovn-installed in OVS
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.478 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.485 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:87:d7 10.100.0.8'], port_security=['fa:16:3e:50:87:d7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0be60507-2a72-40c2-8ec7-86c829eacd52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=8014260a-f495-40a2-81b9-2fa4e968f539) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.488 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 8014260a-f495-40a2-81b9-2fa4e968f539 in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c unbound from our chassis
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.491 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.493 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b4964c02-36f8-4cd0-a60b-040b9a2c83f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.494 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace which is not needed anymore
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.500 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:25 compute-1 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000040.scope: Deactivated successfully.
Jan 21 23:56:25 compute-1 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000040.scope: Consumed 7.059s CPU time.
Jan 21 23:56:25 compute-1 systemd-machined[153970]: Machine qemu-30-instance-00000040 terminated.
Jan 21 23:56:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219901]: [NOTICE]   (219905) : haproxy version is 2.8.14-c23fe91
Jan 21 23:56:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219901]: [NOTICE]   (219905) : path to executable is /usr/sbin/haproxy
Jan 21 23:56:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219901]: [WARNING]  (219905) : Exiting Master process...
Jan 21 23:56:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219901]: [WARNING]  (219905) : Exiting Master process...
Jan 21 23:56:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219901]: [ALERT]    (219905) : Current worker (219907) exited with code 143 (Terminated)
Jan 21 23:56:25 compute-1 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219901]: [WARNING]  (219905) : All workers exited. Exiting... (0)
Jan 21 23:56:25 compute-1 systemd[1]: libpod-4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01.scope: Deactivated successfully.
Jan 21 23:56:25 compute-1 conmon[219901]: conmon 4c76bbf8e04b54e43a64 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01.scope/container/memory.events
Jan 21 23:56:25 compute-1 podman[220136]: 2026-01-21 23:56:25.653210046 +0000 UTC m=+0.057448509 container died 4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 23:56:25 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01-userdata-shm.mount: Deactivated successfully.
Jan 21 23:56:25 compute-1 systemd[1]: var-lib-containers-storage-overlay-72c1f367355eea11a663bee895b2efccc8b163be9b3ed0f08d872d937ac5042a-merged.mount: Deactivated successfully.
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.699 182717 INFO nova.virt.libvirt.driver [-] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Instance destroyed successfully.
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.700 182717 DEBUG nova.objects.instance [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'resources' on Instance uuid 0be60507-2a72-40c2-8ec7-86c829eacd52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:25 compute-1 podman[220136]: 2026-01-21 23:56:25.705195668 +0000 UTC m=+0.109434081 container cleanup 4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 23:56:25 compute-1 systemd[1]: libpod-conmon-4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01.scope: Deactivated successfully.
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.732 182717 DEBUG nova.virt.libvirt.vif [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-560731900',display_name='tempest-ImagesTestJSON-server-560731900',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-560731900',id=64,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:56:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-1ccb7p5b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:23Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=0be60507-2a72-40c2-8ec7-86c829eacd52,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8014260a-f495-40a2-81b9-2fa4e968f539", "address": "fa:16:3e:50:87:d7", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8014260a-f4", "ovs_interfaceid": "8014260a-f495-40a2-81b9-2fa4e968f539", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.733 182717 DEBUG nova.network.os_vif_util [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "8014260a-f495-40a2-81b9-2fa4e968f539", "address": "fa:16:3e:50:87:d7", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8014260a-f4", "ovs_interfaceid": "8014260a-f495-40a2-81b9-2fa4e968f539", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.734 182717 DEBUG nova.network.os_vif_util [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:87:d7,bridge_name='br-int',has_traffic_filtering=True,id=8014260a-f495-40a2-81b9-2fa4e968f539,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8014260a-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.735 182717 DEBUG os_vif [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:87:d7,bridge_name='br-int',has_traffic_filtering=True,id=8014260a-f495-40a2-81b9-2fa4e968f539,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8014260a-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.741 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.742 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8014260a-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.745 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.749 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.752 182717 INFO os_vif [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:87:d7,bridge_name='br-int',has_traffic_filtering=True,id=8014260a-f495-40a2-81b9-2fa4e968f539,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8014260a-f4')
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.754 182717 INFO nova.virt.libvirt.driver [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Deleting instance files /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52_del
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.756 182717 INFO nova.virt.libvirt.driver [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Deletion of /var/lib/nova/instances/0be60507-2a72-40c2-8ec7-86c829eacd52_del complete
Jan 21 23:56:25 compute-1 podman[220184]: 2026-01-21 23:56:25.777621978 +0000 UTC m=+0.048614798 container remove 4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.781 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ccbb6b56-ec7d-4584-b0a6-1f69ec686fb8]: (4, ('Wed Jan 21 11:56:25 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01)\n4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01\nWed Jan 21 11:56:25 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01)\n4c76bbf8e04b54e43a6457b032593786d54729d1dc2836ddb97956dd25295f01\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.783 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1b30cd90-ef7a-4183-aa55-d49df744ad2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.784 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.785 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:25 compute-1 kernel: tap74e2da48-40: left promiscuous mode
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.797 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.799 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6c85eebd-04c4-4bf8-bb4d-0a790609481a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.818 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5114fa0b-2f7a-4b48-a6cf-5958f7875625]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.819 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[aae004cb-a301-41b7-ae3c-06dfe2aca355]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.841 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[79dad2af-13f9-43d5-9c95-b6d6a44a400c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435103, 'reachable_time': 24496, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220198, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.850 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:56:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:25.850 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[25dc269a-8230-41cf-9769-02bd57cd8a42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:25 compute-1 systemd[1]: run-netns-ovnmeta\x2d74e2da48\x2d44c2\x2d4c6d\x2d9597\x2d6c47d6247f9c.mount: Deactivated successfully.
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.894 182717 INFO nova.compute.manager [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.895 182717 DEBUG oslo.service.loopingcall [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.897 182717 DEBUG nova.compute.manager [-] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:56:25 compute-1 nova_compute[182713]: 2026-01-21 23:56:25.897 182717 DEBUG nova.network.neutron [-] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.107 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.587 182717 DEBUG nova.network.neutron [-] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.609 182717 INFO nova.compute.manager [-] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Took 0.71 seconds to deallocate network for instance.
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.675 182717 DEBUG nova.compute.manager [req-050dee91-ca7d-43b9-8dfd-9a801a742589 req-5696770d-3998-4103-be8d-c92299e5d3ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Received event network-vif-deleted-8014260a-f495-40a2-81b9-2fa4e968f539 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.712 182717 DEBUG oslo_concurrency.lockutils [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.713 182717 DEBUG oslo_concurrency.lockutils [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.836 182717 DEBUG nova.compute.provider_tree [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.853 182717 DEBUG nova.scheduler.client.report [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.879 182717 DEBUG oslo_concurrency.lockutils [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.906 182717 INFO nova.scheduler.client.report [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Deleted allocations for instance 0be60507-2a72-40c2-8ec7-86c829eacd52
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.990 182717 DEBUG oslo_concurrency.lockutils [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.991 182717 DEBUG oslo_concurrency.lockutils [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.991 182717 DEBUG oslo_concurrency.lockutils [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.992 182717 DEBUG oslo_concurrency.lockutils [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:26 compute-1 nova_compute[182713]: 2026-01-21 23:56:26.992 182717 DEBUG oslo_concurrency.lockutils [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.022 182717 INFO nova.compute.manager [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Terminating instance
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.035 182717 DEBUG nova.compute.manager [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:56:27 compute-1 kernel: tapb767fb64-f4 (unregistering): left promiscuous mode
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.058 182717 DEBUG oslo_concurrency.lockutils [None req-aa2430d3-1d7a-4051-bc1a-32ff71ae4c49 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:27 compute-1 NetworkManager[54952]: <info>  [1769039787.0609] device (tapb767fb64-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:56:27 compute-1 ovn_controller[94841]: 2026-01-21T23:56:27Z|00205|binding|INFO|Releasing lport b767fb64-f4a0-49cc-85c0-21b059344b3d from this chassis (sb_readonly=0)
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.067 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:27 compute-1 ovn_controller[94841]: 2026-01-21T23:56:27Z|00206|binding|INFO|Setting lport b767fb64-f4a0-49cc-85c0-21b059344b3d down in Southbound
Jan 21 23:56:27 compute-1 ovn_controller[94841]: 2026-01-21T23:56:27Z|00207|binding|INFO|Removing iface tapb767fb64-f4 ovn-installed in OVS
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.071 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.081 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:99:ca 10.100.0.10'], port_security=['fa:16:3e:4e:99:ca 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '22742d9b-a6a8-4f10-a17f-a9704a1f8f43', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70b1c9f8be0042aa8de9841a26729700', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5943869c-ade1-4cd3-81a5-29e65236fb49', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88d3d39a-f56f-4f3b-95e9-79768ac7b596, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=b767fb64-f4a0-49cc-85c0-21b059344b3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.082 104184 INFO neutron.agent.ovn.metadata.agent [-] Port b767fb64-f4a0-49cc-85c0-21b059344b3d in datapath a78bfb22-a192-4dbe-a117-9f8a59130e27 unbound from our chassis
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.084 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a78bfb22-a192-4dbe-a117-9f8a59130e27, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.086 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[80b341b1-d338-45d2-8251-bc495ad93336]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.087 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 namespace which is not needed anymore
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.095 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:27 compute-1 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003a.scope: Deactivated successfully.
Jan 21 23:56:27 compute-1 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003a.scope: Consumed 13.895s CPU time.
Jan 21 23:56:27 compute-1 systemd-machined[153970]: Machine qemu-29-instance-0000003a terminated.
Jan 21 23:56:27 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219739]: [NOTICE]   (219743) : haproxy version is 2.8.14-c23fe91
Jan 21 23:56:27 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219739]: [NOTICE]   (219743) : path to executable is /usr/sbin/haproxy
Jan 21 23:56:27 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219739]: [WARNING]  (219743) : Exiting Master process...
Jan 21 23:56:27 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219739]: [WARNING]  (219743) : Exiting Master process...
Jan 21 23:56:27 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219739]: [ALERT]    (219743) : Current worker (219745) exited with code 143 (Terminated)
Jan 21 23:56:27 compute-1 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219739]: [WARNING]  (219743) : All workers exited. Exiting... (0)
Jan 21 23:56:27 compute-1 systemd[1]: libpod-7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6.scope: Deactivated successfully.
Jan 21 23:56:27 compute-1 conmon[219739]: conmon 7df86a05b272d4752549 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6.scope/container/memory.events
Jan 21 23:56:27 compute-1 podman[220221]: 2026-01-21 23:56:27.227033773 +0000 UTC m=+0.049470194 container died 7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:56:27 compute-1 systemd[1]: var-lib-containers-storage-overlay-a8bb9dd63ecfc20576c37291513a15cfefcb9d8aa9870ce0b541988a2fa8b006-merged.mount: Deactivated successfully.
Jan 21 23:56:27 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6-userdata-shm.mount: Deactivated successfully.
Jan 21 23:56:27 compute-1 podman[220221]: 2026-01-21 23:56:27.283560674 +0000 UTC m=+0.105997075 container cleanup 7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 21 23:56:27 compute-1 systemd[1]: libpod-conmon-7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6.scope: Deactivated successfully.
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.316 182717 INFO nova.virt.libvirt.driver [-] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Instance destroyed successfully.
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.316 182717 DEBUG nova.objects.instance [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'resources' on Instance uuid 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.329 182717 DEBUG nova.virt.libvirt.vif [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1374146863',display_name='tempest-ListServerFiltersTestJSON-instance-1374146863',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1374146863',id=58,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-3b2x1al6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_name='tempest-ListServerFiltersTestJSON-1547380946-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:10Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=22742d9b-a6a8-4f10-a17f-a9704a1f8f43,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.330 182717 DEBUG nova.network.os_vif_util [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "address": "fa:16:3e:4e:99:ca", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb767fb64-f4", "ovs_interfaceid": "b767fb64-f4a0-49cc-85c0-21b059344b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.330 182717 DEBUG nova.network.os_vif_util [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.331 182717 DEBUG os_vif [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.333 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.333 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb767fb64-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.335 182717 DEBUG nova.compute.manager [req-e892c1fa-4857-43e0-bb24-ce9e313e0125 req-0816d4b7-99ca-4c3a-ac6f-a8e9246c3bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received event network-vif-unplugged-b767fb64-f4a0-49cc-85c0-21b059344b3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.335 182717 DEBUG oslo_concurrency.lockutils [req-e892c1fa-4857-43e0-bb24-ce9e313e0125 req-0816d4b7-99ca-4c3a-ac6f-a8e9246c3bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.335 182717 DEBUG oslo_concurrency.lockutils [req-e892c1fa-4857-43e0-bb24-ce9e313e0125 req-0816d4b7-99ca-4c3a-ac6f-a8e9246c3bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.336 182717 DEBUG oslo_concurrency.lockutils [req-e892c1fa-4857-43e0-bb24-ce9e313e0125 req-0816d4b7-99ca-4c3a-ac6f-a8e9246c3bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.336 182717 DEBUG nova.compute.manager [req-e892c1fa-4857-43e0-bb24-ce9e313e0125 req-0816d4b7-99ca-4c3a-ac6f-a8e9246c3bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] No waiting events found dispatching network-vif-unplugged-b767fb64-f4a0-49cc-85c0-21b059344b3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.336 182717 DEBUG nova.compute.manager [req-e892c1fa-4857-43e0-bb24-ce9e313e0125 req-0816d4b7-99ca-4c3a-ac6f-a8e9246c3bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received event network-vif-unplugged-b767fb64-f4a0-49cc-85c0-21b059344b3d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.336 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.337 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.339 182717 INFO os_vif [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:99:ca,bridge_name='br-int',has_traffic_filtering=True,id=b767fb64-f4a0-49cc-85c0-21b059344b3d,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb767fb64-f4')
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.340 182717 INFO nova.virt.libvirt.driver [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Deleting instance files /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43_del
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.340 182717 INFO nova.virt.libvirt.driver [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Deletion of /var/lib/nova/instances/22742d9b-a6a8-4f10-a17f-a9704a1f8f43_del complete
Jan 21 23:56:27 compute-1 podman[220262]: 2026-01-21 23:56:27.35975188 +0000 UTC m=+0.045062738 container remove 7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.368 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8d8edb-e998-4341-89e3-aee9285e0a12]: (4, ('Wed Jan 21 11:56:27 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 (7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6)\n7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6\nWed Jan 21 11:56:27 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 (7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6)\n7df86a05b272d47525491b31f1dc171095b5bbb8b357540d80ab3ea88a44e1a6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.370 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[47948f7d-31d0-40de-a692-fd0a523e8d3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.371 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa78bfb22-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.373 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:27 compute-1 kernel: tapa78bfb22-a0: left promiscuous mode
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.388 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.391 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bafa14f9-d216-43df-8d31-90462ad1ab54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.407 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[108d572c-ba4b-499d-8b6f-1452a9f85be7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.410 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[41a00767-808b-4163-ac8b-d1ea3482fa68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.413 182717 INFO nova.compute.manager [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.414 182717 DEBUG oslo.service.loopingcall [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.414 182717 DEBUG nova.compute.manager [-] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.414 182717 DEBUG nova.network.neutron [-] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.430 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5eebbc6e-a285-465c-963a-8204289be156]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434201, 'reachable_time': 26734, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220279, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.432 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:56:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:27.432 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[4e6980ee-5e9c-4335-a830-2234ab205e5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:27 compute-1 systemd[1]: run-netns-ovnmeta\x2da78bfb22\x2da192\x2d4dbe\x2da117\x2d9f8a59130e27.mount: Deactivated successfully.
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.638 182717 DEBUG nova.compute.manager [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Received event network-vif-unplugged-8014260a-f495-40a2-81b9-2fa4e968f539 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.639 182717 DEBUG oslo_concurrency.lockutils [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.639 182717 DEBUG oslo_concurrency.lockutils [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.640 182717 DEBUG oslo_concurrency.lockutils [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.640 182717 DEBUG nova.compute.manager [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] No waiting events found dispatching network-vif-unplugged-8014260a-f495-40a2-81b9-2fa4e968f539 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.640 182717 WARNING nova.compute.manager [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Received unexpected event network-vif-unplugged-8014260a-f495-40a2-81b9-2fa4e968f539 for instance with vm_state deleted and task_state None.
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.641 182717 DEBUG nova.compute.manager [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Received event network-vif-plugged-8014260a-f495-40a2-81b9-2fa4e968f539 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.641 182717 DEBUG oslo_concurrency.lockutils [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.642 182717 DEBUG oslo_concurrency.lockutils [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.642 182717 DEBUG oslo_concurrency.lockutils [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "0be60507-2a72-40c2-8ec7-86c829eacd52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.642 182717 DEBUG nova.compute.manager [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] No waiting events found dispatching network-vif-plugged-8014260a-f495-40a2-81b9-2fa4e968f539 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:27 compute-1 nova_compute[182713]: 2026-01-21 23:56:27.643 182717 WARNING nova.compute.manager [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Received unexpected event network-vif-plugged-8014260a-f495-40a2-81b9-2fa4e968f539 for instance with vm_state deleted and task_state None.
Jan 21 23:56:28 compute-1 nova_compute[182713]: 2026-01-21 23:56:28.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:28 compute-1 nova_compute[182713]: 2026-01-21 23:56:28.997 182717 DEBUG nova.network.neutron [-] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.074 182717 INFO nova.compute.manager [-] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Took 1.66 seconds to deallocate network for instance.
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.188 182717 DEBUG nova.compute.manager [req-56464cf4-b0c9-4524-ab61-e3f354a8f9e6 req-42929829-32b0-409e-9168-dad53eea0cb5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received event network-vif-deleted-b767fb64-f4a0-49cc-85c0-21b059344b3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.295 182717 DEBUG oslo_concurrency.lockutils [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.295 182717 DEBUG oslo_concurrency.lockutils [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.388 182717 DEBUG nova.compute.provider_tree [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.407 182717 DEBUG nova.scheduler.client.report [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.441 182717 DEBUG oslo_concurrency.lockutils [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.484 182717 DEBUG nova.compute.manager [req-a5e7195c-85f3-4c91-a187-da2126c72ddd req-13ccc5ad-97f6-4737-884b-5e87cce5ae87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.485 182717 DEBUG oslo_concurrency.lockutils [req-a5e7195c-85f3-4c91-a187-da2126c72ddd req-13ccc5ad-97f6-4737-884b-5e87cce5ae87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.485 182717 DEBUG oslo_concurrency.lockutils [req-a5e7195c-85f3-4c91-a187-da2126c72ddd req-13ccc5ad-97f6-4737-884b-5e87cce5ae87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.485 182717 DEBUG oslo_concurrency.lockutils [req-a5e7195c-85f3-4c91-a187-da2126c72ddd req-13ccc5ad-97f6-4737-884b-5e87cce5ae87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.486 182717 DEBUG nova.compute.manager [req-a5e7195c-85f3-4c91-a187-da2126c72ddd req-13ccc5ad-97f6-4737-884b-5e87cce5ae87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] No waiting events found dispatching network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.486 182717 WARNING nova.compute.manager [req-a5e7195c-85f3-4c91-a187-da2126c72ddd req-13ccc5ad-97f6-4737-884b-5e87cce5ae87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Received unexpected event network-vif-plugged-b767fb64-f4a0-49cc-85c0-21b059344b3d for instance with vm_state deleted and task_state None.
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.488 182717 INFO nova.scheduler.client.report [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Deleted allocations for instance 22742d9b-a6a8-4f10-a17f-a9704a1f8f43
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.607 182717 DEBUG oslo_concurrency.lockutils [None req-292b870f-72d1-48aa-ba5f-2f0ed3774aa3 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "22742d9b-a6a8-4f10-a17f-a9704a1f8f43" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:29 compute-1 nova_compute[182713]: 2026-01-21 23:56:29.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:30 compute-1 nova_compute[182713]: 2026-01-21 23:56:30.850 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:30 compute-1 nova_compute[182713]: 2026-01-21 23:56:30.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:30 compute-1 nova_compute[182713]: 2026-01-21 23:56:30.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:56:30 compute-1 nova_compute[182713]: 2026-01-21 23:56:30.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:56:31 compute-1 nova_compute[182713]: 2026-01-21 23:56:31.035 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-03db37ed-d870-40ec-86f5-db23a9180dc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:56:31 compute-1 nova_compute[182713]: 2026-01-21 23:56:31.035 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-03db37ed-d870-40ec-86f5-db23a9180dc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:56:31 compute-1 nova_compute[182713]: 2026-01-21 23:56:31.036 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:56:31 compute-1 nova_compute[182713]: 2026-01-21 23:56:31.036 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 03db37ed-d870-40ec-86f5-db23a9180dc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:31 compute-1 nova_compute[182713]: 2026-01-21 23:56:31.109 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:31 compute-1 ovn_controller[94841]: 2026-01-21T23:56:31Z|00208|binding|INFO|Releasing lport 8bc16eeb-6666-4300-9ce8-0a810442a173 from this chassis (sb_readonly=0)
Jan 21 23:56:31 compute-1 nova_compute[182713]: 2026-01-21 23:56:31.529 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:32 compute-1 nova_compute[182713]: 2026-01-21 23:56:32.336 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:32 compute-1 podman[220281]: 2026-01-21 23:56:32.582431054 +0000 UTC m=+0.067087917 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:56:32 compute-1 podman[220280]: 2026-01-21 23:56:32.614399039 +0000 UTC m=+0.103361974 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.295 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Updating instance_info_cache with network_info: [{"id": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "address": "fa:16:3e:38:ee:b8", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f2706f6-32", "ovs_interfaceid": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.326 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-03db37ed-d870-40ec-86f5-db23a9180dc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.327 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.328 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.352 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.354 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.355 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.356 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.436 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.497 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.498 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.574 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.745 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.747 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5505MB free_disk=73.30320358276367GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.747 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.747 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.856 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 03db37ed-d870-40ec-86f5-db23a9180dc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.857 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.858 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.924 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.948 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.987 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:56:33 compute-1 nova_compute[182713]: 2026-01-21 23:56:33.987 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.451 182717 DEBUG oslo_concurrency.lockutils [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "03db37ed-d870-40ec-86f5-db23a9180dc8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.451 182717 DEBUG oslo_concurrency.lockutils [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.452 182717 DEBUG oslo_concurrency.lockutils [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.452 182717 DEBUG oslo_concurrency.lockutils [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.453 182717 DEBUG oslo_concurrency.lockutils [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.470 182717 INFO nova.compute.manager [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Terminating instance
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.491 182717 DEBUG nova.compute.manager [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:56:34 compute-1 kernel: tap1f2706f6-32 (unregistering): left promiscuous mode
Jan 21 23:56:34 compute-1 NetworkManager[54952]: <info>  [1769039794.5129] device (tap1f2706f6-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.521 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-1 ovn_controller[94841]: 2026-01-21T23:56:34Z|00209|binding|INFO|Releasing lport 1f2706f6-320f-42cf-8e88-b7cb375b001a from this chassis (sb_readonly=0)
Jan 21 23:56:34 compute-1 ovn_controller[94841]: 2026-01-21T23:56:34Z|00210|binding|INFO|Setting lport 1f2706f6-320f-42cf-8e88-b7cb375b001a down in Southbound
Jan 21 23:56:34 compute-1 ovn_controller[94841]: 2026-01-21T23:56:34Z|00211|binding|INFO|Removing iface tap1f2706f6-32 ovn-installed in OVS
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.535 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:ee:b8 10.100.0.11'], port_security=['fa:16:3e:38:ee:b8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-835f4434-3fa6-458b-b79c-b27830f531cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '414437860afc460b9e86d674975e9d1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '30db0ce4-28a9-4add-b257-f90dc081c48d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2a61a9c-1832-4a5f-89c7-e09ac8a1046e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=1f2706f6-320f-42cf-8e88-b7cb375b001a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.538 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 1f2706f6-320f-42cf-8e88-b7cb375b001a in datapath 835f4434-3fa6-458b-b79c-b27830f531cf unbound from our chassis
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.543 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 835f4434-3fa6-458b-b79c-b27830f531cf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.545 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.547 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7cbef8aa-6a11-4234-85ac-fcf80ba173f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.548 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf namespace which is not needed anymore
Jan 21 23:56:34 compute-1 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000043.scope: Deactivated successfully.
Jan 21 23:56:34 compute-1 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000043.scope: Consumed 11.068s CPU time.
Jan 21 23:56:34 compute-1 systemd-machined[153970]: Machine qemu-31-instance-00000043 terminated.
Jan 21 23:56:34 compute-1 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220076]: [NOTICE]   (220080) : haproxy version is 2.8.14-c23fe91
Jan 21 23:56:34 compute-1 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220076]: [NOTICE]   (220080) : path to executable is /usr/sbin/haproxy
Jan 21 23:56:34 compute-1 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220076]: [WARNING]  (220080) : Exiting Master process...
Jan 21 23:56:34 compute-1 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220076]: [ALERT]    (220080) : Current worker (220082) exited with code 143 (Terminated)
Jan 21 23:56:34 compute-1 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220076]: [WARNING]  (220080) : All workers exited. Exiting... (0)
Jan 21 23:56:34 compute-1 NetworkManager[54952]: <info>  [1769039794.7152] manager: (tap1f2706f6-32): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Jan 21 23:56:34 compute-1 kernel: tap1f2706f6-32: entered promiscuous mode
Jan 21 23:56:34 compute-1 kernel: tap1f2706f6-32 (unregistering): left promiscuous mode
Jan 21 23:56:34 compute-1 systemd[1]: libpod-f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4.scope: Deactivated successfully.
Jan 21 23:56:34 compute-1 ovn_controller[94841]: 2026-01-21T23:56:34Z|00212|binding|INFO|Claiming lport 1f2706f6-320f-42cf-8e88-b7cb375b001a for this chassis.
Jan 21 23:56:34 compute-1 ovn_controller[94841]: 2026-01-21T23:56:34Z|00213|binding|INFO|1f2706f6-320f-42cf-8e88-b7cb375b001a: Claiming fa:16:3e:38:ee:b8 10.100.0.11
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.722 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-1 podman[220358]: 2026-01-21 23:56:34.729797423 +0000 UTC m=+0.067416387 container died f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.736 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:ee:b8 10.100.0.11'], port_security=['fa:16:3e:38:ee:b8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-835f4434-3fa6-458b-b79c-b27830f531cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '414437860afc460b9e86d674975e9d1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '30db0ce4-28a9-4add-b257-f90dc081c48d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2a61a9c-1832-4a5f-89c7-e09ac8a1046e, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=1f2706f6-320f-42cf-8e88-b7cb375b001a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.746 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-1 ovn_controller[94841]: 2026-01-21T23:56:34Z|00214|binding|INFO|Releasing lport 1f2706f6-320f-42cf-8e88-b7cb375b001a from this chassis (sb_readonly=0)
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.756 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:ee:b8 10.100.0.11'], port_security=['fa:16:3e:38:ee:b8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '03db37ed-d870-40ec-86f5-db23a9180dc8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-835f4434-3fa6-458b-b79c-b27830f531cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '414437860afc460b9e86d674975e9d1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '30db0ce4-28a9-4add-b257-f90dc081c48d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2a61a9c-1832-4a5f-89c7-e09ac8a1046e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=1f2706f6-320f-42cf-8e88-b7cb375b001a) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:34 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4-userdata-shm.mount: Deactivated successfully.
Jan 21 23:56:34 compute-1 systemd[1]: var-lib-containers-storage-overlay-e36a143b95759f027051f25d1ff22d50c7f9aa731774182c64315d1ff4d25030-merged.mount: Deactivated successfully.
Jan 21 23:56:34 compute-1 podman[220358]: 2026-01-21 23:56:34.79400503 +0000 UTC m=+0.131623914 container cleanup f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 23:56:34 compute-1 systemd[1]: libpod-conmon-f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4.scope: Deactivated successfully.
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.812 182717 INFO nova.virt.libvirt.driver [-] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Instance destroyed successfully.
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.813 182717 DEBUG nova.objects.instance [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lazy-loading 'resources' on Instance uuid 03db37ed-d870-40ec-86f5-db23a9180dc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.828 182717 DEBUG nova.compute.manager [req-f55e3412-d6e1-4c18-8803-17feaa1a5a46 req-2349ed9f-d036-479a-81ad-d186d31d4411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Received event network-vif-unplugged-1f2706f6-320f-42cf-8e88-b7cb375b001a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.830 182717 DEBUG oslo_concurrency.lockutils [req-f55e3412-d6e1-4c18-8803-17feaa1a5a46 req-2349ed9f-d036-479a-81ad-d186d31d4411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.831 182717 DEBUG oslo_concurrency.lockutils [req-f55e3412-d6e1-4c18-8803-17feaa1a5a46 req-2349ed9f-d036-479a-81ad-d186d31d4411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.831 182717 DEBUG oslo_concurrency.lockutils [req-f55e3412-d6e1-4c18-8803-17feaa1a5a46 req-2349ed9f-d036-479a-81ad-d186d31d4411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.832 182717 DEBUG nova.compute.manager [req-f55e3412-d6e1-4c18-8803-17feaa1a5a46 req-2349ed9f-d036-479a-81ad-d186d31d4411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] No waiting events found dispatching network-vif-unplugged-1f2706f6-320f-42cf-8e88-b7cb375b001a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.833 182717 DEBUG nova.compute.manager [req-f55e3412-d6e1-4c18-8803-17feaa1a5a46 req-2349ed9f-d036-479a-81ad-d186d31d4411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Received event network-vif-unplugged-1f2706f6-320f-42cf-8e88-b7cb375b001a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.835 182717 DEBUG nova.virt.libvirt.vif [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1677728672',display_name='tempest-ListServersNegativeTestJSON-server-1677728672-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1677728672-3',id=67,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2026-01-21T23:56:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='414437860afc460b9e86d674975e9d1f',ramdisk_id='',reservation_id='r-dznclk2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1787990789',owner_user_name='tempest-ListServersNegativeTestJSON-1787990789-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:24Z,user_data=None,user_id='9a4a4a5f3c9f4c5091261592272bcb81',uuid=03db37ed-d870-40ec-86f5-db23a9180dc8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "address": "fa:16:3e:38:ee:b8", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f2706f6-32", "ovs_interfaceid": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.836 182717 DEBUG nova.network.os_vif_util [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converting VIF {"id": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "address": "fa:16:3e:38:ee:b8", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f2706f6-32", "ovs_interfaceid": "1f2706f6-320f-42cf-8e88-b7cb375b001a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.838 182717 DEBUG nova.network.os_vif_util [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:ee:b8,bridge_name='br-int',has_traffic_filtering=True,id=1f2706f6-320f-42cf-8e88-b7cb375b001a,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f2706f6-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.839 182717 DEBUG os_vif [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:ee:b8,bridge_name='br-int',has_traffic_filtering=True,id=1f2706f6-320f-42cf-8e88-b7cb375b001a,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f2706f6-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.844 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.845 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f2706f6-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.848 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.850 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.857 182717 INFO os_vif [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:ee:b8,bridge_name='br-int',has_traffic_filtering=True,id=1f2706f6-320f-42cf-8e88-b7cb375b001a,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f2706f6-32')
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.857 182717 INFO nova.virt.libvirt.driver [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Deleting instance files /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8_del
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.859 182717 INFO nova.virt.libvirt.driver [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Deletion of /var/lib/nova/instances/03db37ed-d870-40ec-86f5-db23a9180dc8_del complete
Jan 21 23:56:34 compute-1 podman[220401]: 2026-01-21 23:56:34.881536376 +0000 UTC m=+0.058914025 container remove f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.887 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[baca9566-24ed-4d0d-8b8d-8af8b12992b0]: (4, ('Wed Jan 21 11:56:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf (f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4)\nf9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4\nWed Jan 21 11:56:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf (f9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4)\nf9e1d5ada4263f87a1459a1fc21623802c76b9601d0656dea344a745d723ffd4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.890 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5e270dd9-a484-4669-add5-4c5198f9000a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.892 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap835f4434-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.894 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-1 kernel: tap835f4434-30: left promiscuous mode
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.896 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.901 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[78c2a3ce-a1ae-40ad-af2f-3a21da8e60d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.913 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.921 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[12dc0192-a4da-4b32-961a-3887cbb92ccb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.922 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c3d6bfb3-87cc-4f12-bf4d-8ba3ac6e8cfe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.933 182717 INFO nova.compute.manager [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Took 0.44 seconds to destroy the instance on the hypervisor.
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.934 182717 DEBUG oslo.service.loopingcall [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.935 182717 DEBUG nova.compute.manager [-] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:56:34 compute-1 nova_compute[182713]: 2026-01-21 23:56:34.935 182717 DEBUG nova.network.neutron [-] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.945 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e323125d-9455-4413-8301-d569b86d4d31]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435473, 'reachable_time': 33827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220415, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.948 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.949 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[fc5784d5-df61-4886-bd7c-b139355b2a6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.949 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 1f2706f6-320f-42cf-8e88-b7cb375b001a in datapath 835f4434-3fa6-458b-b79c-b27830f531cf unbound from our chassis
Jan 21 23:56:34 compute-1 systemd[1]: run-netns-ovnmeta\x2d835f4434\x2d3fa6\x2d458b\x2db79c\x2db27830f531cf.mount: Deactivated successfully.
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.951 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 835f4434-3fa6-458b-b79c-b27830f531cf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.952 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3136d200-7c13-47a4-a957-977e1e5dfd90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.953 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 1f2706f6-320f-42cf-8e88-b7cb375b001a in datapath 835f4434-3fa6-458b-b79c-b27830f531cf unbound from our chassis
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.954 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 835f4434-3fa6-458b-b79c-b27830f531cf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:34 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:34.955 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f7ffad-ab37-4e11-a927-e479a79c8442]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.111 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.127 182717 DEBUG nova.network.neutron [-] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.162 182717 INFO nova.compute.manager [-] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Took 1.23 seconds to deallocate network for instance.
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.244 182717 DEBUG oslo_concurrency.lockutils [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.245 182717 DEBUG oslo_concurrency.lockutils [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.296 182717 DEBUG nova.compute.provider_tree [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.313 182717 DEBUG nova.scheduler.client.report [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.339 182717 DEBUG oslo_concurrency.lockutils [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.364 182717 INFO nova.scheduler.client.report [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Deleted allocations for instance 03db37ed-d870-40ec-86f5-db23a9180dc8
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.501 182717 DEBUG oslo_concurrency.lockutils [None req-d421e012-4047-4d25-a951-817b2416de54 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.953 182717 DEBUG nova.compute.manager [req-fa179833-2ab7-4acf-8b87-bcab11582fca req-c8786743-a6bf-45da-a145-0caf8a0b4dd2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Received event network-vif-plugged-1f2706f6-320f-42cf-8e88-b7cb375b001a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.954 182717 DEBUG oslo_concurrency.lockutils [req-fa179833-2ab7-4acf-8b87-bcab11582fca req-c8786743-a6bf-45da-a145-0caf8a0b4dd2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.954 182717 DEBUG oslo_concurrency.lockutils [req-fa179833-2ab7-4acf-8b87-bcab11582fca req-c8786743-a6bf-45da-a145-0caf8a0b4dd2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.955 182717 DEBUG oslo_concurrency.lockutils [req-fa179833-2ab7-4acf-8b87-bcab11582fca req-c8786743-a6bf-45da-a145-0caf8a0b4dd2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03db37ed-d870-40ec-86f5-db23a9180dc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.955 182717 DEBUG nova.compute.manager [req-fa179833-2ab7-4acf-8b87-bcab11582fca req-c8786743-a6bf-45da-a145-0caf8a0b4dd2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] No waiting events found dispatching network-vif-plugged-1f2706f6-320f-42cf-8e88-b7cb375b001a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.956 182717 WARNING nova.compute.manager [req-fa179833-2ab7-4acf-8b87-bcab11582fca req-c8786743-a6bf-45da-a145-0caf8a0b4dd2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Received unexpected event network-vif-plugged-1f2706f6-320f-42cf-8e88-b7cb375b001a for instance with vm_state deleted and task_state None.
Jan 21 23:56:36 compute-1 nova_compute[182713]: 2026-01-21 23:56:36.956 182717 DEBUG nova.compute.manager [req-fa179833-2ab7-4acf-8b87-bcab11582fca req-c8786743-a6bf-45da-a145-0caf8a0b4dd2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Received event network-vif-deleted-1f2706f6-320f-42cf-8e88-b7cb375b001a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:39 compute-1 podman[220416]: 2026-01-21 23:56:39.602436608 +0000 UTC m=+0.081529202 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:56:39 compute-1 podman[220417]: 2026-01-21 23:56:39.602612244 +0000 UTC m=+0.076669513 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:56:39 compute-1 nova_compute[182713]: 2026-01-21 23:56:39.849 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:40 compute-1 nova_compute[182713]: 2026-01-21 23:56:40.437 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:40 compute-1 nova_compute[182713]: 2026-01-21 23:56:40.692 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039785.691544, 0be60507-2a72-40c2-8ec7-86c829eacd52 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:40 compute-1 nova_compute[182713]: 2026-01-21 23:56:40.693 182717 INFO nova.compute.manager [-] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] VM Stopped (Lifecycle Event)
Jan 21 23:56:40 compute-1 nova_compute[182713]: 2026-01-21 23:56:40.723 182717 DEBUG nova.compute.manager [None req-a075be2b-2f66-49b7-9d10-34207ea35447 - - - - - -] [instance: 0be60507-2a72-40c2-8ec7-86c829eacd52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:41 compute-1 nova_compute[182713]: 2026-01-21 23:56:41.113 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:42 compute-1 nova_compute[182713]: 2026-01-21 23:56:42.314 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039787.312379, 22742d9b-a6a8-4f10-a17f-a9704a1f8f43 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:42 compute-1 nova_compute[182713]: 2026-01-21 23:56:42.315 182717 INFO nova.compute.manager [-] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] VM Stopped (Lifecycle Event)
Jan 21 23:56:42 compute-1 nova_compute[182713]: 2026-01-21 23:56:42.348 182717 DEBUG nova.compute.manager [None req-2eacd21f-4f0a-4bb3-a4ee-ccbfb682f990 - - - - - -] [instance: 22742d9b-a6a8-4f10-a17f-a9704a1f8f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:44 compute-1 nova_compute[182713]: 2026-01-21 23:56:44.852 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:46 compute-1 nova_compute[182713]: 2026-01-21 23:56:46.115 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:47 compute-1 nova_compute[182713]: 2026-01-21 23:56:47.553 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Acquiring lock "dbef0790-08d1-4340-8088-615805f5e01f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:47 compute-1 nova_compute[182713]: 2026-01-21 23:56:47.553 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:47 compute-1 nova_compute[182713]: 2026-01-21 23:56:47.580 182717 DEBUG nova.compute.manager [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:56:47 compute-1 nova_compute[182713]: 2026-01-21 23:56:47.728 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:47 compute-1 nova_compute[182713]: 2026-01-21 23:56:47.729 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:47 compute-1 nova_compute[182713]: 2026-01-21 23:56:47.738 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:56:47 compute-1 nova_compute[182713]: 2026-01-21 23:56:47.738 182717 INFO nova.compute.claims [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:56:47 compute-1 nova_compute[182713]: 2026-01-21 23:56:47.913 182717 DEBUG nova.compute.provider_tree [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:47 compute-1 nova_compute[182713]: 2026-01-21 23:56:47.928 182717 DEBUG nova.scheduler.client.report [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:47 compute-1 nova_compute[182713]: 2026-01-21 23:56:47.950 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:47 compute-1 nova_compute[182713]: 2026-01-21 23:56:47.951 182717 DEBUG nova.compute.manager [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.058 182717 DEBUG nova.compute.manager [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.059 182717 DEBUG nova.network.neutron [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.089 182717 INFO nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.113 182717 DEBUG nova.compute.manager [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.285 182717 DEBUG nova.compute.manager [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.287 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.287 182717 INFO nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Creating image(s)
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.288 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Acquiring lock "/var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.288 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "/var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.289 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "/var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.303 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.369 182717 DEBUG nova.policy [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '06bc08086b914d8f96a51e13ea95fd1a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e35969ce39e84c9a8e6def5d9829f062', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.375 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.376 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.377 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.402 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.470 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.472 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.509 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.510 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.511 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.585 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.589 182717 DEBUG nova.virt.disk.api [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Checking if we can resize image /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.590 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.663 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.665 182717 DEBUG nova.virt.disk.api [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Cannot resize image /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.666 182717 DEBUG nova.objects.instance [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lazy-loading 'migration_context' on Instance uuid dbef0790-08d1-4340-8088-615805f5e01f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.698 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.698 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Ensure instance console log exists: /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.699 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.699 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:48 compute-1 nova_compute[182713]: 2026-01-21 23:56:48.700 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:49 compute-1 nova_compute[182713]: 2026-01-21 23:56:49.012 182717 DEBUG nova.network.neutron [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Successfully created port: 53c314d0-2f1b-4465-b514-6bdd134e1722 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:56:49 compute-1 nova_compute[182713]: 2026-01-21 23:56:49.811 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039794.8101783, 03db37ed-d870-40ec-86f5-db23a9180dc8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:49 compute-1 nova_compute[182713]: 2026-01-21 23:56:49.812 182717 INFO nova.compute.manager [-] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] VM Stopped (Lifecycle Event)
Jan 21 23:56:49 compute-1 nova_compute[182713]: 2026-01-21 23:56:49.836 182717 DEBUG nova.compute.manager [None req-82e1a9ee-d0b4-428b-abc8-cafddc5d37e2 - - - - - -] [instance: 03db37ed-d870-40ec-86f5-db23a9180dc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:49 compute-1 nova_compute[182713]: 2026-01-21 23:56:49.856 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:50 compute-1 nova_compute[182713]: 2026-01-21 23:56:50.311 182717 DEBUG nova.network.neutron [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Successfully updated port: 53c314d0-2f1b-4465-b514-6bdd134e1722 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:56:50 compute-1 nova_compute[182713]: 2026-01-21 23:56:50.328 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Acquiring lock "refresh_cache-dbef0790-08d1-4340-8088-615805f5e01f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:56:50 compute-1 nova_compute[182713]: 2026-01-21 23:56:50.328 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Acquired lock "refresh_cache-dbef0790-08d1-4340-8088-615805f5e01f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:56:50 compute-1 nova_compute[182713]: 2026-01-21 23:56:50.329 182717 DEBUG nova.network.neutron [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:56:50 compute-1 nova_compute[182713]: 2026-01-21 23:56:50.434 182717 DEBUG nova.compute.manager [req-4e38c52c-a2ec-42c9-b991-208fd4210135 req-fa131b7c-4c56-4fbc-ae9c-b0ff00255b93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Received event network-changed-53c314d0-2f1b-4465-b514-6bdd134e1722 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:50 compute-1 nova_compute[182713]: 2026-01-21 23:56:50.434 182717 DEBUG nova.compute.manager [req-4e38c52c-a2ec-42c9-b991-208fd4210135 req-fa131b7c-4c56-4fbc-ae9c-b0ff00255b93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Refreshing instance network info cache due to event network-changed-53c314d0-2f1b-4465-b514-6bdd134e1722. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:56:50 compute-1 nova_compute[182713]: 2026-01-21 23:56:50.435 182717 DEBUG oslo_concurrency.lockutils [req-4e38c52c-a2ec-42c9-b991-208fd4210135 req-fa131b7c-4c56-4fbc-ae9c-b0ff00255b93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-dbef0790-08d1-4340-8088-615805f5e01f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:56:50 compute-1 nova_compute[182713]: 2026-01-21 23:56:50.517 182717 DEBUG nova.network.neutron [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:56:51 compute-1 nova_compute[182713]: 2026-01-21 23:56:51.117 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:51 compute-1 podman[220474]: 2026-01-21 23:56:51.60559519 +0000 UTC m=+0.087823255 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.041 182717 DEBUG nova.network.neutron [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Updating instance_info_cache with network_info: [{"id": "53c314d0-2f1b-4465-b514-6bdd134e1722", "address": "fa:16:3e:d9:67:59", "network": {"id": "8e133733-21f6-4116-b549-db9e6b754b7b", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-601456831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e35969ce39e84c9a8e6def5d9829f062", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53c314d0-2f", "ovs_interfaceid": "53c314d0-2f1b-4465-b514-6bdd134e1722", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.067 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Releasing lock "refresh_cache-dbef0790-08d1-4340-8088-615805f5e01f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.068 182717 DEBUG nova.compute.manager [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Instance network_info: |[{"id": "53c314d0-2f1b-4465-b514-6bdd134e1722", "address": "fa:16:3e:d9:67:59", "network": {"id": "8e133733-21f6-4116-b549-db9e6b754b7b", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-601456831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e35969ce39e84c9a8e6def5d9829f062", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53c314d0-2f", "ovs_interfaceid": "53c314d0-2f1b-4465-b514-6bdd134e1722", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.068 182717 DEBUG oslo_concurrency.lockutils [req-4e38c52c-a2ec-42c9-b991-208fd4210135 req-fa131b7c-4c56-4fbc-ae9c-b0ff00255b93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-dbef0790-08d1-4340-8088-615805f5e01f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.068 182717 DEBUG nova.network.neutron [req-4e38c52c-a2ec-42c9-b991-208fd4210135 req-fa131b7c-4c56-4fbc-ae9c-b0ff00255b93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Refreshing network info cache for port 53c314d0-2f1b-4465-b514-6bdd134e1722 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.072 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Start _get_guest_xml network_info=[{"id": "53c314d0-2f1b-4465-b514-6bdd134e1722", "address": "fa:16:3e:d9:67:59", "network": {"id": "8e133733-21f6-4116-b549-db9e6b754b7b", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-601456831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e35969ce39e84c9a8e6def5d9829f062", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53c314d0-2f", "ovs_interfaceid": "53c314d0-2f1b-4465-b514-6bdd134e1722", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.080 182717 WARNING nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.092 182717 DEBUG nova.virt.libvirt.host [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.093 182717 DEBUG nova.virt.libvirt.host [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.099 182717 DEBUG nova.virt.libvirt.host [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.100 182717 DEBUG nova.virt.libvirt.host [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.102 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.103 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.103 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.103 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.104 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.104 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.104 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.105 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.105 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.106 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.106 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.106 182717 DEBUG nova.virt.hardware [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.112 182717 DEBUG nova.virt.libvirt.vif [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-424514161',display_name='tempest-ServerPasswordTestJSON-server-424514161',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-424514161',id=69,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e35969ce39e84c9a8e6def5d9829f062',ramdisk_id='',reservation_id='r-dh1gpvxm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-1806469904',owner_user_name='tempest-ServerPasswordTestJSO
N-1806469904-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:48Z,user_data=None,user_id='06bc08086b914d8f96a51e13ea95fd1a',uuid=dbef0790-08d1-4340-8088-615805f5e01f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "53c314d0-2f1b-4465-b514-6bdd134e1722", "address": "fa:16:3e:d9:67:59", "network": {"id": "8e133733-21f6-4116-b549-db9e6b754b7b", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-601456831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e35969ce39e84c9a8e6def5d9829f062", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53c314d0-2f", "ovs_interfaceid": "53c314d0-2f1b-4465-b514-6bdd134e1722", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.113 182717 DEBUG nova.network.os_vif_util [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Converting VIF {"id": "53c314d0-2f1b-4465-b514-6bdd134e1722", "address": "fa:16:3e:d9:67:59", "network": {"id": "8e133733-21f6-4116-b549-db9e6b754b7b", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-601456831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e35969ce39e84c9a8e6def5d9829f062", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53c314d0-2f", "ovs_interfaceid": "53c314d0-2f1b-4465-b514-6bdd134e1722", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.114 182717 DEBUG nova.network.os_vif_util [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:67:59,bridge_name='br-int',has_traffic_filtering=True,id=53c314d0-2f1b-4465-b514-6bdd134e1722,network=Network(8e133733-21f6-4116-b549-db9e6b754b7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53c314d0-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.115 182717 DEBUG nova.objects.instance [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lazy-loading 'pci_devices' on Instance uuid dbef0790-08d1-4340-8088-615805f5e01f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.137 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:56:53 compute-1 nova_compute[182713]:   <uuid>dbef0790-08d1-4340-8088-615805f5e01f</uuid>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   <name>instance-00000045</name>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerPasswordTestJSON-server-424514161</nova:name>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:56:53</nova:creationTime>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:56:53 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:56:53 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:56:53 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:56:53 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:56:53 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:56:53 compute-1 nova_compute[182713]:         <nova:user uuid="06bc08086b914d8f96a51e13ea95fd1a">tempest-ServerPasswordTestJSON-1806469904-project-member</nova:user>
Jan 21 23:56:53 compute-1 nova_compute[182713]:         <nova:project uuid="e35969ce39e84c9a8e6def5d9829f062">tempest-ServerPasswordTestJSON-1806469904</nova:project>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:56:53 compute-1 nova_compute[182713]:         <nova:port uuid="53c314d0-2f1b-4465-b514-6bdd134e1722">
Jan 21 23:56:53 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <system>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <entry name="serial">dbef0790-08d1-4340-8088-615805f5e01f</entry>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <entry name="uuid">dbef0790-08d1-4340-8088-615805f5e01f</entry>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     </system>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   <os>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   </os>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   <features>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   </features>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk.config"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:d9:67:59"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <target dev="tap53c314d0-2f"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/console.log" append="off"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <video>
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     </video>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:56:53 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:56:53 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:56:53 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:56:53 compute-1 nova_compute[182713]: </domain>
Jan 21 23:56:53 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.138 182717 DEBUG nova.compute.manager [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Preparing to wait for external event network-vif-plugged-53c314d0-2f1b-4465-b514-6bdd134e1722 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.138 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Acquiring lock "dbef0790-08d1-4340-8088-615805f5e01f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.139 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.139 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.140 182717 DEBUG nova.virt.libvirt.vif [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-424514161',display_name='tempest-ServerPasswordTestJSON-server-424514161',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-424514161',id=69,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e35969ce39e84c9a8e6def5d9829f062',ramdisk_id='',reservation_id='r-dh1gpvxm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-1806469904',owner_user_name='tempest-ServerPasswordTestJSON-1806469904-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:48Z,user_data=None,user_id='06bc08086b914d8f96a51e13ea95fd1a',uuid=dbef0790-08d1-4340-8088-615805f5e01f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "53c314d0-2f1b-4465-b514-6bdd134e1722", "address": "fa:16:3e:d9:67:59", "network": {"id": "8e133733-21f6-4116-b549-db9e6b754b7b", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-601456831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e35969ce39e84c9a8e6def5d9829f062", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53c314d0-2f", "ovs_interfaceid": "53c314d0-2f1b-4465-b514-6bdd134e1722", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.140 182717 DEBUG nova.network.os_vif_util [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Converting VIF {"id": "53c314d0-2f1b-4465-b514-6bdd134e1722", "address": "fa:16:3e:d9:67:59", "network": {"id": "8e133733-21f6-4116-b549-db9e6b754b7b", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-601456831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e35969ce39e84c9a8e6def5d9829f062", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53c314d0-2f", "ovs_interfaceid": "53c314d0-2f1b-4465-b514-6bdd134e1722", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.141 182717 DEBUG nova.network.os_vif_util [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:67:59,bridge_name='br-int',has_traffic_filtering=True,id=53c314d0-2f1b-4465-b514-6bdd134e1722,network=Network(8e133733-21f6-4116-b549-db9e6b754b7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53c314d0-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.142 182717 DEBUG os_vif [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:67:59,bridge_name='br-int',has_traffic_filtering=True,id=53c314d0-2f1b-4465-b514-6bdd134e1722,network=Network(8e133733-21f6-4116-b549-db9e6b754b7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53c314d0-2f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.142 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.143 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.143 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.147 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.147 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap53c314d0-2f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.148 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap53c314d0-2f, col_values=(('external_ids', {'iface-id': '53c314d0-2f1b-4465-b514-6bdd134e1722', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:67:59', 'vm-uuid': 'dbef0790-08d1-4340-8088-615805f5e01f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.150 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.152 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:56:53 compute-1 NetworkManager[54952]: <info>  [1769039813.1525] manager: (tap53c314d0-2f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.158 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.159 182717 INFO os_vif [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:67:59,bridge_name='br-int',has_traffic_filtering=True,id=53c314d0-2f1b-4465-b514-6bdd134e1722,network=Network(8e133733-21f6-4116-b549-db9e6b754b7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53c314d0-2f')
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.230 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.231 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.232 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] No VIF found with MAC fa:16:3e:d9:67:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:56:53 compute-1 nova_compute[182713]: 2026-01-21 23:56:53.233 182717 INFO nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Using config drive
Jan 21 23:56:54 compute-1 nova_compute[182713]: 2026-01-21 23:56:54.352 182717 INFO nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Creating config drive at /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk.config
Jan 21 23:56:54 compute-1 nova_compute[182713]: 2026-01-21 23:56:54.361 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo3w3t95u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:54 compute-1 nova_compute[182713]: 2026-01-21 23:56:54.490 182717 DEBUG oslo_concurrency.processutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo3w3t95u" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:54 compute-1 kernel: tap53c314d0-2f: entered promiscuous mode
Jan 21 23:56:54 compute-1 NetworkManager[54952]: <info>  [1769039814.5749] manager: (tap53c314d0-2f): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Jan 21 23:56:54 compute-1 ovn_controller[94841]: 2026-01-21T23:56:54Z|00215|binding|INFO|Claiming lport 53c314d0-2f1b-4465-b514-6bdd134e1722 for this chassis.
Jan 21 23:56:54 compute-1 ovn_controller[94841]: 2026-01-21T23:56:54Z|00216|binding|INFO|53c314d0-2f1b-4465-b514-6bdd134e1722: Claiming fa:16:3e:d9:67:59 10.100.0.5
Jan 21 23:56:54 compute-1 nova_compute[182713]: 2026-01-21 23:56:54.581 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.602 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:67:59 10.100.0.5'], port_security=['fa:16:3e:d9:67:59 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dbef0790-08d1-4340-8088-615805f5e01f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e133733-21f6-4116-b549-db9e6b754b7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e35969ce39e84c9a8e6def5d9829f062', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f3bb3258-ae01-47a5-ba6d-04f80967d243', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eb19b89e-3b6e-4bdd-91be-b9ee619d6ef4, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=53c314d0-2f1b-4465-b514-6bdd134e1722) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.605 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 53c314d0-2f1b-4465-b514-6bdd134e1722 in datapath 8e133733-21f6-4116-b549-db9e6b754b7b bound to our chassis
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.608 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e133733-21f6-4116-b549-db9e6b754b7b
Jan 21 23:56:54 compute-1 systemd-machined[153970]: New machine qemu-32-instance-00000045.
Jan 21 23:56:54 compute-1 systemd-udevd[220517]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.628 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[83f0589d-6358-4590-af95-0db38d03e5e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.629 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e133733-21 in ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.633 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e133733-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.633 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0c25f75b-691e-4351-9137-f46e8bb87abb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.634 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b6df03c5-e1aa-4d25-82b8-7a3bae137772]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 ovn_controller[94841]: 2026-01-21T23:56:54Z|00217|binding|INFO|Setting lport 53c314d0-2f1b-4465-b514-6bdd134e1722 ovn-installed in OVS
Jan 21 23:56:54 compute-1 ovn_controller[94841]: 2026-01-21T23:56:54Z|00218|binding|INFO|Setting lport 53c314d0-2f1b-4465-b514-6bdd134e1722 up in Southbound
Jan 21 23:56:54 compute-1 nova_compute[182713]: 2026-01-21 23:56:54.646 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:54 compute-1 NetworkManager[54952]: <info>  [1769039814.6505] device (tap53c314d0-2f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.650 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2eb64e-1776-4a67-97f6-e22ebb2ac790]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 systemd[1]: Started Virtual Machine qemu-32-instance-00000045.
Jan 21 23:56:54 compute-1 NetworkManager[54952]: <info>  [1769039814.6526] device (tap53c314d0-2f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.666 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0b910ff7-df2f-4fb2-8b91-582c1bad6bc2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.694 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a49ce519-6d24-45af-9e0a-2aabfae4491b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 NetworkManager[54952]: <info>  [1769039814.7044] manager: (tap8e133733-20): new Veth device (/org/freedesktop/NetworkManager/Devices/107)
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.704 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[dd6f2720-11f4-4045-bc8d-3ef7925078e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.742 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e5df51-fff1-45a7-90ee-0f6fb01243fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.746 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[4057620b-a102-47e3-97c0-20ec7fd00cd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 NetworkManager[54952]: <info>  [1769039814.7741] device (tap8e133733-20): carrier: link connected
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.782 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ea855229-4a60-4a92-bc94-8f17887b4d8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 podman[220522]: 2026-01-21 23:56:54.790204461 +0000 UTC m=+0.097133583 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.809 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[11dd7278-23c2-4b54-8795-1c37e8e08531]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e133733-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:ef:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438742, 'reachable_time': 37515, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220569, 'error': None, 'target': 'ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.830 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[deb2c8fa-473b-4342-8412-f283aef37065]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:efad'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438742, 'tstamp': 438742}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220570, 'error': None, 'target': 'ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.853 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[813b0a0d-d059-4608-a7be-18828a3e282e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e133733-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:ef:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438742, 'reachable_time': 37515, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220571, 'error': None, 'target': 'ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.899 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c432f1fc-d7d9-4e6d-8f45-b1043bda1908]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.978 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1c0e82e0-1187-4144-a96c-e16ab4b223f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.981 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e133733-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.982 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.983 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e133733-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:54 compute-1 nova_compute[182713]: 2026-01-21 23:56:54.985 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:54 compute-1 NetworkManager[54952]: <info>  [1769039814.9859] manager: (tap8e133733-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 21 23:56:54 compute-1 kernel: tap8e133733-20: entered promiscuous mode
Jan 21 23:56:54 compute-1 nova_compute[182713]: 2026-01-21 23:56:54.987 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:54 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:54.988 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e133733-20, col_values=(('external_ids', {'iface-id': '2b7cff10-d507-4b6c-a9e3-6ec4a242f716'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:54 compute-1 nova_compute[182713]: 2026-01-21 23:56:54.990 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:54 compute-1 ovn_controller[94841]: 2026-01-21T23:56:54Z|00219|binding|INFO|Releasing lport 2b7cff10-d507-4b6c-a9e3-6ec4a242f716 from this chassis (sb_readonly=0)
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.007 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:55.009 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e133733-21f6-4116-b549-db9e6b754b7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e133733-21f6-4116-b549-db9e6b754b7b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:55.010 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9bad1120-7a30-40ba-b954-54fa7c796e84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:55.011 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-8e133733-21f6-4116-b549-db9e6b754b7b
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/8e133733-21f6-4116-b549-db9e6b754b7b.pid.haproxy
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 8e133733-21f6-4116-b549-db9e6b754b7b
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:56:55 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:55.012 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b', 'env', 'PROCESS_TAG=haproxy-8e133733-21f6-4116-b549-db9e6b754b7b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e133733-21f6-4116-b549-db9e6b754b7b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.090 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039815.08981, dbef0790-08d1-4340-8088-615805f5e01f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.091 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dbef0790-08d1-4340-8088-615805f5e01f] VM Started (Lifecycle Event)
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.120 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.132 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039815.090977, dbef0790-08d1-4340-8088-615805f5e01f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.132 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dbef0790-08d1-4340-8088-615805f5e01f] VM Paused (Lifecycle Event)
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.161 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.166 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.189 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dbef0790-08d1-4340-8088-615805f5e01f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:56:55 compute-1 podman[220610]: 2026-01-21 23:56:55.389895279 +0000 UTC m=+0.047615417 container create 4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:56:55 compute-1 systemd[1]: Started libpod-conmon-4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284.scope.
Jan 21 23:56:55 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:56:55 compute-1 podman[220610]: 2026-01-21 23:56:55.364501307 +0000 UTC m=+0.022221475 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:56:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/831fc28d30c0d7652ba9eb1a4c5812c9682df28d65b619c5356434c34bb252e7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:56:55 compute-1 podman[220610]: 2026-01-21 23:56:55.480092906 +0000 UTC m=+0.137813074 container init 4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 21 23:56:55 compute-1 podman[220610]: 2026-01-21 23:56:55.490159667 +0000 UTC m=+0.147879835 container start 4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:56:55 compute-1 neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b[220626]: [NOTICE]   (220630) : New worker (220632) forked
Jan 21 23:56:55 compute-1 neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b[220626]: [NOTICE]   (220630) : Loading success.
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.633 182717 DEBUG nova.compute.manager [req-4b88ca41-bbd1-48e5-912a-e6d91bb86363 req-cafbaba4-9f9b-4515-ab0b-8f8bbd531a04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Received event network-vif-plugged-53c314d0-2f1b-4465-b514-6bdd134e1722 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.634 182717 DEBUG oslo_concurrency.lockutils [req-4b88ca41-bbd1-48e5-912a-e6d91bb86363 req-cafbaba4-9f9b-4515-ab0b-8f8bbd531a04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dbef0790-08d1-4340-8088-615805f5e01f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.634 182717 DEBUG oslo_concurrency.lockutils [req-4b88ca41-bbd1-48e5-912a-e6d91bb86363 req-cafbaba4-9f9b-4515-ab0b-8f8bbd531a04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.635 182717 DEBUG oslo_concurrency.lockutils [req-4b88ca41-bbd1-48e5-912a-e6d91bb86363 req-cafbaba4-9f9b-4515-ab0b-8f8bbd531a04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.635 182717 DEBUG nova.compute.manager [req-4b88ca41-bbd1-48e5-912a-e6d91bb86363 req-cafbaba4-9f9b-4515-ab0b-8f8bbd531a04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Processing event network-vif-plugged-53c314d0-2f1b-4465-b514-6bdd134e1722 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.636 182717 DEBUG nova.compute.manager [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.641 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039815.6406543, dbef0790-08d1-4340-8088-615805f5e01f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.643 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dbef0790-08d1-4340-8088-615805f5e01f] VM Resumed (Lifecycle Event)
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.646 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.654 182717 INFO nova.virt.libvirt.driver [-] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Instance spawned successfully.
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.654 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.686 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.693 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.696 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.697 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.697 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.698 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.698 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.699 182717 DEBUG nova.virt.libvirt.driver [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.735 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dbef0790-08d1-4340-8088-615805f5e01f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.791 182717 INFO nova.compute.manager [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Took 7.51 seconds to spawn the instance on the hypervisor.
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.792 182717 DEBUG nova.compute.manager [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.886 182717 INFO nova.compute.manager [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Took 8.21 seconds to build instance.
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.903 182717 DEBUG oslo_concurrency.lockutils [None req-34cb9690-6bad-479b-99b8-b8910d03087c 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.971 182717 DEBUG nova.network.neutron [req-4e38c52c-a2ec-42c9-b991-208fd4210135 req-fa131b7c-4c56-4fbc-ae9c-b0ff00255b93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Updated VIF entry in instance network info cache for port 53c314d0-2f1b-4465-b514-6bdd134e1722. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.972 182717 DEBUG nova.network.neutron [req-4e38c52c-a2ec-42c9-b991-208fd4210135 req-fa131b7c-4c56-4fbc-ae9c-b0ff00255b93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Updating instance_info_cache with network_info: [{"id": "53c314d0-2f1b-4465-b514-6bdd134e1722", "address": "fa:16:3e:d9:67:59", "network": {"id": "8e133733-21f6-4116-b549-db9e6b754b7b", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-601456831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e35969ce39e84c9a8e6def5d9829f062", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53c314d0-2f", "ovs_interfaceid": "53c314d0-2f1b-4465-b514-6bdd134e1722", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:55 compute-1 nova_compute[182713]: 2026-01-21 23:56:55.997 182717 DEBUG oslo_concurrency.lockutils [req-4e38c52c-a2ec-42c9-b991-208fd4210135 req-fa131b7c-4c56-4fbc-ae9c-b0ff00255b93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-dbef0790-08d1-4340-8088-615805f5e01f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:56:56 compute-1 nova_compute[182713]: 2026-01-21 23:56:56.120 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:57 compute-1 nova_compute[182713]: 2026-01-21 23:56:57.742 182717 DEBUG nova.compute.manager [req-26c1fcaa-99e3-4ff7-bc87-5ba66241cbb2 req-081c3757-124c-4c12-af63-cc330f6b9114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Received event network-vif-plugged-53c314d0-2f1b-4465-b514-6bdd134e1722 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:57 compute-1 nova_compute[182713]: 2026-01-21 23:56:57.743 182717 DEBUG oslo_concurrency.lockutils [req-26c1fcaa-99e3-4ff7-bc87-5ba66241cbb2 req-081c3757-124c-4c12-af63-cc330f6b9114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dbef0790-08d1-4340-8088-615805f5e01f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:57 compute-1 nova_compute[182713]: 2026-01-21 23:56:57.744 182717 DEBUG oslo_concurrency.lockutils [req-26c1fcaa-99e3-4ff7-bc87-5ba66241cbb2 req-081c3757-124c-4c12-af63-cc330f6b9114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:57 compute-1 nova_compute[182713]: 2026-01-21 23:56:57.744 182717 DEBUG oslo_concurrency.lockutils [req-26c1fcaa-99e3-4ff7-bc87-5ba66241cbb2 req-081c3757-124c-4c12-af63-cc330f6b9114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:57 compute-1 nova_compute[182713]: 2026-01-21 23:56:57.745 182717 DEBUG nova.compute.manager [req-26c1fcaa-99e3-4ff7-bc87-5ba66241cbb2 req-081c3757-124c-4c12-af63-cc330f6b9114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] No waiting events found dispatching network-vif-plugged-53c314d0-2f1b-4465-b514-6bdd134e1722 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:57 compute-1 nova_compute[182713]: 2026-01-21 23:56:57.745 182717 WARNING nova.compute.manager [req-26c1fcaa-99e3-4ff7-bc87-5ba66241cbb2 req-081c3757-124c-4c12-af63-cc330f6b9114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Received unexpected event network-vif-plugged-53c314d0-2f1b-4465-b514-6bdd134e1722 for instance with vm_state active and task_state None.
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.030 182717 DEBUG oslo_concurrency.lockutils [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Acquiring lock "dbef0790-08d1-4340-8088-615805f5e01f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.031 182717 DEBUG oslo_concurrency.lockutils [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.031 182717 DEBUG oslo_concurrency.lockutils [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Acquiring lock "dbef0790-08d1-4340-8088-615805f5e01f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.032 182717 DEBUG oslo_concurrency.lockutils [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.033 182717 DEBUG oslo_concurrency.lockutils [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.047 182717 INFO nova.compute.manager [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Terminating instance
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.057 182717 DEBUG nova.compute.manager [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:56:58 compute-1 kernel: tap53c314d0-2f (unregistering): left promiscuous mode
Jan 21 23:56:58 compute-1 NetworkManager[54952]: <info>  [1769039818.0841] device (tap53c314d0-2f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:56:58 compute-1 ovn_controller[94841]: 2026-01-21T23:56:58Z|00220|binding|INFO|Releasing lport 53c314d0-2f1b-4465-b514-6bdd134e1722 from this chassis (sb_readonly=0)
Jan 21 23:56:58 compute-1 ovn_controller[94841]: 2026-01-21T23:56:58Z|00221|binding|INFO|Setting lport 53c314d0-2f1b-4465-b514-6bdd134e1722 down in Southbound
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.088 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 ovn_controller[94841]: 2026-01-21T23:56:58Z|00222|binding|INFO|Removing iface tap53c314d0-2f ovn-installed in OVS
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.092 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 ovn_controller[94841]: 2026-01-21T23:56:58Z|00223|binding|INFO|Releasing lport 2b7cff10-d507-4b6c-a9e3-6ec4a242f716 from this chassis (sb_readonly=0)
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.097 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:67:59 10.100.0.5'], port_security=['fa:16:3e:d9:67:59 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dbef0790-08d1-4340-8088-615805f5e01f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e133733-21f6-4116-b549-db9e6b754b7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e35969ce39e84c9a8e6def5d9829f062', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f3bb3258-ae01-47a5-ba6d-04f80967d243', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eb19b89e-3b6e-4bdd-91be-b9ee619d6ef4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=53c314d0-2f1b-4465-b514-6bdd134e1722) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.099 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 53c314d0-2f1b-4465-b514-6bdd134e1722 in datapath 8e133733-21f6-4116-b549-db9e6b754b7b unbound from our chassis
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.101 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e133733-21f6-4116-b549-db9e6b754b7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.102 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5a672381-a98f-436f-9d47-1a68691a2d7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.102 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b namespace which is not needed anymore
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.106 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.147 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.149 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000045.scope: Deactivated successfully.
Jan 21 23:56:58 compute-1 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000045.scope: Consumed 2.814s CPU time.
Jan 21 23:56:58 compute-1 systemd-machined[153970]: Machine qemu-32-instance-00000045 terminated.
Jan 21 23:56:58 compute-1 neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b[220626]: [NOTICE]   (220630) : haproxy version is 2.8.14-c23fe91
Jan 21 23:56:58 compute-1 neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b[220626]: [NOTICE]   (220630) : path to executable is /usr/sbin/haproxy
Jan 21 23:56:58 compute-1 neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b[220626]: [WARNING]  (220630) : Exiting Master process...
Jan 21 23:56:58 compute-1 neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b[220626]: [ALERT]    (220630) : Current worker (220632) exited with code 143 (Terminated)
Jan 21 23:56:58 compute-1 neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b[220626]: [WARNING]  (220630) : All workers exited. Exiting... (0)
Jan 21 23:56:58 compute-1 systemd[1]: libpod-4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284.scope: Deactivated successfully.
Jan 21 23:56:58 compute-1 podman[220665]: 2026-01-21 23:56:58.242442684 +0000 UTC m=+0.046749361 container died 4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 21 23:56:58 compute-1 systemd[1]: var-lib-containers-storage-overlay-831fc28d30c0d7652ba9eb1a4c5812c9682df28d65b619c5356434c34bb252e7-merged.mount: Deactivated successfully.
Jan 21 23:56:58 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284-userdata-shm.mount: Deactivated successfully.
Jan 21 23:56:58 compute-1 kernel: tap53c314d0-2f: entered promiscuous mode
Jan 21 23:56:58 compute-1 NetworkManager[54952]: <info>  [1769039818.2857] manager: (tap53c314d0-2f): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Jan 21 23:56:58 compute-1 kernel: tap53c314d0-2f (unregistering): left promiscuous mode
Jan 21 23:56:58 compute-1 podman[220665]: 2026-01-21 23:56:58.294430965 +0000 UTC m=+0.098737642 container cleanup 4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.326 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 ovn_controller[94841]: 2026-01-21T23:56:58Z|00224|binding|INFO|Claiming lport 53c314d0-2f1b-4465-b514-6bdd134e1722 for this chassis.
Jan 21 23:56:58 compute-1 ovn_controller[94841]: 2026-01-21T23:56:58Z|00225|binding|INFO|53c314d0-2f1b-4465-b514-6bdd134e1722: Claiming fa:16:3e:d9:67:59 10.100.0.5
Jan 21 23:56:58 compute-1 systemd[1]: libpod-conmon-4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284.scope: Deactivated successfully.
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.341 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:67:59 10.100.0.5'], port_security=['fa:16:3e:d9:67:59 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dbef0790-08d1-4340-8088-615805f5e01f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e133733-21f6-4116-b549-db9e6b754b7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e35969ce39e84c9a8e6def5d9829f062', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f3bb3258-ae01-47a5-ba6d-04f80967d243', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eb19b89e-3b6e-4bdd-91be-b9ee619d6ef4, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=53c314d0-2f1b-4465-b514-6bdd134e1722) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.370 182717 INFO nova.virt.libvirt.driver [-] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Instance destroyed successfully.
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.371 182717 DEBUG nova.objects.instance [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lazy-loading 'resources' on Instance uuid dbef0790-08d1-4340-8088-615805f5e01f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:58 compute-1 ovn_controller[94841]: 2026-01-21T23:56:58Z|00226|binding|INFO|Setting lport 53c314d0-2f1b-4465-b514-6bdd134e1722 ovn-installed in OVS
Jan 21 23:56:58 compute-1 ovn_controller[94841]: 2026-01-21T23:56:58Z|00227|binding|INFO|Setting lport 53c314d0-2f1b-4465-b514-6bdd134e1722 up in Southbound
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.381 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.389 182717 DEBUG nova.virt.libvirt.vif [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-424514161',display_name='tempest-ServerPasswordTestJSON-server-424514161',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-424514161',id=69,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:56:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e35969ce39e84c9a8e6def5d9829f062',ramdisk_id='',reservation_id='r-dh1gpvxm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServerPasswordTestJSON-1806469904',owner_user_name='tempest-ServerPasswordTestJSON-1806469904-project-member',password_0='',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:57Z,user_data=None,user_id='06bc08086b914d8f96a51e13ea95fd1a',uuid=dbef0790-08d1-4340-8088-615805f5e01f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "53c314d0-2f1b-4465-b514-6bdd134e1722", "address": "fa:16:3e:d9:67:59", "network": {"id": "8e133733-21f6-4116-b549-db9e6b754b7b", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-601456831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e35969ce39e84c9a8e6def5d9829f062", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53c314d0-2f", "ovs_interfaceid": "53c314d0-2f1b-4465-b514-6bdd134e1722", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.390 182717 DEBUG nova.network.os_vif_util [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Converting VIF {"id": "53c314d0-2f1b-4465-b514-6bdd134e1722", "address": "fa:16:3e:d9:67:59", "network": {"id": "8e133733-21f6-4116-b549-db9e6b754b7b", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-601456831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e35969ce39e84c9a8e6def5d9829f062", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53c314d0-2f", "ovs_interfaceid": "53c314d0-2f1b-4465-b514-6bdd134e1722", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.391 182717 DEBUG nova.network.os_vif_util [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:67:59,bridge_name='br-int',has_traffic_filtering=True,id=53c314d0-2f1b-4465-b514-6bdd134e1722,network=Network(8e133733-21f6-4116-b549-db9e6b754b7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53c314d0-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.391 182717 DEBUG os_vif [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:67:59,bridge_name='br-int',has_traffic_filtering=True,id=53c314d0-2f1b-4465-b514-6bdd134e1722,network=Network(8e133733-21f6-4116-b549-db9e6b754b7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53c314d0-2f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:56:58 compute-1 podman[220700]: 2026-01-21 23:56:58.391238736 +0000 UTC m=+0.044731059 container remove 4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.393 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.394 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap53c314d0-2f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.395 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c05eb276-9bc5-47f2-9596-4d0b329b072b]: (4, ('Wed Jan 21 11:56:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b (4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284)\n4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284\nWed Jan 21 11:56:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b (4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284)\n4bea52b2013e488e1f73a80d6d3383b821f4f0e36c46edea7754a05fb0183284\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:58 compute-1 ovn_controller[94841]: 2026-01-21T23:56:58Z|00228|binding|INFO|Releasing lport 53c314d0-2f1b-4465-b514-6bdd134e1722 from this chassis (sb_readonly=0)
Jan 21 23:56:58 compute-1 ovn_controller[94841]: 2026-01-21T23:56:58Z|00229|binding|INFO|Setting lport 53c314d0-2f1b-4465-b514-6bdd134e1722 down in Southbound
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.398 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.398 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb5fede-9a21-4c2b-a142-908430135df9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.399 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e133733-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.400 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.402 182717 INFO os_vif [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:67:59,bridge_name='br-int',has_traffic_filtering=True,id=53c314d0-2f1b-4465-b514-6bdd134e1722,network=Network(8e133733-21f6-4116-b549-db9e6b754b7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53c314d0-2f')
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.403 182717 INFO nova.virt.libvirt.driver [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Deleting instance files /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f_del
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.404 182717 INFO nova.virt.libvirt.driver [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Deletion of /var/lib/nova/instances/dbef0790-08d1-4340-8088-615805f5e01f_del complete
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.405 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:67:59 10.100.0.5'], port_security=['fa:16:3e:d9:67:59 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dbef0790-08d1-4340-8088-615805f5e01f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e133733-21f6-4116-b549-db9e6b754b7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e35969ce39e84c9a8e6def5d9829f062', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f3bb3258-ae01-47a5-ba6d-04f80967d243', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eb19b89e-3b6e-4bdd-91be-b9ee619d6ef4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=53c314d0-2f1b-4465-b514-6bdd134e1722) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.410 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 kernel: tap8e133733-20: left promiscuous mode
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.419 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.424 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8e335551-dcaa-4e9a-97c3-ada2b2aa8a71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.438 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee1f29d-87c1-4d40-8c0e-b29abd8f9c0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.440 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4d033bcd-0d4e-4503-9413-5ce467b03de4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.457 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d8dffb30-b5ae-4dcf-8a71-1150df983962]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438734, 'reachable_time': 33897, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220721, 'error': None, 'target': 'ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.460 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e133733-21f6-4116-b549-db9e6b754b7b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.461 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[728d57e2-1ddb-4079-b18a-c9ad0d02ee9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.461 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 53c314d0-2f1b-4465-b514-6bdd134e1722 in datapath 8e133733-21f6-4116-b549-db9e6b754b7b unbound from our chassis
Jan 21 23:56:58 compute-1 systemd[1]: run-netns-ovnmeta\x2d8e133733\x2d21f6\x2d4116\x2db549\x2ddb9e6b754b7b.mount: Deactivated successfully.
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.463 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e133733-21f6-4116-b549-db9e6b754b7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.463 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b41ebffa-2c2f-4e30-976b-d1cf0770cc16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.464 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 53c314d0-2f1b-4465-b514-6bdd134e1722 in datapath 8e133733-21f6-4116-b549-db9e6b754b7b unbound from our chassis
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.465 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e133733-21f6-4116-b549-db9e6b754b7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:58 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:56:58.465 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7343402c-e1a2-4b9b-8dab-c4d78820f675]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.480 182717 INFO nova.compute.manager [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.481 182717 DEBUG oslo.service.loopingcall [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.483 182717 DEBUG nova.compute.manager [-] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:56:58 compute-1 nova_compute[182713]: 2026-01-21 23:56:58.483 182717 DEBUG nova.network.neutron [-] [instance: dbef0790-08d1-4340-8088-615805f5e01f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.861 182717 DEBUG nova.network.neutron [-] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.892 182717 INFO nova.compute.manager [-] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Took 1.41 seconds to deallocate network for instance.
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.896 182717 DEBUG nova.compute.manager [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Received event network-vif-unplugged-53c314d0-2f1b-4465-b514-6bdd134e1722 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.896 182717 DEBUG oslo_concurrency.lockutils [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dbef0790-08d1-4340-8088-615805f5e01f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.896 182717 DEBUG oslo_concurrency.lockutils [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.896 182717 DEBUG oslo_concurrency.lockutils [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.897 182717 DEBUG nova.compute.manager [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] No waiting events found dispatching network-vif-unplugged-53c314d0-2f1b-4465-b514-6bdd134e1722 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.897 182717 DEBUG nova.compute.manager [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Received event network-vif-unplugged-53c314d0-2f1b-4465-b514-6bdd134e1722 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.897 182717 DEBUG nova.compute.manager [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Received event network-vif-plugged-53c314d0-2f1b-4465-b514-6bdd134e1722 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.897 182717 DEBUG oslo_concurrency.lockutils [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dbef0790-08d1-4340-8088-615805f5e01f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.897 182717 DEBUG oslo_concurrency.lockutils [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.898 182717 DEBUG oslo_concurrency.lockutils [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.898 182717 DEBUG nova.compute.manager [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] No waiting events found dispatching network-vif-plugged-53c314d0-2f1b-4465-b514-6bdd134e1722 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.898 182717 WARNING nova.compute.manager [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Received unexpected event network-vif-plugged-53c314d0-2f1b-4465-b514-6bdd134e1722 for instance with vm_state active and task_state deleting.
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.992 182717 DEBUG oslo_concurrency.lockutils [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:59 compute-1 nova_compute[182713]: 2026-01-21 23:56:59.993 182717 DEBUG oslo_concurrency.lockutils [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:00 compute-1 nova_compute[182713]: 2026-01-21 23:57:00.015 182717 DEBUG nova.compute.manager [req-242230dc-879e-4ff5-b284-b19b9bf1f3dc req-95a19683-aae3-49e6-b31d-ad2341340b10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Received event network-vif-deleted-53c314d0-2f1b-4465-b514-6bdd134e1722 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:57:00 compute-1 nova_compute[182713]: 2026-01-21 23:57:00.077 182717 DEBUG nova.compute.provider_tree [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:57:00 compute-1 nova_compute[182713]: 2026-01-21 23:57:00.105 182717 DEBUG nova.scheduler.client.report [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:57:00 compute-1 nova_compute[182713]: 2026-01-21 23:57:00.154 182717 DEBUG oslo_concurrency.lockutils [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:00 compute-1 nova_compute[182713]: 2026-01-21 23:57:00.179 182717 INFO nova.scheduler.client.report [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Deleted allocations for instance dbef0790-08d1-4340-8088-615805f5e01f
Jan 21 23:57:00 compute-1 nova_compute[182713]: 2026-01-21 23:57:00.262 182717 DEBUG oslo_concurrency.lockutils [None req-cfd0e38f-f678-4e3b-9a94-40ada9776192 06bc08086b914d8f96a51e13ea95fd1a e35969ce39e84c9a8e6def5d9829f062 - - default default] Lock "dbef0790-08d1-4340-8088-615805f5e01f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:01 compute-1 nova_compute[182713]: 2026-01-21 23:57:01.122 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:03.004 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:03.005 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:03.005 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:03 compute-1 nova_compute[182713]: 2026-01-21 23:57:03.398 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:03 compute-1 podman[220724]: 2026-01-21 23:57:03.557539144 +0000 UTC m=+0.054180340 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:57:03 compute-1 podman[220723]: 2026-01-21 23:57:03.623159475 +0000 UTC m=+0.120655266 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 21 23:57:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:03.866 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:57:03 compute-1 nova_compute[182713]: 2026-01-21 23:57:03.867 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:03.868 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:57:05 compute-1 nova_compute[182713]: 2026-01-21 23:57:05.253 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:06 compute-1 nova_compute[182713]: 2026-01-21 23:57:06.126 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:08 compute-1 nova_compute[182713]: 2026-01-21 23:57:08.402 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:10 compute-1 podman[220775]: 2026-01-21 23:57:10.576106944 +0000 UTC m=+0.067051576 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:57:10 compute-1 podman[220776]: 2026-01-21 23:57:10.603944321 +0000 UTC m=+0.091484008 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:57:11 compute-1 nova_compute[182713]: 2026-01-21 23:57:11.129 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:11.871 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:13 compute-1 nova_compute[182713]: 2026-01-21 23:57:13.368 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039818.3679006, dbef0790-08d1-4340-8088-615805f5e01f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:57:13 compute-1 nova_compute[182713]: 2026-01-21 23:57:13.369 182717 INFO nova.compute.manager [-] [instance: dbef0790-08d1-4340-8088-615805f5e01f] VM Stopped (Lifecycle Event)
Jan 21 23:57:13 compute-1 nova_compute[182713]: 2026-01-21 23:57:13.394 182717 DEBUG nova.compute.manager [None req-52475067-41da-4506-8eee-4aa89eb1dd00 - - - - - -] [instance: dbef0790-08d1-4340-8088-615805f5e01f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:57:13 compute-1 nova_compute[182713]: 2026-01-21 23:57:13.405 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:16 compute-1 nova_compute[182713]: 2026-01-21 23:57:16.131 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:16 compute-1 nova_compute[182713]: 2026-01-21 23:57:16.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:16 compute-1 nova_compute[182713]: 2026-01-21 23:57:16.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 23:57:16 compute-1 nova_compute[182713]: 2026-01-21 23:57:16.876 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 23:57:18 compute-1 nova_compute[182713]: 2026-01-21 23:57:18.409 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:19 compute-1 nova_compute[182713]: 2026-01-21 23:57:19.711 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:19 compute-1 nova_compute[182713]: 2026-01-21 23:57:19.712 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:19 compute-1 nova_compute[182713]: 2026-01-21 23:57:19.743 182717 DEBUG nova.compute.manager [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:57:19 compute-1 nova_compute[182713]: 2026-01-21 23:57:19.918 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:19 compute-1 nova_compute[182713]: 2026-01-21 23:57:19.919 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:19 compute-1 nova_compute[182713]: 2026-01-21 23:57:19.929 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:57:19 compute-1 nova_compute[182713]: 2026-01-21 23:57:19.930 182717 INFO nova.compute.claims [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.120 182717 DEBUG nova.compute.provider_tree [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.140 182717 DEBUG nova.scheduler.client.report [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.172 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.173 182717 DEBUG nova.compute.manager [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.302 182717 DEBUG nova.compute.manager [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.303 182717 DEBUG nova.network.neutron [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.332 182717 INFO nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.379 182717 DEBUG nova.compute.manager [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.526 182717 DEBUG nova.compute.manager [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.528 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.529 182717 INFO nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Creating image(s)
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.530 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.531 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.532 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.556 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.650 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.652 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.653 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.679 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.712 182717 DEBUG nova.policy [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.765 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.766 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.821 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.823 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.824 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.861 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.916 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.919 182717 DEBUG nova.virt.disk.api [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Checking if we can resize image /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.920 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.984 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.986 182717 DEBUG nova.virt.disk.api [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Cannot resize image /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:57:20 compute-1 nova_compute[182713]: 2026-01-21 23:57:20.986 182717 DEBUG nova.objects.instance [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'migration_context' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:57:21 compute-1 nova_compute[182713]: 2026-01-21 23:57:21.008 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:57:21 compute-1 nova_compute[182713]: 2026-01-21 23:57:21.008 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Ensure instance console log exists: /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:57:21 compute-1 nova_compute[182713]: 2026-01-21 23:57:21.009 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:21 compute-1 nova_compute[182713]: 2026-01-21 23:57:21.010 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:21 compute-1 nova_compute[182713]: 2026-01-21 23:57:21.010 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:21 compute-1 nova_compute[182713]: 2026-01-21 23:57:21.133 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:22 compute-1 nova_compute[182713]: 2026-01-21 23:57:22.521 182717 DEBUG nova.network.neutron [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Successfully created port: 43589933-1997-41c6-9aa3-54f71a1330b8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:57:22 compute-1 podman[220834]: 2026-01-21 23:57:22.603517524 +0000 UTC m=+0.090493428 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 23:57:23 compute-1 nova_compute[182713]: 2026-01-21 23:57:23.390 182717 DEBUG nova.network.neutron [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Successfully updated port: 43589933-1997-41c6-9aa3-54f71a1330b8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:57:23 compute-1 nova_compute[182713]: 2026-01-21 23:57:23.412 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:23 compute-1 nova_compute[182713]: 2026-01-21 23:57:23.417 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:57:23 compute-1 nova_compute[182713]: 2026-01-21 23:57:23.417 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:57:23 compute-1 nova_compute[182713]: 2026-01-21 23:57:23.418 182717 DEBUG nova.network.neutron [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:57:23 compute-1 nova_compute[182713]: 2026-01-21 23:57:23.487 182717 DEBUG nova.compute.manager [req-6dca04ea-1084-4aef-8624-0065cd6909cc req-d2b5c8d7-d846-4392-b75b-68633086f52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-changed-43589933-1997-41c6-9aa3-54f71a1330b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:57:23 compute-1 nova_compute[182713]: 2026-01-21 23:57:23.488 182717 DEBUG nova.compute.manager [req-6dca04ea-1084-4aef-8624-0065cd6909cc req-d2b5c8d7-d846-4392-b75b-68633086f52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Refreshing instance network info cache due to event network-changed-43589933-1997-41c6-9aa3-54f71a1330b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:57:23 compute-1 nova_compute[182713]: 2026-01-21 23:57:23.488 182717 DEBUG oslo_concurrency.lockutils [req-6dca04ea-1084-4aef-8624-0065cd6909cc req-d2b5c8d7-d846-4392-b75b-68633086f52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:57:23 compute-1 nova_compute[182713]: 2026-01-21 23:57:23.659 182717 DEBUG nova.network.neutron [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:57:24 compute-1 nova_compute[182713]: 2026-01-21 23:57:24.960 182717 DEBUG nova.network.neutron [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:57:24 compute-1 nova_compute[182713]: 2026-01-21 23:57:24.983 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:57:24 compute-1 nova_compute[182713]: 2026-01-21 23:57:24.984 182717 DEBUG nova.compute.manager [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Instance network_info: |[{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:57:24 compute-1 nova_compute[182713]: 2026-01-21 23:57:24.985 182717 DEBUG oslo_concurrency.lockutils [req-6dca04ea-1084-4aef-8624-0065cd6909cc req-d2b5c8d7-d846-4392-b75b-68633086f52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:57:24 compute-1 nova_compute[182713]: 2026-01-21 23:57:24.986 182717 DEBUG nova.network.neutron [req-6dca04ea-1084-4aef-8624-0065cd6909cc req-d2b5c8d7-d846-4392-b75b-68633086f52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Refreshing network info cache for port 43589933-1997-41c6-9aa3-54f71a1330b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:57:24 compute-1 nova_compute[182713]: 2026-01-21 23:57:24.993 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Start _get_guest_xml network_info=[{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.002 182717 WARNING nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.009 182717 DEBUG nova.virt.libvirt.host [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.010 182717 DEBUG nova.virt.libvirt.host [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.022 182717 DEBUG nova.virt.libvirt.host [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.023 182717 DEBUG nova.virt.libvirt.host [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.025 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.026 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.027 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.028 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.028 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.029 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.029 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.030 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.031 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.031 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.031 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.032 182717 DEBUG nova.virt.hardware [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.039 182717 DEBUG nova.virt.libvirt.vif [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:57:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.040 182717 DEBUG nova.network.os_vif_util [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.042 182717 DEBUG nova.network.os_vif_util [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:bc:4e,bridge_name='br-int',has_traffic_filtering=True,id=43589933-1997-41c6-9aa3-54f71a1330b8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43589933-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.050 182717 DEBUG nova.objects.instance [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'pci_devices' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.071 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:57:25 compute-1 nova_compute[182713]:   <uuid>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</uuid>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   <name>instance-00000047</name>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:57:25</nova:creationTime>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:57:25 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:57:25 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:57:25 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:57:25 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:57:25 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:57:25 compute-1 nova_compute[182713]:         <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:57:25 compute-1 nova_compute[182713]:         <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:57:25 compute-1 nova_compute[182713]:         <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:57:25 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <system>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <entry name="serial">c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <entry name="uuid">c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     </system>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   <os>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   </os>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   <features>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   </features>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.config"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:3e:bc:4e"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <target dev="tap43589933-19"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log" append="off"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <video>
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     </video>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:57:25 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:57:25 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:57:25 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:57:25 compute-1 nova_compute[182713]: </domain>
Jan 21 23:57:25 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.072 182717 DEBUG nova.compute.manager [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Preparing to wait for external event network-vif-plugged-43589933-1997-41c6-9aa3-54f71a1330b8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.072 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.072 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.073 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.073 182717 DEBUG nova.virt.libvirt.vif [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:57:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.074 182717 DEBUG nova.network.os_vif_util [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.074 182717 DEBUG nova.network.os_vif_util [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:bc:4e,bridge_name='br-int',has_traffic_filtering=True,id=43589933-1997-41c6-9aa3-54f71a1330b8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43589933-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.075 182717 DEBUG os_vif [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:bc:4e,bridge_name='br-int',has_traffic_filtering=True,id=43589933-1997-41c6-9aa3-54f71a1330b8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43589933-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.076 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.076 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.077 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.081 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.081 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43589933-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.082 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap43589933-19, col_values=(('external_ids', {'iface-id': '43589933-1997-41c6-9aa3-54f71a1330b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:bc:4e', 'vm-uuid': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.084 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:25 compute-1 NetworkManager[54952]: <info>  [1769039845.0856] manager: (tap43589933-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.086 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.090 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.091 182717 INFO os_vif [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:bc:4e,bridge_name='br-int',has_traffic_filtering=True,id=43589933-1997-41c6-9aa3-54f71a1330b8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43589933-19')
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.191 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.192 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.193 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:3e:bc:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.194 182717 INFO nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Using config drive
Jan 21 23:57:25 compute-1 podman[220857]: 2026-01-21 23:57:25.226888682 +0000 UTC m=+0.094144801 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.702 182717 INFO nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Creating config drive at /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.config
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.711 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb5790w5f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.858 182717 DEBUG oslo_concurrency.processutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb5790w5f" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.877 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.878 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:57:25 compute-1 kernel: tap43589933-19: entered promiscuous mode
Jan 21 23:57:25 compute-1 NetworkManager[54952]: <info>  [1769039845.9595] manager: (tap43589933-19): new Tun device (/org/freedesktop/NetworkManager/Devices/111)
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.959 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:25 compute-1 ovn_controller[94841]: 2026-01-21T23:57:25Z|00230|binding|INFO|Claiming lport 43589933-1997-41c6-9aa3-54f71a1330b8 for this chassis.
Jan 21 23:57:25 compute-1 ovn_controller[94841]: 2026-01-21T23:57:25Z|00231|binding|INFO|43589933-1997-41c6-9aa3-54f71a1330b8: Claiming fa:16:3e:3e:bc:4e 10.100.0.9
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.962 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:25 compute-1 nova_compute[182713]: 2026-01-21 23:57:25.973 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:25.982 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:bc:4e 10.100.0.9'], port_security=['fa:16:3e:3e:bc:4e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3de5207d-5e5a-404a-9582-14ba2d715885', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=43589933-1997-41c6-9aa3-54f71a1330b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:57:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:25.984 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 43589933-1997-41c6-9aa3-54f71a1330b8 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 bound to our chassis
Jan 21 23:57:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:25.987 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:57:25 compute-1 systemd-machined[153970]: New machine qemu-33-instance-00000047.
Jan 21 23:57:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:25.997 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d78b79ef-1fbc-4d15-9a83-91b9d5b403a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:25.998 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1995baab-01 in ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.001 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1995baab-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.001 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[48e97ca3-c2e8-4a7b-b020-e0174c97dd95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.002 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[32b358bf-db49-40a1-a13c-08b9a2c661d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.017 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9f69e9-e827-4b28-aca2-5ac220bd7486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 systemd[1]: Started Virtual Machine qemu-33-instance-00000047.
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.032 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:26 compute-1 ovn_controller[94841]: 2026-01-21T23:57:26Z|00232|binding|INFO|Setting lport 43589933-1997-41c6-9aa3-54f71a1330b8 ovn-installed in OVS
Jan 21 23:57:26 compute-1 ovn_controller[94841]: 2026-01-21T23:57:26Z|00233|binding|INFO|Setting lport 43589933-1997-41c6-9aa3-54f71a1330b8 up in Southbound
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.036 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:26 compute-1 systemd-udevd[220898]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.043 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[32f34830-101d-4792-81ed-fdb52fb8b2dc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 NetworkManager[54952]: <info>  [1769039846.0550] device (tap43589933-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:57:26 compute-1 NetworkManager[54952]: <info>  [1769039846.0557] device (tap43589933-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.075 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[110fa892-e9e3-4ddd-bd37-1f5305d79268]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 systemd-udevd[220901]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:57:26 compute-1 NetworkManager[54952]: <info>  [1769039846.0835] manager: (tap1995baab-00): new Veth device (/org/freedesktop/NetworkManager/Devices/112)
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.081 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[da3e05b0-f75e-4c47-a8c6-88c2f9a838d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.121 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[622a7c18-2807-42b6-bf4a-984a96f9f84d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.124 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[aba01e98-7e11-448f-9f4b-5b71107637c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.134 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:26 compute-1 NetworkManager[54952]: <info>  [1769039846.1484] device (tap1995baab-00): carrier: link connected
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.154 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[cd68c313-a4f8-46a6-a63e-7b0f3f6e910b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.177 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f12a5ec9-cb99-4317-b8e6-b96fbbad8f80]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441880, 'reachable_time': 27223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220928, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.198 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9427f33d-2656-443f-815d-cf257ac25b1e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4b:ff2f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441880, 'tstamp': 441880}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220929, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.221 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[db8b1967-b8f5-45e4-977e-bed7eeacce22]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441880, 'reachable_time': 27223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220930, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.271 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[19694f2f-bd1f-42ef-b27c-af9714908925]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.347 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bf34ae97-f3c6-4e43-a887-a9f5d3fe0339]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.349 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.349 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.350 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:26 compute-1 kernel: tap1995baab-00: entered promiscuous mode
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.352 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:26 compute-1 NetworkManager[54952]: <info>  [1769039846.3525] manager: (tap1995baab-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.353 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.360 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:26 compute-1 ovn_controller[94841]: 2026-01-21T23:57:26Z|00234|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.361 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.374 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.375 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[67afa02a-74d0-48e9-b35d-ca153f35c73c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.376 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:57:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:26.377 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'env', 'PROCESS_TAG=haproxy-1995baab-0f8d-4658-a4fc-2d21868dc592', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1995baab-0f8d-4658-a4fc-2d21868dc592.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.380 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.477 182717 DEBUG nova.network.neutron [req-6dca04ea-1084-4aef-8624-0065cd6909cc req-d2b5c8d7-d846-4392-b75b-68633086f52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updated VIF entry in instance network info cache for port 43589933-1997-41c6-9aa3-54f71a1330b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.478 182717 DEBUG nova.network.neutron [req-6dca04ea-1084-4aef-8624-0065cd6909cc req-d2b5c8d7-d846-4392-b75b-68633086f52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.498 182717 DEBUG oslo_concurrency.lockutils [req-6dca04ea-1084-4aef-8624-0065cd6909cc req-d2b5c8d7-d846-4392-b75b-68633086f52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.504 182717 DEBUG nova.compute.manager [req-441bf282-ca46-4f5c-bf4a-9a2b56a0576c req-c1d4be9c-2232-4b66-8dca-57e8625ec3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-plugged-43589933-1997-41c6-9aa3-54f71a1330b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.505 182717 DEBUG oslo_concurrency.lockutils [req-441bf282-ca46-4f5c-bf4a-9a2b56a0576c req-c1d4be9c-2232-4b66-8dca-57e8625ec3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.505 182717 DEBUG oslo_concurrency.lockutils [req-441bf282-ca46-4f5c-bf4a-9a2b56a0576c req-c1d4be9c-2232-4b66-8dca-57e8625ec3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.505 182717 DEBUG oslo_concurrency.lockutils [req-441bf282-ca46-4f5c-bf4a-9a2b56a0576c req-c1d4be9c-2232-4b66-8dca-57e8625ec3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.505 182717 DEBUG nova.compute.manager [req-441bf282-ca46-4f5c-bf4a-9a2b56a0576c req-c1d4be9c-2232-4b66-8dca-57e8625ec3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Processing event network-vif-plugged-43589933-1997-41c6-9aa3-54f71a1330b8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.530 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039846.530332, c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.531 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] VM Started (Lifecycle Event)
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.532 182717 DEBUG nova.compute.manager [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.535 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.538 182717 INFO nova.virt.libvirt.driver [-] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Instance spawned successfully.
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.539 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.566 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.570 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.571 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.571 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.571 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.572 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.572 182717 DEBUG nova.virt.libvirt.driver [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.575 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.615 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.615 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039846.5304477, c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.615 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] VM Paused (Lifecycle Event)
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.648 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.651 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039846.5347915, c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.651 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] VM Resumed (Lifecycle Event)
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.684 182717 INFO nova.compute.manager [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Took 6.16 seconds to spawn the instance on the hypervisor.
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.685 182717 DEBUG nova.compute.manager [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.690 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.702 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.753 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.809 182717 INFO nova.compute.manager [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Took 6.95 seconds to build instance.
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.832 182717 DEBUG oslo_concurrency.lockutils [None req-6c57dfb1-0b39-482f-915e-356d6ff9cfe4 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:26 compute-1 podman[220969]: 2026-01-21 23:57:26.841642869 +0000 UTC m=+0.081804550 container create dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:26 compute-1 nova_compute[182713]: 2026-01-21 23:57:26.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 23:57:26 compute-1 podman[220969]: 2026-01-21 23:57:26.794164067 +0000 UTC m=+0.034325788 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:57:26 compute-1 systemd[1]: Started libpod-conmon-dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5.scope.
Jan 21 23:57:26 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:57:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31ef042e5cdf851c57dd883f14c43712598f53809934173ed133db205e2c95b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:57:26 compute-1 podman[220969]: 2026-01-21 23:57:26.95631219 +0000 UTC m=+0.196473861 container init dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 21 23:57:26 compute-1 podman[220969]: 2026-01-21 23:57:26.963121401 +0000 UTC m=+0.203283042 container start dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 23:57:26 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[220984]: [NOTICE]   (220988) : New worker (220990) forked
Jan 21 23:57:26 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[220984]: [NOTICE]   (220988) : Loading success.
Jan 21 23:57:27 compute-1 nova_compute[182713]: 2026-01-21 23:57:27.873 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:28 compute-1 nova_compute[182713]: 2026-01-21 23:57:28.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:29 compute-1 nova_compute[182713]: 2026-01-21 23:57:29.020 182717 DEBUG nova.compute.manager [req-7f4cb6ba-7c88-4610-98ec-6fd138b44fa8 req-fc7455cb-7449-4711-9e89-48dd18f093ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-plugged-43589933-1997-41c6-9aa3-54f71a1330b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:57:29 compute-1 nova_compute[182713]: 2026-01-21 23:57:29.021 182717 DEBUG oslo_concurrency.lockutils [req-7f4cb6ba-7c88-4610-98ec-6fd138b44fa8 req-fc7455cb-7449-4711-9e89-48dd18f093ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:29 compute-1 nova_compute[182713]: 2026-01-21 23:57:29.022 182717 DEBUG oslo_concurrency.lockutils [req-7f4cb6ba-7c88-4610-98ec-6fd138b44fa8 req-fc7455cb-7449-4711-9e89-48dd18f093ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:29 compute-1 nova_compute[182713]: 2026-01-21 23:57:29.022 182717 DEBUG oslo_concurrency.lockutils [req-7f4cb6ba-7c88-4610-98ec-6fd138b44fa8 req-fc7455cb-7449-4711-9e89-48dd18f093ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:29 compute-1 nova_compute[182713]: 2026-01-21 23:57:29.023 182717 DEBUG nova.compute.manager [req-7f4cb6ba-7c88-4610-98ec-6fd138b44fa8 req-fc7455cb-7449-4711-9e89-48dd18f093ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-plugged-43589933-1997-41c6-9aa3-54f71a1330b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:57:29 compute-1 nova_compute[182713]: 2026-01-21 23:57:29.023 182717 WARNING nova.compute.manager [req-7f4cb6ba-7c88-4610-98ec-6fd138b44fa8 req-fc7455cb-7449-4711-9e89-48dd18f093ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received unexpected event network-vif-plugged-43589933-1997-41c6-9aa3-54f71a1330b8 for instance with vm_state active and task_state None.
Jan 21 23:57:30 compute-1 nova_compute[182713]: 2026-01-21 23:57:30.086 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:30 compute-1 nova_compute[182713]: 2026-01-21 23:57:30.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:30 compute-1 nova_compute[182713]: 2026-01-21 23:57:30.892 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:30 compute-1 nova_compute[182713]: 2026-01-21 23:57:30.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:30 compute-1 nova_compute[182713]: 2026-01-21 23:57:30.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:30 compute-1 nova_compute[182713]: 2026-01-21 23:57:30.894 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:57:30 compute-1 nova_compute[182713]: 2026-01-21 23:57:30.977 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.088 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk --force-share --output=json" returned: 0 in 0.111s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.090 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.138 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.148 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.310 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.312 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5531MB free_disk=73.30311584472656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.312 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.312 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.482 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.483 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.483 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.587 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.614 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.640 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.641 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.643 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:31 compute-1 NetworkManager[54952]: <info>  [1769039851.6450] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Jan 21 23:57:31 compute-1 NetworkManager[54952]: <info>  [1769039851.6462] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.784 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:31 compute-1 ovn_controller[94841]: 2026-01-21T23:57:31Z|00235|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 23:57:31 compute-1 nova_compute[182713]: 2026-01-21 23:57:31.807 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:32 compute-1 nova_compute[182713]: 2026-01-21 23:57:32.637 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:32 compute-1 nova_compute[182713]: 2026-01-21 23:57:32.638 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:32 compute-1 nova_compute[182713]: 2026-01-21 23:57:32.639 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:57:32 compute-1 nova_compute[182713]: 2026-01-21 23:57:32.641 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:57:33 compute-1 nova_compute[182713]: 2026-01-21 23:57:33.439 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:57:33 compute-1 nova_compute[182713]: 2026-01-21 23:57:33.440 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:57:33 compute-1 nova_compute[182713]: 2026-01-21 23:57:33.441 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:57:33 compute-1 nova_compute[182713]: 2026-01-21 23:57:33.441 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:57:33 compute-1 nova_compute[182713]: 2026-01-21 23:57:33.759 182717 DEBUG nova.compute.manager [req-483b06fe-ad88-4d08-adb6-33808cd4b8b0 req-5afde2d0-9c6c-459a-abe7-663549056a2f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-changed-43589933-1997-41c6-9aa3-54f71a1330b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:57:33 compute-1 nova_compute[182713]: 2026-01-21 23:57:33.760 182717 DEBUG nova.compute.manager [req-483b06fe-ad88-4d08-adb6-33808cd4b8b0 req-5afde2d0-9c6c-459a-abe7-663549056a2f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Refreshing instance network info cache due to event network-changed-43589933-1997-41c6-9aa3-54f71a1330b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:57:33 compute-1 nova_compute[182713]: 2026-01-21 23:57:33.761 182717 DEBUG oslo_concurrency.lockutils [req-483b06fe-ad88-4d08-adb6-33808cd4b8b0 req-5afde2d0-9c6c-459a-abe7-663549056a2f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:57:34 compute-1 podman[221008]: 2026-01-21 23:57:34.604985835 +0000 UTC m=+0.095410630 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:57:34 compute-1 podman[221007]: 2026-01-21 23:57:34.634506913 +0000 UTC m=+0.120729978 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 23:57:35 compute-1 nova_compute[182713]: 2026-01-21 23:57:35.126 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:35 compute-1 nova_compute[182713]: 2026-01-21 23:57:35.558 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:57:35 compute-1 nova_compute[182713]: 2026-01-21 23:57:35.581 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:57:35 compute-1 nova_compute[182713]: 2026-01-21 23:57:35.582 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:57:35 compute-1 nova_compute[182713]: 2026-01-21 23:57:35.583 182717 DEBUG oslo_concurrency.lockutils [req-483b06fe-ad88-4d08-adb6-33808cd4b8b0 req-5afde2d0-9c6c-459a-abe7-663549056a2f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:57:35 compute-1 nova_compute[182713]: 2026-01-21 23:57:35.583 182717 DEBUG nova.network.neutron [req-483b06fe-ad88-4d08-adb6-33808cd4b8b0 req-5afde2d0-9c6c-459a-abe7-663549056a2f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Refreshing network info cache for port 43589933-1997-41c6-9aa3-54f71a1330b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:57:35 compute-1 nova_compute[182713]: 2026-01-21 23:57:35.586 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:35 compute-1 nova_compute[182713]: 2026-01-21 23:57:35.800 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:36 compute-1 nova_compute[182713]: 2026-01-21 23:57:36.140 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:38 compute-1 ovn_controller[94841]: 2026-01-21T23:57:38Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3e:bc:4e 10.100.0.9
Jan 21 23:57:38 compute-1 ovn_controller[94841]: 2026-01-21T23:57:38Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3e:bc:4e 10.100.0.9
Jan 21 23:57:38 compute-1 nova_compute[182713]: 2026-01-21 23:57:38.534 182717 DEBUG nova.network.neutron [req-483b06fe-ad88-4d08-adb6-33808cd4b8b0 req-5afde2d0-9c6c-459a-abe7-663549056a2f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updated VIF entry in instance network info cache for port 43589933-1997-41c6-9aa3-54f71a1330b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:57:38 compute-1 nova_compute[182713]: 2026-01-21 23:57:38.536 182717 DEBUG nova.network.neutron [req-483b06fe-ad88-4d08-adb6-33808cd4b8b0 req-5afde2d0-9c6c-459a-abe7-663549056a2f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:57:38 compute-1 nova_compute[182713]: 2026-01-21 23:57:38.564 182717 DEBUG oslo_concurrency.lockutils [req-483b06fe-ad88-4d08-adb6-33808cd4b8b0 req-5afde2d0-9c6c-459a-abe7-663549056a2f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:57:40 compute-1 nova_compute[182713]: 2026-01-21 23:57:40.155 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:41 compute-1 nova_compute[182713]: 2026-01-21 23:57:41.143 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:41 compute-1 podman[221071]: 2026-01-21 23:57:41.575457424 +0000 UTC m=+0.066088826 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 23:57:41 compute-1 podman[221072]: 2026-01-21 23:57:41.599724331 +0000 UTC m=+0.082890513 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:57:45 compute-1 nova_compute[182713]: 2026-01-21 23:57:45.159 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:45 compute-1 nova_compute[182713]: 2026-01-21 23:57:45.750 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:46 compute-1 nova_compute[182713]: 2026-01-21 23:57:46.145 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:49 compute-1 nova_compute[182713]: 2026-01-21 23:57:49.947 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:50 compute-1 nova_compute[182713]: 2026-01-21 23:57:50.162 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:50 compute-1 nova_compute[182713]: 2026-01-21 23:57:50.513 182717 DEBUG oslo_concurrency.lockutils [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:50 compute-1 nova_compute[182713]: 2026-01-21 23:57:50.513 182717 DEBUG oslo_concurrency.lockutils [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:50 compute-1 nova_compute[182713]: 2026-01-21 23:57:50.514 182717 DEBUG nova.objects.instance [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'flavor' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:57:50 compute-1 nova_compute[182713]: 2026-01-21 23:57:50.540 182717 DEBUG nova.objects.instance [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'pci_requests' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:57:50 compute-1 nova_compute[182713]: 2026-01-21 23:57:50.552 182717 DEBUG nova.network.neutron [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:57:50 compute-1 nova_compute[182713]: 2026-01-21 23:57:50.859 182717 DEBUG nova.policy [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:57:51 compute-1 nova_compute[182713]: 2026-01-21 23:57:51.148 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:51 compute-1 nova_compute[182713]: 2026-01-21 23:57:51.721 182717 DEBUG nova.network.neutron [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Successfully created port: 63932621-f0d1-4f08-8ce5-b5fa120bcc62 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:57:53 compute-1 podman[221111]: 2026-01-21 23:57:53.611536469 +0000 UTC m=+0.088275820 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 21 23:57:55 compute-1 nova_compute[182713]: 2026-01-21 23:57:55.166 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:55 compute-1 podman[221131]: 2026-01-21 23:57:55.61381813 +0000 UTC m=+0.096794853 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Jan 21 23:57:55 compute-1 nova_compute[182713]: 2026-01-21 23:57:55.654 182717 DEBUG nova.network.neutron [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Successfully updated port: 63932621-f0d1-4f08-8ce5-b5fa120bcc62 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:57:55 compute-1 nova_compute[182713]: 2026-01-21 23:57:55.673 182717 DEBUG oslo_concurrency.lockutils [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:57:55 compute-1 nova_compute[182713]: 2026-01-21 23:57:55.674 182717 DEBUG oslo_concurrency.lockutils [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:57:55 compute-1 nova_compute[182713]: 2026-01-21 23:57:55.674 182717 DEBUG nova.network.neutron [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:57:55 compute-1 nova_compute[182713]: 2026-01-21 23:57:55.876 182717 WARNING nova.network.neutron [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] 1995baab-0f8d-4658-a4fc-2d21868dc592 already exists in list: networks containing: ['1995baab-0f8d-4658-a4fc-2d21868dc592']. ignoring it
Jan 21 23:57:55 compute-1 nova_compute[182713]: 2026-01-21 23:57:55.972 182717 DEBUG nova.compute.manager [req-e8402705-62a3-49a1-b2c2-3cd9131cc433 req-1159d968-9855-44bf-8d13-f0eddcf76a95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-changed-63932621-f0d1-4f08-8ce5-b5fa120bcc62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:57:55 compute-1 nova_compute[182713]: 2026-01-21 23:57:55.973 182717 DEBUG nova.compute.manager [req-e8402705-62a3-49a1-b2c2-3cd9131cc433 req-1159d968-9855-44bf-8d13-f0eddcf76a95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Refreshing instance network info cache due to event network-changed-63932621-f0d1-4f08-8ce5-b5fa120bcc62. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:57:55 compute-1 nova_compute[182713]: 2026-01-21 23:57:55.973 182717 DEBUG oslo_concurrency.lockutils [req-e8402705-62a3-49a1-b2c2-3cd9131cc433 req-1159d968-9855-44bf-8d13-f0eddcf76a95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:57:56 compute-1 nova_compute[182713]: 2026-01-21 23:57:56.151 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:57 compute-1 nova_compute[182713]: 2026-01-21 23:57:57.954 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Acquiring lock "337d88a9-a34b-4c90-bf0d-0418533ae52d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:57 compute-1 nova_compute[182713]: 2026-01-21 23:57:57.955 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:57 compute-1 nova_compute[182713]: 2026-01-21 23:57:57.982 182717 DEBUG nova.compute.manager [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.224 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.225 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.236 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.236 182717 INFO nova.compute.claims [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.409 182717 DEBUG nova.compute.provider_tree [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.434 182717 DEBUG nova.scheduler.client.report [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.483 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.485 182717 DEBUG nova.compute.manager [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.578 182717 DEBUG nova.compute.manager [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.579 182717 DEBUG nova.network.neutron [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.605 182717 INFO nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.634 182717 DEBUG nova.compute.manager [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.758 182717 DEBUG nova.compute.manager [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.761 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.763 182717 INFO nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Creating image(s)
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.764 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Acquiring lock "/var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.765 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "/var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.766 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "/var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.801 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.893 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.896 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.897 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:58 compute-1 nova_compute[182713]: 2026-01-21 23:57:58.923 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.001 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.003 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.057 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.058 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.058 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.152 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.154 182717 DEBUG nova.virt.disk.api [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Checking if we can resize image /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.154 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.224 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.225 182717 DEBUG nova.virt.disk.api [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Cannot resize image /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.226 182717 DEBUG nova.objects.instance [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lazy-loading 'migration_context' on Instance uuid 337d88a9-a34b-4c90-bf0d-0418533ae52d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.256 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.257 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Ensure instance console log exists: /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.258 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.258 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.259 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.437 182717 DEBUG nova.policy [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.464 182717 DEBUG nova.network.neutron [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.494 182717 DEBUG oslo_concurrency.lockutils [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.497 182717 DEBUG oslo_concurrency.lockutils [req-e8402705-62a3-49a1-b2c2-3cd9131cc433 req-1159d968-9855-44bf-8d13-f0eddcf76a95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.497 182717 DEBUG nova.network.neutron [req-e8402705-62a3-49a1-b2c2-3cd9131cc433 req-1159d968-9855-44bf-8d13-f0eddcf76a95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Refreshing network info cache for port 63932621-f0d1-4f08-8ce5-b5fa120bcc62 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.502 182717 DEBUG nova.virt.libvirt.vif [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.503 182717 DEBUG nova.network.os_vif_util [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.505 182717 DEBUG nova.network.os_vif_util [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.505 182717 DEBUG os_vif [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.506 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.507 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.508 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.516 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.517 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63932621-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.518 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap63932621-f0, col_values=(('external_ids', {'iface-id': '63932621-f0d1-4f08-8ce5-b5fa120bcc62', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:c6:cf', 'vm-uuid': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:59 compute-1 NetworkManager[54952]: <info>  [1769039879.5224] manager: (tap63932621-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.526 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.531 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.532 182717 INFO os_vif [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0')
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.533 182717 DEBUG nova.virt.libvirt.vif [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.533 182717 DEBUG nova.network.os_vif_util [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.534 182717 DEBUG nova.network.os_vif_util [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.537 182717 DEBUG nova.virt.libvirt.guest [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] attach device xml: <interface type="ethernet">
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:30:c6:cf"/>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <target dev="tap63932621-f0"/>
Jan 21 23:57:59 compute-1 nova_compute[182713]: </interface>
Jan 21 23:57:59 compute-1 nova_compute[182713]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 21 23:57:59 compute-1 kernel: tap63932621-f0: entered promiscuous mode
Jan 21 23:57:59 compute-1 NetworkManager[54952]: <info>  [1769039879.5560] manager: (tap63932621-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Jan 21 23:57:59 compute-1 ovn_controller[94841]: 2026-01-21T23:57:59Z|00236|binding|INFO|Claiming lport 63932621-f0d1-4f08-8ce5-b5fa120bcc62 for this chassis.
Jan 21 23:57:59 compute-1 ovn_controller[94841]: 2026-01-21T23:57:59Z|00237|binding|INFO|63932621-f0d1-4f08-8ce5-b5fa120bcc62: Claiming fa:16:3e:30:c6:cf 10.100.0.14
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.560 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.575 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:c6:cf 10.100.0.14'], port_security=['fa:16:3e:30:c6:cf 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '2', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=63932621-f0d1-4f08-8ce5-b5fa120bcc62) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.578 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 63932621-f0d1-4f08-8ce5-b5fa120bcc62 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 bound to our chassis
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.581 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:57:59 compute-1 ovn_controller[94841]: 2026-01-21T23:57:59Z|00238|binding|INFO|Setting lport 63932621-f0d1-4f08-8ce5-b5fa120bcc62 ovn-installed in OVS
Jan 21 23:57:59 compute-1 ovn_controller[94841]: 2026-01-21T23:57:59Z|00239|binding|INFO|Setting lport 63932621-f0d1-4f08-8ce5-b5fa120bcc62 up in Southbound
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.587 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.593 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-1 systemd-udevd[221173]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:57:59 compute-1 NetworkManager[54952]: <info>  [1769039879.6128] device (tap63932621-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:57:59 compute-1 NetworkManager[54952]: <info>  [1769039879.6142] device (tap63932621-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.611 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[93704a3b-73f5-44ce-aa70-5840ed2624a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.655 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0dd6d609-b85d-4a81-80ee-a78366166539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.659 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[91874271-2444-4a14-951a-c98b758b4d08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.691 182717 DEBUG nova.virt.libvirt.driver [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.692 182717 DEBUG nova.virt.libvirt.driver [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.692 182717 DEBUG nova.virt.libvirt.driver [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:3e:bc:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.693 182717 DEBUG nova.virt.libvirt.driver [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:30:c6:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.701 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c727b778-e76c-44d1-9b72-e90910dadd2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.722 182717 DEBUG nova.virt.libvirt.guest [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:57:59</nova:creationTime>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:57:59 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:57:59 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:57:59 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:57:59 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:57:59 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:57:59 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:57:59 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:57:59 compute-1 nova_compute[182713]:     <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:57:59 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:57:59 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:57:59 compute-1 nova_compute[182713]:     <nova:port uuid="63932621-f0d1-4f08-8ce5-b5fa120bcc62">
Jan 21 23:57:59 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 23:57:59 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:57:59 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:57:59 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:57:59 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.726 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7b088315-843d-4693-a89a-e0eabe7ca7b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441880, 'reachable_time': 27223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221181, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.749 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[93542be3-f5ba-4e57-9e48-d12c975ec142]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441895, 'tstamp': 441895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221182, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441899, 'tstamp': 441899}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221182, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.751 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.755 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.756 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.757 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:59 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:57:59.757 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.760 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-1 nova_compute[182713]: 2026-01-21 23:57:59.765 182717 DEBUG oslo_concurrency.lockutils [None req-54c0c702-83b3-4d24-b872-00ab5b25b1e8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:00 compute-1 ovn_controller[94841]: 2026-01-21T23:58:00Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:c6:cf 10.100.0.14
Jan 21 23:58:00 compute-1 ovn_controller[94841]: 2026-01-21T23:58:00Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:c6:cf 10.100.0.14
Jan 21 23:58:01 compute-1 nova_compute[182713]: 2026-01-21 23:58:01.063 182717 DEBUG nova.network.neutron [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Successfully created port: 0d50f3fc-9e12-4e56-ba52-98ff14988caa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:58:01 compute-1 nova_compute[182713]: 2026-01-21 23:58:01.155 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:01 compute-1 nova_compute[182713]: 2026-01-21 23:58:01.770 182717 DEBUG nova.network.neutron [req-e8402705-62a3-49a1-b2c2-3cd9131cc433 req-1159d968-9855-44bf-8d13-f0eddcf76a95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updated VIF entry in instance network info cache for port 63932621-f0d1-4f08-8ce5-b5fa120bcc62. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:58:01 compute-1 nova_compute[182713]: 2026-01-21 23:58:01.771 182717 DEBUG nova.network.neutron [req-e8402705-62a3-49a1-b2c2-3cd9131cc433 req-1159d968-9855-44bf-8d13-f0eddcf76a95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:01 compute-1 nova_compute[182713]: 2026-01-21 23:58:01.794 182717 DEBUG oslo_concurrency.lockutils [req-e8402705-62a3-49a1-b2c2-3cd9131cc433 req-1159d968-9855-44bf-8d13-f0eddcf76a95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:01 compute-1 nova_compute[182713]: 2026-01-21 23:58:01.854 182717 DEBUG oslo_concurrency.lockutils [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:01 compute-1 nova_compute[182713]: 2026-01-21 23:58:01.855 182717 DEBUG oslo_concurrency.lockutils [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:01 compute-1 nova_compute[182713]: 2026-01-21 23:58:01.855 182717 DEBUG nova.objects.instance [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'flavor' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.071 182717 DEBUG nova.network.neutron [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Successfully updated port: 0d50f3fc-9e12-4e56-ba52-98ff14988caa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.089 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Acquiring lock "refresh_cache-337d88a9-a34b-4c90-bf0d-0418533ae52d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.090 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Acquired lock "refresh_cache-337d88a9-a34b-4c90-bf0d-0418533ae52d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.090 182717 DEBUG nova.network.neutron [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.268 182717 DEBUG nova.network.neutron [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.271 182717 DEBUG nova.objects.instance [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'pci_requests' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.288 182717 DEBUG nova.network.neutron [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.464 182717 DEBUG nova.compute.manager [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-plugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.464 182717 DEBUG oslo_concurrency.lockutils [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.465 182717 DEBUG oslo_concurrency.lockutils [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.465 182717 DEBUG oslo_concurrency.lockutils [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.466 182717 DEBUG nova.compute.manager [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-plugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.466 182717 WARNING nova.compute.manager [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received unexpected event network-vif-plugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 for instance with vm_state active and task_state None.
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.466 182717 DEBUG nova.compute.manager [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-plugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.467 182717 DEBUG oslo_concurrency.lockutils [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.467 182717 DEBUG oslo_concurrency.lockutils [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.468 182717 DEBUG oslo_concurrency.lockutils [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.468 182717 DEBUG nova.compute.manager [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-plugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.469 182717 WARNING nova.compute.manager [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received unexpected event network-vif-plugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 for instance with vm_state active and task_state None.
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.504 182717 DEBUG nova.policy [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.768 182717 DEBUG nova.compute.manager [req-5e634f85-916a-4064-8a4a-87548d96e9a0 req-dfae0201-0b38-4f7c-b2c4-4091d4d51abd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Received event network-changed-0d50f3fc-9e12-4e56-ba52-98ff14988caa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.768 182717 DEBUG nova.compute.manager [req-5e634f85-916a-4064-8a4a-87548d96e9a0 req-dfae0201-0b38-4f7c-b2c4-4091d4d51abd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Refreshing instance network info cache due to event network-changed-0d50f3fc-9e12-4e56-ba52-98ff14988caa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:58:02 compute-1 nova_compute[182713]: 2026-01-21 23:58:02.768 182717 DEBUG oslo_concurrency.lockutils [req-5e634f85-916a-4064-8a4a-87548d96e9a0 req-dfae0201-0b38-4f7c-b2c4-4091d4d51abd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-337d88a9-a34b-4c90-bf0d-0418533ae52d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:03.005 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:03.005 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:03.007 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.555 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:04.867 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:04 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:04.870 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.872 182717 DEBUG nova.network.neutron [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Updating instance_info_cache with network_info: [{"id": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "address": "fa:16:3e:38:24:84", "network": {"id": "58dabcb0-4997-48a1-816b-d257b8a0a2a6", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-94429612-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40a322b32cda438b83f33ec51a9007dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d50f3fc-9e", "ovs_interfaceid": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.904 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Releasing lock "refresh_cache-337d88a9-a34b-4c90-bf0d-0418533ae52d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.904 182717 DEBUG nova.compute.manager [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Instance network_info: |[{"id": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "address": "fa:16:3e:38:24:84", "network": {"id": "58dabcb0-4997-48a1-816b-d257b8a0a2a6", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-94429612-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40a322b32cda438b83f33ec51a9007dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d50f3fc-9e", "ovs_interfaceid": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.906 182717 DEBUG oslo_concurrency.lockutils [req-5e634f85-916a-4064-8a4a-87548d96e9a0 req-dfae0201-0b38-4f7c-b2c4-4091d4d51abd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-337d88a9-a34b-4c90-bf0d-0418533ae52d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.906 182717 DEBUG nova.network.neutron [req-5e634f85-916a-4064-8a4a-87548d96e9a0 req-dfae0201-0b38-4f7c-b2c4-4091d4d51abd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Refreshing network info cache for port 0d50f3fc-9e12-4e56-ba52-98ff14988caa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.911 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Start _get_guest_xml network_info=[{"id": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "address": "fa:16:3e:38:24:84", "network": {"id": "58dabcb0-4997-48a1-816b-d257b8a0a2a6", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-94429612-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40a322b32cda438b83f33ec51a9007dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d50f3fc-9e", "ovs_interfaceid": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.918 182717 WARNING nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.924 182717 DEBUG nova.virt.libvirt.host [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.925 182717 DEBUG nova.virt.libvirt.host [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.936 182717 DEBUG nova.virt.libvirt.host [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.937 182717 DEBUG nova.virt.libvirt.host [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.939 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.940 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.940 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.941 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.941 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.942 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.942 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.943 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.943 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.944 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.944 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.945 182717 DEBUG nova.virt.hardware [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.951 182717 DEBUG nova.virt.libvirt.vif [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=73,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJmcUUCRNBDC7pEUz/tpS2M2Bh7QKoMuCvSGq0OlhfgBB2gHHm0vZwJ3C6s56OZFiBK7PSn211goCBxjEIAa7rURlcZpYO0vnVohce5WvMJCsd1LTVs/VNJbZ3GaVOQFkg==',key_name='tempest-keypair-2098695703',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='40a322b32cda438b83f33ec51a9007dc',ramdisk_id='',reservation_id='r-mpbb2z0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-36171568',owner_user_name='tempest-ServersV294TestFqdnHostnames-36171568-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:57:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d31ef0e2c7354f47adb7b7f072c28fae',uuid=337d88a9-a34b-4c90-bf0d-0418533ae52d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "address": "fa:16:3e:38:24:84", "network": {"id": "58dabcb0-4997-48a1-816b-d257b8a0a2a6", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-94429612-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40a322b32cda438b83f33ec51a9007dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d50f3fc-9e", "ovs_interfaceid": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.951 182717 DEBUG nova.network.os_vif_util [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Converting VIF {"id": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "address": "fa:16:3e:38:24:84", "network": {"id": "58dabcb0-4997-48a1-816b-d257b8a0a2a6", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-94429612-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40a322b32cda438b83f33ec51a9007dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d50f3fc-9e", "ovs_interfaceid": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.953 182717 DEBUG nova.network.os_vif_util [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:24:84,bridge_name='br-int',has_traffic_filtering=True,id=0d50f3fc-9e12-4e56-ba52-98ff14988caa,network=Network(58dabcb0-4997-48a1-816b-d257b8a0a2a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d50f3fc-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.954 182717 DEBUG nova.objects.instance [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lazy-loading 'pci_devices' on Instance uuid 337d88a9-a34b-4c90-bf0d-0418533ae52d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.970 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:58:04 compute-1 nova_compute[182713]:   <uuid>337d88a9-a34b-4c90-bf0d-0418533ae52d</uuid>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   <name>instance-00000049</name>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <nova:name>guest-instance-1</nova:name>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:58:04</nova:creationTime>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:58:04 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:58:04 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:58:04 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:58:04 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:04 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:58:04 compute-1 nova_compute[182713]:         <nova:user uuid="d31ef0e2c7354f47adb7b7f072c28fae">tempest-ServersV294TestFqdnHostnames-36171568-project-member</nova:user>
Jan 21 23:58:04 compute-1 nova_compute[182713]:         <nova:project uuid="40a322b32cda438b83f33ec51a9007dc">tempest-ServersV294TestFqdnHostnames-36171568</nova:project>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:58:04 compute-1 nova_compute[182713]:         <nova:port uuid="0d50f3fc-9e12-4e56-ba52-98ff14988caa">
Jan 21 23:58:04 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <system>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <entry name="serial">337d88a9-a34b-4c90-bf0d-0418533ae52d</entry>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <entry name="uuid">337d88a9-a34b-4c90-bf0d-0418533ae52d</entry>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     </system>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   <os>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   </os>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   <features>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   </features>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.config"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:38:24:84"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <target dev="tap0d50f3fc-9e"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/console.log" append="off"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <video>
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     </video>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:58:04 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:58:04 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:58:04 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:58:04 compute-1 nova_compute[182713]: </domain>
Jan 21 23:58:04 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.972 182717 DEBUG nova.compute.manager [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Preparing to wait for external event network-vif-plugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.973 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Acquiring lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.973 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.974 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.977 182717 DEBUG nova.virt.libvirt.vif [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=73,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJmcUUCRNBDC7pEUz/tpS2M2Bh7QKoMuCvSGq0OlhfgBB2gHHm0vZwJ3C6s56OZFiBK7PSn211goCBxjEIAa7rURlcZpYO0vnVohce5WvMJCsd1LTVs/VNJbZ3GaVOQFkg==',key_name='tempest-keypair-2098695703',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='40a322b32cda438b83f33ec51a9007dc',ramdisk_id='',reservation_id='r-mpbb2z0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-36171568',owner_user_name='tempest-ServersV294TestFqdnHostnames-36171568-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:57:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d31ef0e2c7354f47adb7b7f072c28fae',uuid=337d88a9-a34b-4c90-bf0d-0418533ae52d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "address": "fa:16:3e:38:24:84", "network": {"id": "58dabcb0-4997-48a1-816b-d257b8a0a2a6", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-94429612-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40a322b32cda438b83f33ec51a9007dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d50f3fc-9e", "ovs_interfaceid": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.977 182717 DEBUG nova.network.os_vif_util [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Converting VIF {"id": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "address": "fa:16:3e:38:24:84", "network": {"id": "58dabcb0-4997-48a1-816b-d257b8a0a2a6", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-94429612-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40a322b32cda438b83f33ec51a9007dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d50f3fc-9e", "ovs_interfaceid": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.978 182717 DEBUG nova.network.os_vif_util [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:24:84,bridge_name='br-int',has_traffic_filtering=True,id=0d50f3fc-9e12-4e56-ba52-98ff14988caa,network=Network(58dabcb0-4997-48a1-816b-d257b8a0a2a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d50f3fc-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.979 182717 DEBUG os_vif [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:24:84,bridge_name='br-int',has_traffic_filtering=True,id=0d50f3fc-9e12-4e56-ba52-98ff14988caa,network=Network(58dabcb0-4997-48a1-816b-d257b8a0a2a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d50f3fc-9e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.980 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.980 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.981 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.986 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.986 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d50f3fc-9e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.987 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0d50f3fc-9e, col_values=(('external_ids', {'iface-id': '0d50f3fc-9e12-4e56-ba52-98ff14988caa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:24:84', 'vm-uuid': '337d88a9-a34b-4c90-bf0d-0418533ae52d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.989 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:04 compute-1 NetworkManager[54952]: <info>  [1769039884.9910] manager: (tap0d50f3fc-9e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.993 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:58:04 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.997 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:04.999 182717 INFO os_vif [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:24:84,bridge_name='br-int',has_traffic_filtering=True,id=0d50f3fc-9e12-4e56-ba52-98ff14988caa,network=Network(58dabcb0-4997-48a1-816b-d257b8a0a2a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d50f3fc-9e')
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:05.063 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:05.064 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:05.064 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] No VIF found with MAC fa:16:3e:38:24:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:05.065 182717 INFO nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Using config drive
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:05.437 182717 DEBUG nova.network.neutron [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Successfully created port: da88332f-f709-4521-a7e5-faca686cf825 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:05.549 182717 INFO nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Creating config drive at /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.config
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:05.581 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf22ay4bt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:05 compute-1 podman[221186]: 2026-01-21 23:58:05.622674766 +0000 UTC m=+0.088406744 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:58:05 compute-1 podman[221185]: 2026-01-21 23:58:05.650992337 +0000 UTC m=+0.122685679 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:05.719 182717 DEBUG oslo_concurrency.processutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf22ay4bt" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:05 compute-1 kernel: tap0d50f3fc-9e: entered promiscuous mode
Jan 21 23:58:05 compute-1 NetworkManager[54952]: <info>  [1769039885.8094] manager: (tap0d50f3fc-9e): new Tun device (/org/freedesktop/NetworkManager/Devices/119)
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:05.808 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:05 compute-1 ovn_controller[94841]: 2026-01-21T23:58:05Z|00240|binding|INFO|Claiming lport 0d50f3fc-9e12-4e56-ba52-98ff14988caa for this chassis.
Jan 21 23:58:05 compute-1 ovn_controller[94841]: 2026-01-21T23:58:05Z|00241|binding|INFO|0d50f3fc-9e12-4e56-ba52-98ff14988caa: Claiming fa:16:3e:38:24:84 10.100.0.10
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.819 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:24:84 10.100.0.10'], port_security=['fa:16:3e:38:24:84 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58dabcb0-4997-48a1-816b-d257b8a0a2a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40a322b32cda438b83f33ec51a9007dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '992f086f-c626-4197-b1cd-eb7d61dff758', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4acf22b-ed26-4bae-bf49-1694ec7b3765, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=0d50f3fc-9e12-4e56-ba52-98ff14988caa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.820 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 0d50f3fc-9e12-4e56-ba52-98ff14988caa in datapath 58dabcb0-4997-48a1-816b-d257b8a0a2a6 bound to our chassis
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.824 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58dabcb0-4997-48a1-816b-d257b8a0a2a6
Jan 21 23:58:05 compute-1 ovn_controller[94841]: 2026-01-21T23:58:05Z|00242|binding|INFO|Setting lport 0d50f3fc-9e12-4e56-ba52-98ff14988caa up in Southbound
Jan 21 23:58:05 compute-1 ovn_controller[94841]: 2026-01-21T23:58:05Z|00243|binding|INFO|Setting lport 0d50f3fc-9e12-4e56-ba52-98ff14988caa ovn-installed in OVS
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:05.836 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:05 compute-1 nova_compute[182713]: 2026-01-21 23:58:05.841 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.843 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[af13a538-250f-4fad-8705-8842c6a6058d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.845 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58dabcb0-41 in ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.847 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58dabcb0-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.847 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b245974f-8a00-4f3e-9fce-83f9a40cac85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.849 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[578a492d-03af-4fc4-96b3-2ee0542d7f00]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-1 systemd-udevd[221252]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.863 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[01c8fead-33e2-4713-9829-4a731105fbdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-1 systemd-machined[153970]: New machine qemu-34-instance-00000049.
Jan 21 23:58:05 compute-1 NetworkManager[54952]: <info>  [1769039885.8794] device (tap0d50f3fc-9e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:58:05 compute-1 NetworkManager[54952]: <info>  [1769039885.8811] device (tap0d50f3fc-9e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:58:05 compute-1 systemd[1]: Started Virtual Machine qemu-34-instance-00000049.
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.890 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd85d55-002a-4e26-ad16-cbf857f14e01]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.925 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e26591f5-6e2b-47b3-a0f4-7f5d70cee8de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.931 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ef5f543d-5609-4131-8459-20bf93af43b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-1 NetworkManager[54952]: <info>  [1769039885.9325] manager: (tap58dabcb0-40): new Veth device (/org/freedesktop/NetworkManager/Devices/120)
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.973 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa99140-23c5-45fb-8f9d-29a2d8944715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:05.979 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f706e6db-be49-496a-9506-f791c4697739]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:06 compute-1 NetworkManager[54952]: <info>  [1769039886.0118] device (tap58dabcb0-40): carrier: link connected
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.019 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0f6bb168-bd3b-40ee-9186-8d24f971fbe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.040 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[052ab0cb-9ea3-43af-a93c-273d24a6d8f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58dabcb0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:71:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445866, 'reachable_time': 38700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221284, 'error': None, 'target': 'ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.059 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[44d5daa7-ba89-4216-937d-0e29c8c4821a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:71b8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445866, 'tstamp': 445866}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221285, 'error': None, 'target': 'ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.079 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[dcc78a4e-a9ec-474b-a3a8-c7aebf217825]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58dabcb0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:71:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445866, 'reachable_time': 38700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221286, 'error': None, 'target': 'ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.125 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c696d7-1f96-42f8-b11c-707ef1008241]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:06 compute-1 nova_compute[182713]: 2026-01-21 23:58:06.156 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.206 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d61314-60b9-4d04-9c4c-9db592207371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.209 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58dabcb0-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.209 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.210 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58dabcb0-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:06 compute-1 kernel: tap58dabcb0-40: entered promiscuous mode
Jan 21 23:58:06 compute-1 NetworkManager[54952]: <info>  [1769039886.2199] manager: (tap58dabcb0-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.221 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58dabcb0-40, col_values=(('external_ids', {'iface-id': '75b8444c-8fe2-4d68-af37-bf230a93a3f7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:06 compute-1 ovn_controller[94841]: 2026-01-21T23:58:06Z|00244|binding|INFO|Releasing lport 75b8444c-8fe2-4d68-af37-bf230a93a3f7 from this chassis (sb_readonly=0)
Jan 21 23:58:06 compute-1 nova_compute[182713]: 2026-01-21 23:58:06.223 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.229 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58dabcb0-4997-48a1-816b-d257b8a0a2a6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58dabcb0-4997-48a1-816b-d257b8a0a2a6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.230 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0218d59e-ddf1-41d9-9652-f8f13af6fc3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.231 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-58dabcb0-4997-48a1-816b-d257b8a0a2a6
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/58dabcb0-4997-48a1-816b-d257b8a0a2a6.pid.haproxy
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 58dabcb0-4997-48a1-816b-d257b8a0a2a6
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:58:06 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:06.232 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6', 'env', 'PROCESS_TAG=haproxy-58dabcb0-4997-48a1-816b-d257b8a0a2a6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58dabcb0-4997-48a1-816b-d257b8a0a2a6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:58:06 compute-1 nova_compute[182713]: 2026-01-21 23:58:06.227 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:06 compute-1 nova_compute[182713]: 2026-01-21 23:58:06.238 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:06 compute-1 podman[221318]: 2026-01-21 23:58:06.696251147 +0000 UTC m=+0.091321683 container create 0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 21 23:58:06 compute-1 systemd[1]: Started libpod-conmon-0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01.scope.
Jan 21 23:58:06 compute-1 podman[221318]: 2026-01-21 23:58:06.655510643 +0000 UTC m=+0.050581219 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:58:06 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:58:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3aa7ce989211752149f65cf59feadfa20c51f09212ade832ca01b72728ee9a03/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:58:06 compute-1 podman[221318]: 2026-01-21 23:58:06.791662615 +0000 UTC m=+0.186733201 container init 0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 23:58:06 compute-1 podman[221318]: 2026-01-21 23:58:06.796498944 +0000 UTC m=+0.191569480 container start 0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:58:06 compute-1 nova_compute[182713]: 2026-01-21 23:58:06.801 182717 DEBUG nova.network.neutron [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Successfully updated port: da88332f-f709-4521-a7e5-faca686cf825 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:58:06 compute-1 nova_compute[182713]: 2026-01-21 23:58:06.820 182717 DEBUG oslo_concurrency.lockutils [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:06 compute-1 nova_compute[182713]: 2026-01-21 23:58:06.820 182717 DEBUG oslo_concurrency.lockutils [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:06 compute-1 nova_compute[182713]: 2026-01-21 23:58:06.820 182717 DEBUG nova.network.neutron [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:58:06 compute-1 nova_compute[182713]: 2026-01-21 23:58:06.825 182717 DEBUG nova.network.neutron [req-5e634f85-916a-4064-8a4a-87548d96e9a0 req-dfae0201-0b38-4f7c-b2c4-4091d4d51abd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Updated VIF entry in instance network info cache for port 0d50f3fc-9e12-4e56-ba52-98ff14988caa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:58:06 compute-1 nova_compute[182713]: 2026-01-21 23:58:06.826 182717 DEBUG nova.network.neutron [req-5e634f85-916a-4064-8a4a-87548d96e9a0 req-dfae0201-0b38-4f7c-b2c4-4091d4d51abd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Updating instance_info_cache with network_info: [{"id": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "address": "fa:16:3e:38:24:84", "network": {"id": "58dabcb0-4997-48a1-816b-d257b8a0a2a6", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-94429612-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40a322b32cda438b83f33ec51a9007dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d50f3fc-9e", "ovs_interfaceid": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:06 compute-1 neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6[221333]: [NOTICE]   (221337) : New worker (221339) forked
Jan 21 23:58:06 compute-1 neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6[221333]: [NOTICE]   (221337) : Loading success.
Jan 21 23:58:06 compute-1 nova_compute[182713]: 2026-01-21 23:58:06.844 182717 DEBUG oslo_concurrency.lockutils [req-5e634f85-916a-4064-8a4a-87548d96e9a0 req-dfae0201-0b38-4f7c-b2c4-4091d4d51abd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-337d88a9-a34b-4c90-bf0d-0418533ae52d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.044 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039887.0439928, 337d88a9-a34b-4c90-bf0d-0418533ae52d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.044 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] VM Started (Lifecycle Event)
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.067 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.071 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039887.0469937, 337d88a9-a34b-4c90-bf0d-0418533ae52d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.071 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] VM Paused (Lifecycle Event)
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.088 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.091 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.115 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.420 182717 WARNING nova.network.neutron [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] 1995baab-0f8d-4658-a4fc-2d21868dc592 already exists in list: networks containing: ['1995baab-0f8d-4658-a4fc-2d21868dc592']. ignoring it
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.421 182717 WARNING nova.network.neutron [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] 1995baab-0f8d-4658-a4fc-2d21868dc592 already exists in list: networks containing: ['1995baab-0f8d-4658-a4fc-2d21868dc592']. ignoring it
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.888 182717 DEBUG nova.compute.manager [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Received event network-vif-plugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.889 182717 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.889 182717 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.889 182717 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.889 182717 DEBUG nova.compute.manager [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Processing event network-vif-plugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.890 182717 DEBUG nova.compute.manager [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Received event network-vif-plugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.890 182717 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.890 182717 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.890 182717 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.890 182717 DEBUG nova.compute.manager [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] No waiting events found dispatching network-vif-plugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.891 182717 WARNING nova.compute.manager [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Received unexpected event network-vif-plugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa for instance with vm_state building and task_state spawning.
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.891 182717 DEBUG nova.compute.manager [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-changed-da88332f-f709-4521-a7e5-faca686cf825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.891 182717 DEBUG nova.compute.manager [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Refreshing instance network info cache due to event network-changed-da88332f-f709-4521-a7e5-faca686cf825. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.891 182717 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.891 182717 DEBUG nova.compute.manager [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.895 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039887.8956525, 337d88a9-a34b-4c90-bf0d-0418533ae52d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.896 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] VM Resumed (Lifecycle Event)
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.898 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.902 182717 INFO nova.virt.libvirt.driver [-] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Instance spawned successfully.
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.902 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.940 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.956 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.962 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.963 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.963 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.964 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.964 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:07 compute-1 nova_compute[182713]: 2026-01-21 23:58:07.964 182717 DEBUG nova.virt.libvirt.driver [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:08 compute-1 nova_compute[182713]: 2026-01-21 23:58:08.005 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:58:08 compute-1 nova_compute[182713]: 2026-01-21 23:58:08.063 182717 INFO nova.compute.manager [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Took 9.30 seconds to spawn the instance on the hypervisor.
Jan 21 23:58:08 compute-1 nova_compute[182713]: 2026-01-21 23:58:08.064 182717 DEBUG nova.compute.manager [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:08 compute-1 nova_compute[182713]: 2026-01-21 23:58:08.155 182717 INFO nova.compute.manager [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Took 10.11 seconds to build instance.
Jan 21 23:58:08 compute-1 nova_compute[182713]: 2026-01-21 23:58:08.182 182717 DEBUG oslo_concurrency.lockutils [None req-b4b0d19c-780a-44cf-a0dd-0d0926b7e4a3 d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:09 compute-1 nova_compute[182713]: 2026-01-21 23:58:09.991 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.159 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.612 182717 DEBUG nova.network.neutron [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.659 182717 DEBUG oslo_concurrency.lockutils [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.661 182717 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.662 182717 DEBUG nova.network.neutron [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Refreshing network info cache for port da88332f-f709-4521-a7e5-faca686cf825 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.666 182717 DEBUG nova.virt.libvirt.vif [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.667 182717 DEBUG nova.network.os_vif_util [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.669 182717 DEBUG nova.network.os_vif_util [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:0a:be,bridge_name='br-int',has_traffic_filtering=True,id=da88332f-f709-4521-a7e5-faca686cf825,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda88332f-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.670 182717 DEBUG os_vif [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:0a:be,bridge_name='br-int',has_traffic_filtering=True,id=da88332f-f709-4521-a7e5-faca686cf825,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda88332f-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.671 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.671 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.672 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.683 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.683 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda88332f-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.684 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapda88332f-f7, col_values=(('external_ids', {'iface-id': 'da88332f-f709-4521-a7e5-faca686cf825', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:0a:be', 'vm-uuid': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:11 compute-1 NetworkManager[54952]: <info>  [1769039891.6905] manager: (tapda88332f-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.690 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.697 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.699 182717 INFO os_vif [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:0a:be,bridge_name='br-int',has_traffic_filtering=True,id=da88332f-f709-4521-a7e5-faca686cf825,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda88332f-f7')
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.701 182717 DEBUG nova.virt.libvirt.vif [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.702 182717 DEBUG nova.network.os_vif_util [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.703 182717 DEBUG nova.network.os_vif_util [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:0a:be,bridge_name='br-int',has_traffic_filtering=True,id=da88332f-f709-4521-a7e5-faca686cf825,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda88332f-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.707 182717 DEBUG nova.virt.libvirt.guest [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] attach device xml: <interface type="ethernet">
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:92:0a:be"/>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <target dev="tapda88332f-f7"/>
Jan 21 23:58:11 compute-1 nova_compute[182713]: </interface>
Jan 21 23:58:11 compute-1 nova_compute[182713]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 21 23:58:11 compute-1 kernel: tapda88332f-f7: entered promiscuous mode
Jan 21 23:58:11 compute-1 ovn_controller[94841]: 2026-01-21T23:58:11Z|00245|binding|INFO|Claiming lport da88332f-f709-4521-a7e5-faca686cf825 for this chassis.
Jan 21 23:58:11 compute-1 ovn_controller[94841]: 2026-01-21T23:58:11Z|00246|binding|INFO|da88332f-f709-4521-a7e5-faca686cf825: Claiming fa:16:3e:92:0a:be 10.100.0.6
Jan 21 23:58:11 compute-1 NetworkManager[54952]: <info>  [1769039891.7295] manager: (tapda88332f-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/123)
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.736 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.741 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:0a:be 10.100.0.6'], port_security=['fa:16:3e:92:0a:be 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '2', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=da88332f-f709-4521-a7e5-faca686cf825) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.744 104184 INFO neutron.agent.ovn.metadata.agent [-] Port da88332f-f709-4521-a7e5-faca686cf825 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 bound to our chassis
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.748 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:58:11 compute-1 ovn_controller[94841]: 2026-01-21T23:58:11Z|00247|binding|INFO|Setting lport da88332f-f709-4521-a7e5-faca686cf825 ovn-installed in OVS
Jan 21 23:58:11 compute-1 ovn_controller[94841]: 2026-01-21T23:58:11Z|00248|binding|INFO|Setting lport da88332f-f709-4521-a7e5-faca686cf825 up in Southbound
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.759 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.783 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d49df5e3-627e-45ab-a5ba-803337d2d63e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-1 systemd-udevd[221386]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.838 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[dc2763bb-2176-44cb-8aee-6da2ba04acf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.843 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[008b9709-eebc-4582-9af3-55d01fdee3b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-1 NetworkManager[54952]: <info>  [1769039891.8443] device (tapda88332f-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:58:11 compute-1 NetworkManager[54952]: <info>  [1769039891.8452] device (tapda88332f-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.861 182717 DEBUG nova.virt.libvirt.driver [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.861 182717 DEBUG nova.virt.libvirt.driver [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.861 182717 DEBUG nova.virt.libvirt.driver [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:3e:bc:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.862 182717 DEBUG nova.virt.libvirt.driver [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:30:c6:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.862 182717 DEBUG nova.virt.libvirt.driver [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:92:0a:be, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:58:11 compute-1 podman[221362]: 2026-01-21 23:58:11.864193636 +0000 UTC m=+0.093971355 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.882 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5cf0d7-f1d9-45a0-978b-2af4c7a609c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.890 182717 DEBUG nova.virt.libvirt.guest [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:58:11</nova:creationTime>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:58:11 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:58:11 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     <nova:port uuid="63932621-f0d1-4f08-8ce5-b5fa120bcc62">
Jan 21 23:58:11 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     <nova:port uuid="da88332f-f709-4521-a7e5-faca686cf825">
Jan 21 23:58:11 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:58:11 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:11 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:58:11 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:58:11 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.902 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[62f7a3b8-8088-4f8b-8d51-8ca4cfa3ac0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441880, 'reachable_time': 27223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221408, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-1 podman[221361]: 2026-01-21 23:58:11.927098503 +0000 UTC m=+0.155274983 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.919 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ebeef6-15a7-4cba-b926-c1e5227e0698]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441895, 'tstamp': 441895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221411, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441899, 'tstamp': 441899}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221411, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.929 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.934 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.934 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.935 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.935 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:11 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:11.936 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:11 compute-1 nova_compute[182713]: 2026-01-21 23:58:11.940 182717 DEBUG oslo_concurrency.lockutils [None req-5f5259a4-0578-4d36-993a-4c38ee3abaf5 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 10.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:12 compute-1 nova_compute[182713]: 2026-01-21 23:58:12.186 182717 DEBUG nova.compute.manager [req-c800cc9b-4b95-45e0-a6e9-3fb90d9e9ed9 req-b2b3d8da-3f59-4349-b391-1752d21d3151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-plugged-da88332f-f709-4521-a7e5-faca686cf825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:12 compute-1 nova_compute[182713]: 2026-01-21 23:58:12.187 182717 DEBUG oslo_concurrency.lockutils [req-c800cc9b-4b95-45e0-a6e9-3fb90d9e9ed9 req-b2b3d8da-3f59-4349-b391-1752d21d3151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:12 compute-1 nova_compute[182713]: 2026-01-21 23:58:12.187 182717 DEBUG oslo_concurrency.lockutils [req-c800cc9b-4b95-45e0-a6e9-3fb90d9e9ed9 req-b2b3d8da-3f59-4349-b391-1752d21d3151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:12 compute-1 nova_compute[182713]: 2026-01-21 23:58:12.188 182717 DEBUG oslo_concurrency.lockutils [req-c800cc9b-4b95-45e0-a6e9-3fb90d9e9ed9 req-b2b3d8da-3f59-4349-b391-1752d21d3151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:12 compute-1 nova_compute[182713]: 2026-01-21 23:58:12.188 182717 DEBUG nova.compute.manager [req-c800cc9b-4b95-45e0-a6e9-3fb90d9e9ed9 req-b2b3d8da-3f59-4349-b391-1752d21d3151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-plugged-da88332f-f709-4521-a7e5-faca686cf825 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:12 compute-1 nova_compute[182713]: 2026-01-21 23:58:12.188 182717 WARNING nova.compute.manager [req-c800cc9b-4b95-45e0-a6e9-3fb90d9e9ed9 req-b2b3d8da-3f59-4349-b391-1752d21d3151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received unexpected event network-vif-plugged-da88332f-f709-4521-a7e5-faca686cf825 for instance with vm_state active and task_state None.
Jan 21 23:58:12 compute-1 nova_compute[182713]: 2026-01-21 23:58:12.286 182717 DEBUG nova.compute.manager [req-ae6186e1-2fe6-403e-8038-cf8a4077990a req-7ce7cf90-a953-425c-b969-ab06f5ddf3d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Received event network-changed-0d50f3fc-9e12-4e56-ba52-98ff14988caa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:12 compute-1 nova_compute[182713]: 2026-01-21 23:58:12.287 182717 DEBUG nova.compute.manager [req-ae6186e1-2fe6-403e-8038-cf8a4077990a req-7ce7cf90-a953-425c-b969-ab06f5ddf3d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Refreshing instance network info cache due to event network-changed-0d50f3fc-9e12-4e56-ba52-98ff14988caa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:58:12 compute-1 nova_compute[182713]: 2026-01-21 23:58:12.287 182717 DEBUG oslo_concurrency.lockutils [req-ae6186e1-2fe6-403e-8038-cf8a4077990a req-7ce7cf90-a953-425c-b969-ab06f5ddf3d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-337d88a9-a34b-4c90-bf0d-0418533ae52d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:12 compute-1 nova_compute[182713]: 2026-01-21 23:58:12.288 182717 DEBUG oslo_concurrency.lockutils [req-ae6186e1-2fe6-403e-8038-cf8a4077990a req-7ce7cf90-a953-425c-b969-ab06f5ddf3d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-337d88a9-a34b-4c90-bf0d-0418533ae52d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:12 compute-1 nova_compute[182713]: 2026-01-21 23:58:12.288 182717 DEBUG nova.network.neutron [req-ae6186e1-2fe6-403e-8038-cf8a4077990a req-7ce7cf90-a953-425c-b969-ab06f5ddf3d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Refreshing network info cache for port 0d50f3fc-9e12-4e56-ba52-98ff14988caa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.234 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.263 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Triggering sync for uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.263 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Triggering sync for uuid 337d88a9-a34b-4c90-bf0d-0418533ae52d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.264 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.264 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.265 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "337d88a9-a34b-4c90-bf0d-0418533ae52d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.265 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.296 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.299 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.752 182717 DEBUG nova.network.neutron [req-ae6186e1-2fe6-403e-8038-cf8a4077990a req-7ce7cf90-a953-425c-b969-ab06f5ddf3d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Updated VIF entry in instance network info cache for port 0d50f3fc-9e12-4e56-ba52-98ff14988caa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.753 182717 DEBUG nova.network.neutron [req-ae6186e1-2fe6-403e-8038-cf8a4077990a req-7ce7cf90-a953-425c-b969-ab06f5ddf3d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Updating instance_info_cache with network_info: [{"id": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "address": "fa:16:3e:38:24:84", "network": {"id": "58dabcb0-4997-48a1-816b-d257b8a0a2a6", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-94429612-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40a322b32cda438b83f33ec51a9007dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d50f3fc-9e", "ovs_interfaceid": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:13 compute-1 nova_compute[182713]: 2026-01-21 23:58:13.797 182717 DEBUG oslo_concurrency.lockutils [req-ae6186e1-2fe6-403e-8038-cf8a4077990a req-7ce7cf90-a953-425c-b969-ab06f5ddf3d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-337d88a9-a34b-4c90-bf0d-0418533ae52d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:13 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:13.873 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:13 compute-1 ovn_controller[94841]: 2026-01-21T23:58:13Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:92:0a:be 10.100.0.6
Jan 21 23:58:13 compute-1 ovn_controller[94841]: 2026-01-21T23:58:13Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:92:0a:be 10.100.0.6
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.001 182717 DEBUG nova.network.neutron [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updated VIF entry in instance network info cache for port da88332f-f709-4521-a7e5-faca686cf825. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.002 182717 DEBUG nova.network.neutron [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.029 182717 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.294 182717 DEBUG nova.compute.manager [req-e0a54199-db4b-4d42-a691-f44ce286ece2 req-7b3e0fa1-7cdf-4564-9465-2c2e5905a1c8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-plugged-da88332f-f709-4521-a7e5-faca686cf825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.295 182717 DEBUG oslo_concurrency.lockutils [req-e0a54199-db4b-4d42-a691-f44ce286ece2 req-7b3e0fa1-7cdf-4564-9465-2c2e5905a1c8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.295 182717 DEBUG oslo_concurrency.lockutils [req-e0a54199-db4b-4d42-a691-f44ce286ece2 req-7b3e0fa1-7cdf-4564-9465-2c2e5905a1c8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.295 182717 DEBUG oslo_concurrency.lockutils [req-e0a54199-db4b-4d42-a691-f44ce286ece2 req-7b3e0fa1-7cdf-4564-9465-2c2e5905a1c8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.295 182717 DEBUG nova.compute.manager [req-e0a54199-db4b-4d42-a691-f44ce286ece2 req-7b3e0fa1-7cdf-4564-9465-2c2e5905a1c8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-plugged-da88332f-f709-4521-a7e5-faca686cf825 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.296 182717 WARNING nova.compute.manager [req-e0a54199-db4b-4d42-a691-f44ce286ece2 req-7b3e0fa1-7cdf-4564-9465-2c2e5905a1c8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received unexpected event network-vif-plugged-da88332f-f709-4521-a7e5-faca686cf825 for instance with vm_state active and task_state None.
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.606 182717 DEBUG oslo_concurrency.lockutils [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-ebbb51e0-ecfc-404f-a578-681300a57aa8" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.607 182717 DEBUG oslo_concurrency.lockutils [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-ebbb51e0-ecfc-404f-a578-681300a57aa8" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:14 compute-1 nova_compute[182713]: 2026-01-21 23:58:14.608 182717 DEBUG nova.objects.instance [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'flavor' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:15 compute-1 nova_compute[182713]: 2026-01-21 23:58:15.442 182717 DEBUG nova.objects.instance [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'pci_requests' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:15 compute-1 nova_compute[182713]: 2026-01-21 23:58:15.482 182717 DEBUG nova.network.neutron [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:58:15 compute-1 nova_compute[182713]: 2026-01-21 23:58:15.893 182717 DEBUG nova.policy [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:58:16 compute-1 nova_compute[182713]: 2026-01-21 23:58:16.163 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:16 compute-1 nova_compute[182713]: 2026-01-21 23:58:16.687 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:16 compute-1 nova_compute[182713]: 2026-01-21 23:58:16.849 182717 DEBUG nova.network.neutron [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Successfully updated port: ebbb51e0-ecfc-404f-a578-681300a57aa8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:58:16 compute-1 nova_compute[182713]: 2026-01-21 23:58:16.874 182717 DEBUG oslo_concurrency.lockutils [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:16 compute-1 nova_compute[182713]: 2026-01-21 23:58:16.874 182717 DEBUG oslo_concurrency.lockutils [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:16 compute-1 nova_compute[182713]: 2026-01-21 23:58:16.875 182717 DEBUG nova.network.neutron [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:58:16 compute-1 nova_compute[182713]: 2026-01-21 23:58:16.964 182717 DEBUG nova.compute.manager [req-370b2cb8-2c73-41b9-b9b5-69346d1e554c req-710a2ba9-c709-46a9-9a4c-1a30ce32c80a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-changed-ebbb51e0-ecfc-404f-a578-681300a57aa8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:16 compute-1 nova_compute[182713]: 2026-01-21 23:58:16.965 182717 DEBUG nova.compute.manager [req-370b2cb8-2c73-41b9-b9b5-69346d1e554c req-710a2ba9-c709-46a9-9a4c-1a30ce32c80a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Refreshing instance network info cache due to event network-changed-ebbb51e0-ecfc-404f-a578-681300a57aa8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:58:16 compute-1 nova_compute[182713]: 2026-01-21 23:58:16.965 182717 DEBUG oslo_concurrency.lockutils [req-370b2cb8-2c73-41b9-b9b5-69346d1e554c req-710a2ba9-c709-46a9-9a4c-1a30ce32c80a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:17 compute-1 nova_compute[182713]: 2026-01-21 23:58:17.159 182717 WARNING nova.network.neutron [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] 1995baab-0f8d-4658-a4fc-2d21868dc592 already exists in list: networks containing: ['1995baab-0f8d-4658-a4fc-2d21868dc592']. ignoring it
Jan 21 23:58:17 compute-1 nova_compute[182713]: 2026-01-21 23:58:17.160 182717 WARNING nova.network.neutron [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] 1995baab-0f8d-4658-a4fc-2d21868dc592 already exists in list: networks containing: ['1995baab-0f8d-4658-a4fc-2d21868dc592']. ignoring it
Jan 21 23:58:17 compute-1 nova_compute[182713]: 2026-01-21 23:58:17.160 182717 WARNING nova.network.neutron [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] 1995baab-0f8d-4658-a4fc-2d21868dc592 already exists in list: networks containing: ['1995baab-0f8d-4658-a4fc-2d21868dc592']. ignoring it
Jan 21 23:58:19 compute-1 ovn_controller[94841]: 2026-01-21T23:58:19Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:24:84 10.100.0.10
Jan 21 23:58:19 compute-1 ovn_controller[94841]: 2026-01-21T23:58:19Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:24:84 10.100.0.10
Jan 21 23:58:21 compute-1 nova_compute[182713]: 2026-01-21 23:58:21.165 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:21 compute-1 nova_compute[182713]: 2026-01-21 23:58:21.690 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.871 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'name': 'guest-instance-1', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000049', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '40a322b32cda438b83f33ec51a9007dc', 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'hostId': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.874 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000047', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '717cc581e6a349a98dfd390d05b18624', 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'hostId': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.874 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.878 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 337d88a9-a34b-4c90-bf0d-0418533ae52d / tap0d50f3fc-9e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.878 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.881 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 / tap43589933-19 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.882 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 / tap63932621-f0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.882 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 / tapda88332f-f7 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.882 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.883 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.883 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ed6be42-d3f4-4b05-9e50-6eb82191531a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 'instance-00000049-337d88a9-a34b-4c90-bf0d-0418533ae52d-tap0d50f3fc-9e', 'timestamp': '2026-01-21T23:58:22.875135', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tap0d50f3fc-9e', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:24:84', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0d50f3fc-9e'}, 'message_id': '118d0e58-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.582402443, 'message_signature': '695038cde1b131507bae15bc375ebb66c6a79c707003d7e3a69161682f3ecfd2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap43589933-19', 'timestamp': '2026-01-21T23:58:22.875135', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap43589933-19', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:bc:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43589933-19'}, 'message_id': '118d9aee-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '6f13764e996e9a08519e0280dc141b1a1b552230c48f522c3cc4c8a21402f780'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap63932621-f0', 'timestamp': '2026-01-21T23:58:22.875135', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap63932621-f0', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': 
'055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:c6:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63932621-f0'}, 'message_id': '118da444-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '3a5bb490450700eb3404c9628e7f7ab21a50cb8918ee0f65cfc741e6e66495f8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tapda88332f-f7', 'timestamp': '2026-01-21T23:58:22.875135', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tapda88332f-f7', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:0a:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda88332f-f7'}, 'message_id': '118dae12-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '0ac2f503362ebfa9aec5be68a4a57aa31613228c5aaa24086ebb91c09447cd69'}]}, 'timestamp': '2026-01-21 23:58:22.883590', '_unique_id': '03ce23c5a7504e98b80c93cafc8d0216'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.885 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.887 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.887 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.887 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11d5d7bc-235f-4103-835d-9d5dbd263c32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 'instance-00000049-337d88a9-a34b-4c90-bf0d-0418533ae52d-tap0d50f3fc-9e', 'timestamp': '2026-01-21T23:58:22.887341', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tap0d50f3fc-9e', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:24:84', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0d50f3fc-9e'}, 'message_id': '118e4dd6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.582402443, 'message_signature': '79c31f7aee06fc24ddde6bda3b6c49d168cc8f26ce8d76c80f9924b43534cf2c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap43589933-19', 'timestamp': '2026-01-21T23:58:22.887341', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap43589933-19', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:bc:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43589933-19'}, 'message_id': '118e5678-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'd2f4cd9bad92d1abd9563a559093ba15b1736bd7fb550346e89bfc15e89d6496'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap63932621-f0', 'timestamp': '2026-01-21T23:58:22.887341', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap63932621-f0', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': 
'055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:c6:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63932621-f0'}, 'message_id': '118e5ff6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '1cf91ba0ca9358618e42b7d2fe861e9bca5e41261d181dffc3db21d6bb960df2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tapda88332f-f7', 'timestamp': '2026-01-21T23:58:22.887341', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tapda88332f-f7', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:0a:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda88332f-f7'}, 'message_id': '118e67f8-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'afa48a325910edc72aaade48f75a4e0ff46bb18a25e1078beb7e4e50ca0cf9c7'}]}, 'timestamp': '2026-01-21 23:58:22.888244', '_unique_id': '73bc2325765e47afa888468d7f7a666e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.888 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.889 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.889 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.889 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.889 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '487081d8-21de-434e-8f77-094de5e802f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 'instance-00000049-337d88a9-a34b-4c90-bf0d-0418533ae52d-tap0d50f3fc-9e', 'timestamp': '2026-01-21T23:58:22.889497', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tap0d50f3fc-9e', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:24:84', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0d50f3fc-9e'}, 'message_id': '118ea218-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.582402443, 'message_signature': 'd6f047c0778459cca11904f8de2266fe5f3c92169335976a9b0e992360a4ec32'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 
'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap43589933-19', 'timestamp': '2026-01-21T23:58:22.889497', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap43589933-19', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:bc:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43589933-19'}, 'message_id': '118eab78-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '1714dca2fd6de45156bdb99b97927975a48ef30a74f897109272109c783970bd'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap63932621-f0', 'timestamp': '2026-01-21T23:58:22.889497', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap63932621-f0', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:c6:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63932621-f0'}, 'message_id': '118eb3ca-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'a63e5280afb6965b6eebb95240ef6720291ebf117618b65f419a6c40f73f21a6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tapda88332f-f7', 'timestamp': '2026-01-21T23:58:22.889497', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tapda88332f-f7', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:0a:be', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda88332f-f7'}, 'message_id': '118ebbae-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'a25f6ae24216b06a968734a8f834116855730c42866f22f1967313c10488e1e1'}]}, 'timestamp': '2026-01-21 23:58:22.890382', '_unique_id': '83303f5c0e6a49dd8354a80016a2212f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.890 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.931 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.read.requests volume: 1101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.932 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.979 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.read.requests volume: 1074 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.980 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a87c94cf-b23d-4710-9bb4-e2de73ed1c37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1101, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d-vda', 'timestamp': '2026-01-21T23:58:22.891553', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11951576-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': 'db03b689ffd825f5278e24f95646a48ceac40d47eeb8e853eff5bd959bed25b4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 
'337d88a9-a34b-4c90-bf0d-0418533ae52d-sda', 'timestamp': '2026-01-21T23:58:22.891553', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '119524bc-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': '0d6767cefa6b230450f947bc9bf818efbdc0d7dc2af2aaadf1174c2bb0c7a998'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1074, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-vda', 'timestamp': '2026-01-21T23:58:22.891553', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 
'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '119c6632-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': '387b525ba75b3fd9b23d13d2206c3b119bd9753649a4cfa0396bf08156e23392'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-sda', 'timestamp': '2026-01-21T23:58:22.891553', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '119c7938-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': '3761ccd7c5981609a2dc580879e56b30ad77f5ca572fad859d03d9d0de9b7d2a'}]}, 'timestamp': '2026-01-21 23:58:22.980555', '_unique_id': '665090479ceb433286bfa2a92397c01b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.982 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:22.983 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.002 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.003 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.016 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.016 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97db2613-a33f-49bb-b833-d7b344f91ba5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d-vda', 'timestamp': '2026-01-21T23:58:22.983384', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '119ff70c-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.690626285, 'message_signature': '7303be047c5cf6d409830c0866380c73c6a1926c6981780a215466efe0574c9e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 
'337d88a9-a34b-4c90-bf0d-0418533ae52d-sda', 'timestamp': '2026-01-21T23:58:22.983384', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11a0067a-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.690626285, 'message_signature': '72e8c8067c151817e753c1dd534f96452cbee71dd3c1fe23c6fc69f3c6e60a17'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-vda', 'timestamp': '2026-01-21T23:58:22.983384', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': 
'9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11a1f9f8-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.710964102, 'message_signature': '4cdb4dacb22b2a94c57058b11709fa8802a6c815ab29e95e7afe18ef968e170b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-sda', 'timestamp': '2026-01-21T23:58:22.983384', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11a2074a-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.710964102, 'message_signature': '1fde9779574c1fe28f934b6be0f66c78a6651fe0fc553c7a7b38d18e43d60878'}]}, 'timestamp': '2026-01-21 23:58:23.016952', '_unique_id': '9556def9ef064fa0b762ddd56764d507'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.018 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.019 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.043 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/memory.usage volume: 40.4375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.062 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/memory.usage volume: 44.66796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bac86fc-b8a5-4022-b534-d7b50f31d7d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4375, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'timestamp': '2026-01-21T23:58:23.019332', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '11a62898-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.750634134, 'message_signature': '25e0158953f04dd876446ccf548e5f66ccf5c842d7dfa283b0ff53c4f536f360'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 44.66796875, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'timestamp': 
'2026-01-21T23:58:23.019332', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '11a91314-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.769530995, 'message_signature': '85b903e837a49b2ce17eaac2f7a4e7e4663dd19973741756e555d19269e2970c'}]}, 'timestamp': '2026-01-21 23:58:23.063141', '_unique_id': '2c85d16c93594adf8fbc8a4a35825619'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.064 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.065 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.065 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.066 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.066 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.066 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2717fe84-c7bc-4a8f-93fb-715642974fcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d-vda', 'timestamp': '2026-01-21T23:58:23.065641', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11a984b6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.690626285, 'message_signature': 'db7716a8424deafa7117ba0e45ec3e435e622fbb1eee170f49fcb44c7aae80c5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 
'337d88a9-a34b-4c90-bf0d-0418533ae52d-sda', 'timestamp': '2026-01-21T23:58:23.065641', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11a9919a-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.690626285, 'message_signature': '753703079f6a6e1e569d694a815fce58f349af1ac5c5714fa6ea7f5c98de4ec0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-vda', 'timestamp': '2026-01-21T23:58:23.065641', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': 
'9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11a99c76-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.710964102, 'message_signature': 'aafa0f2220d47746123870af589625331b34eb567168d794d3ba33866c04f0ca'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-sda', 'timestamp': '2026-01-21T23:58:23.065641', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11a9a6da-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.710964102, 'message_signature': 'b81bb3f7c95c71a687f7e83b66297f43a77ee2cb07b3c76da6ca344334f1a4dc'}]}, 'timestamp': '2026-01-21 23:58:23.066842', '_unique_id': '6e3ac833819645dbbddfdf0ecfb556fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.067 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.068 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.068 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/network.incoming.bytes volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.068 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.bytes volume: 4597 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.069 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.bytes volume: 1514 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.069 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.bytes volume: 1346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2011f492-33ef-4f9f-a402-00063aef584b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1648, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 'instance-00000049-337d88a9-a34b-4c90-bf0d-0418533ae52d-tap0d50f3fc-9e', 'timestamp': '2026-01-21T23:58:23.068548', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tap0d50f3fc-9e', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:24:84', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0d50f3fc-9e'}, 'message_id': '11a9f6c6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.582402443, 'message_signature': 'afd47723aa2d950097951ad2d59211c3658ad8bb7cf6c87475ed3e2c72674d06'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4597, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 
'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap43589933-19', 'timestamp': '2026-01-21T23:58:23.068548', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap43589933-19', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:bc:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43589933-19'}, 'message_id': '11aa0490-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '805b9b4d8a821d047970710941c03eafd80e79b57cc11557f255f61a634c3e9d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1514, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap63932621-f0', 'timestamp': '2026-01-21T23:58:23.068548', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap63932621-f0', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:c6:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63932621-f0'}, 'message_id': '11aa10ca-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'b772e5749e17df1eecb57165a1ec43d72cc61ede3f045039bd6c3151283402ac'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1346, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tapda88332f-f7', 'timestamp': '2026-01-21T23:58:23.068548', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tapda88332f-f7', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:0a:be', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda88332f-f7'}, 'message_id': '11aa20a6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'fb7734504c9b957ce0501c9b1ad5b155375f5346d23bb190a073bdf93c2c8ea3'}]}, 'timestamp': '2026-01-21 23:58:23.070017', '_unique_id': '6c84186767694ba3b10f5c48b886263a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.070 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.071 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.071 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.071 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: guest-instance-1>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-936465965>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-936465965>]
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.072 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.072 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.072 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.073 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.073 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9829c42-e6a4-4150-80f3-ec1fa3ead208', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 'instance-00000049-337d88a9-a34b-4c90-bf0d-0418533ae52d-tap0d50f3fc-9e', 'timestamp': '2026-01-21T23:58:23.072317', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tap0d50f3fc-9e', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:24:84', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0d50f3fc-9e'}, 'message_id': '11aa8906-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.582402443, 'message_signature': '7771f8c5245613efdf060b97fcfba441adb9b9ac566761ad621d0b1ed9cf6f7b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 
'0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap43589933-19', 'timestamp': '2026-01-21T23:58:23.072317', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap43589933-19', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:bc:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43589933-19'}, 'message_id': '11aa96c6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '994ed496296927348a161887b2db1ca7739bfb17ce9f9c3053a969a17f09caa4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap63932621-f0', 'timestamp': '2026-01-21T23:58:23.072317', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap63932621-f0', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': 
'055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:c6:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63932621-f0'}, 'message_id': '11aaa65c-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'aec601700cbc6c346d5b2a68b75fabf205f3c1666710c2de3a5da7a2c136d43c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tapda88332f-f7', 'timestamp': '2026-01-21T23:58:23.072317', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tapda88332f-f7', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:0a:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda88332f-f7'}, 'message_id': '11aab1a6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '6f3beb5a5f68a7f34dc84b417166d3d3d2d193a75aaef34b524a348391db9149'}]}, 'timestamp': '2026-01-21 23:58:23.073729', '_unique_id': '5407eca4bd724338ab7a5264b265e59e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.074 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.075 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.075 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.write.requests volume: 298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.075 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.076 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.write.requests volume: 320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.076 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da03230c-0bea-4d6a-886c-8bccb82b3eb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 298, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d-vda', 'timestamp': '2026-01-21T23:58:23.075382', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11ab008e-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': '7593faaa1eb060fa8af1e125f966fc7f60fd4ee4b8f2d9d3d89119ea51a57a08'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 
'337d88a9-a34b-4c90-bf0d-0418533ae52d-sda', 'timestamp': '2026-01-21T23:58:23.075382', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11ab0d68-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': 'a7376c2a59374ceafe419e13111c6f84af5b6a6401bd88d0b1c324466afa5ab9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 320, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-vda', 'timestamp': '2026-01-21T23:58:23.075382', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 
'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11ab189e-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': 'cd8f0d2af53bd76f3c5f517d501ebf3453502f2fff0bf21af618b68a2c65298e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-sda', 'timestamp': '2026-01-21T23:58:23.075382', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11ab2302-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': '15742ad812bf746e5eb817ded10e1498bf76809e3928fa32cf62c72239e144de'}]}, 'timestamp': '2026-01-21 23:58:23.076558', '_unique_id': '4acab840bb2b446eafef5137fb36a411'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.077 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.078 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.078 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.read.latency volume: 200549615 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.078 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.read.latency volume: 33438591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.078 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.read.latency volume: 210142080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.079 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.read.latency volume: 29283254 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53feaadd-0b2a-433e-aa58-d9a032de0ebb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 200549615, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d-vda', 'timestamp': '2026-01-21T23:58:23.078269', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11ab7136-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': 'eb1e8d23060b43c1f0ad43aace42ee2cb66fbe2706cb9db62a07b7b7094ce76b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33438591, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 
'337d88a9-a34b-4c90-bf0d-0418533ae52d-sda', 'timestamp': '2026-01-21T23:58:23.078269', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11ab7bcc-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': '3c98a1494a1784ac5fbabd4a8d722dbbfc79ff58662a99c1efd00bfab20ce2be'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 210142080, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-vda', 'timestamp': '2026-01-21T23:58:23.078269', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 
'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11ab889c-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': '896cfc6ebea7dc8a6067ac1605d51db82161ef03f46a7feb54ff4daa25c96196'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 29283254, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-sda', 'timestamp': '2026-01-21T23:58:23.078269', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11ab9382-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': '2bdbb4113acb210b9517d85fd2845ce5d8c2fc8cb00d7ac2dba8549365395a83'}]}, 'timestamp': '2026-01-21 23:58:23.079440', '_unique_id': '140c8f9129a149f39260d7c9e81335aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.080 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.081 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.081 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: guest-instance-1>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-936465965>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-936465965>]
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.081 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.read.bytes volume: 30534144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.081 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.082 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.read.bytes volume: 29993472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.082 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94a974dc-19c7-4f3c-a201-32b02c4886c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30534144, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d-vda', 'timestamp': '2026-01-21T23:58:23.081628', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11abf43a-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': '55b00f95ff63a4dbe2c382b0f23acf8454dc5f8bd9fdb949d1522e52a7ff9556'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 
'337d88a9-a34b-4c90-bf0d-0418533ae52d-sda', 'timestamp': '2026-01-21T23:58:23.081628', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11ac00a6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': '91524d5e8ab5c1b3276e3f4f6166442515b000daeb6f3a6480fc34dafafffb2e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29993472, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-vda', 'timestamp': '2026-01-21T23:58:23.081628', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': 
'9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11ac0d1c-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': 'a4e2ec46e6e8831932fe3d984d75bb7ed13ff7a43200ec067bb8c5effb631256'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-sda', 'timestamp': '2026-01-21T23:58:23.081628', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11ac17b2-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': '03f7f6a919170c0e53d5677854a84f0c10c226d4b5ae4810fa60c27c46359d4a'}]}, 'timestamp': '2026-01-21 23:58:23.082824', '_unique_id': 'e84935be3ecf4c81922acd290a7c2c99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.083 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.084 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.write.bytes volume: 72769536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.084 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.085 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.write.bytes volume: 73039872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.085 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7bdc719-e996-4491-bd91-b4184bfa4d5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72769536, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d-vda', 'timestamp': '2026-01-21T23:58:23.084520', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11ac6582-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': 'bac3034aa2847c74f685bdcc4aa986f188b8b0726c60ca930ef3a07c258f8e71'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 
'337d88a9-a34b-4c90-bf0d-0418533ae52d-sda', 'timestamp': '2026-01-21T23:58:23.084520', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11ac731a-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': '42bc70202589d8d56d43531cc781cc0a089464bcd4e537847583eadd3f49c85f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73039872, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-vda', 'timestamp': '2026-01-21T23:58:23.084520', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 
'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11ac7e3c-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': '4997c877a14d88d32a3a6c5d240fc05b09943872c964f3fd93dc735a8b169beb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-sda', 'timestamp': '2026-01-21T23:58:23.084520', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11ac8a62-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': 'c90f5e7e0308707812508d8969af1e306e16cd92f669439f8c3577284eaee81e'}]}, 'timestamp': '2026-01-21 23:58:23.085762', '_unique_id': 'c79d215c6eae4b1388d5d739cf044b13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.086 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.087 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.087 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/cpu volume: 11120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.088 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/cpu volume: 11720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0e8623e-39d7-49e0-8d9a-a02128a15046', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11120000000, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'timestamp': '2026-01-21T23:58:23.087657', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '11ace264-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.750634134, 'message_signature': '1f550fc640d4a8669afd043f442876c53404da012caf8c1e68dc24080f01dd9c'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11720000000, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'timestamp': 
'2026-01-21T23:58:23.087657', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '11acedc2-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.769530995, 'message_signature': 'db028d86b7321a220b384c1c1b6d503ad589621792d3ef4fe4a8828f622d5a5d'}]}, 'timestamp': '2026-01-21 23:58:23.088307', '_unique_id': 'a447090213a3416690efae6dcabd186d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.089 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.090 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.090 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.090 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02ea6fe0-4922-4731-9c04-d6bcd9b39999', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 'instance-00000049-337d88a9-a34b-4c90-bf0d-0418533ae52d-tap0d50f3fc-9e', 'timestamp': '2026-01-21T23:58:23.089969', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tap0d50f3fc-9e', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:24:84', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0d50f3fc-9e'}, 'message_id': '11ad3a66-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.582402443, 'message_signature': '45c94ecdaf014a9a487e0093b439f1bdf8cb8b1a78e15d712778b01fd3de2086'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': 
'0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap43589933-19', 'timestamp': '2026-01-21T23:58:23.089969', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap43589933-19', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:bc:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43589933-19'}, 'message_id': '11ad45a6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'defe3c0aee4bf8c8d9878e79688b3deab53d251b1f716426d6329eeba79c1dd3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap63932621-f0', 'timestamp': '2026-01-21T23:58:23.089969', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap63932621-f0', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': 
'055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:c6:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63932621-f0'}, 'message_id': '11ad5186-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'f3edba5c09914f6ca270cb87f0b17307229a4083ff33d53f526b80a2e33710a4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tapda88332f-f7', 'timestamp': '2026-01-21T23:58:23.089969', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tapda88332f-f7', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:0a:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda88332f-f7'}, 'message_id': '11ad5fdc-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '5d2b7387654dac22f85587a664699af89550ae0e9868ef13ee537596d9212337'}]}, 'timestamp': '2026-01-21 23:58:23.091242', '_unique_id': '41504b3506ac4e86990551b73a698677'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.091 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.092 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.093 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.093 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.093 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df810987-00f9-4bb4-a738-e755c62055a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 'instance-00000049-337d88a9-a34b-4c90-bf0d-0418533ae52d-tap0d50f3fc-9e', 'timestamp': '2026-01-21T23:58:23.092935', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tap0d50f3fc-9e', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:24:84', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0d50f3fc-9e'}, 'message_id': '11adae92-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.582402443, 'message_signature': 'ea36ff0dc0f6bfeab39c1bbacfb0cd13f6aa158e1912804e33c6d72c33c7f5bf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 
'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap43589933-19', 'timestamp': '2026-01-21T23:58:23.092935', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap43589933-19', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:bc:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43589933-19'}, 'message_id': '11adb9aa-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '42d57a2d6e2d70ef4f5d08affbbca9eb65669b8eb557695cc63ed62035c75c93'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap63932621-f0', 'timestamp': '2026-01-21T23:58:23.092935', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap63932621-f0', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:c6:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63932621-f0'}, 'message_id': '11adc4c2-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '2c2582bf1d9b467b19e0d11ec6d70172232dea5888b15771fb859237168af9fb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tapda88332f-f7', 'timestamp': '2026-01-21T23:58:23.092935', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tapda88332f-f7', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:0a:be', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda88332f-f7'}, 'message_id': '11add296-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'cc5de7fb9b97fe1a94da860d6d081c4b964edb9848926f53614cda3d2a64067f'}]}, 'timestamp': '2026-01-21 23:58:23.094176', '_unique_id': 'dfbd710f26f648ad82aba01e43800143'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.094 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.095 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.095 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: guest-instance-1>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-936465965>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-936465965>]
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.096 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.096 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.096 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: guest-instance-1>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-936465965>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-936465965>]
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.096 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.096 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.write.latency volume: 2109040606 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.096 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.097 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.write.latency volume: 1939133463 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.097 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f947808-aa27-4312-9ab7-d9dff49756e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2109040606, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d-vda', 'timestamp': '2026-01-21T23:58:23.096603', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11ae3d76-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': 'd3d995bd44ae10742ac6d9f1e28adae53dfdc337baaf666aa5edf888fe4428c1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 
'337d88a9-a34b-4c90-bf0d-0418533ae52d-sda', 'timestamp': '2026-01-21T23:58:23.096603', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11ae4b5e-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.598763527, 'message_signature': '3f9201804c62c5d5988bf95528dfe1c7821b3926d01f4c293de044e6e7fb6391'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1939133463, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-vda', 'timestamp': '2026-01-21T23:58:23.096603', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 
'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11ae5680-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': 'f522e20694eb3c73fc43c6ee5f5edf4fb1bf437c50a47f690af8427cca5243d6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-sda', 'timestamp': '2026-01-21T23:58:23.096603', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11ae60d0-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.639689717, 'message_signature': '10cbe239ea12af0ac00422e364f76badd6a93b78d613630b93a53d257e2a2fe6'}]}, 'timestamp': '2026-01-21 23:58:23.097801', '_unique_id': 'f0bc710cd37e4f1aa95d7c9a5398b7d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.098 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.099 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.099 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.100 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.100 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a492a2e-ad49-48b4-a62c-5048f54eb8d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d-vda', 'timestamp': '2026-01-21T23:58:23.099435', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11aeabf8-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.690626285, 'message_signature': '95b186b04df2b147b9b4082ce75af078824419ccf036d61117c536779e2944f8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 
'337d88a9-a34b-4c90-bf0d-0418533ae52d-sda', 'timestamp': '2026-01-21T23:58:23.099435', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-00000049', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11aeb760-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.690626285, 'message_signature': 'cb78a4979a8d1f5ab24a32b1bafea7789fd404de3f55eb6fd660e3f17edc8920'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-vda', 'timestamp': '2026-01-21T23:58:23.099435', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': 
'9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11aec250-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.710964102, 'message_signature': 'b038e70169d602aee69e6de502d12195925c8e3e06f03bf257bcd62a0762d331'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-sda', 'timestamp': '2026-01-21T23:58:23.099435', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'instance-00000047', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11aeceb2-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.710964102, 'message_signature': '38dbe5e21cd3a2f4c6d4442df4be49e3c95bc9979bc50b56fe2a23525aa72ee4'}]}, 'timestamp': '2026-01-21 23:58:23.100618', '_unique_id': '9580b352265e4996a12f0112f8435370'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.101 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.102 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/network.outgoing.bytes volume: 1326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.102 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.102 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.103 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f72e0ea-8a60-4ee1-9979-fc5944f19aa6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1326, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 'instance-00000049-337d88a9-a34b-4c90-bf0d-0418533ae52d-tap0d50f3fc-9e', 'timestamp': '2026-01-21T23:58:23.102232', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tap0d50f3fc-9e', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:24:84', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0d50f3fc-9e'}, 'message_id': '11af19c6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.582402443, 'message_signature': 'd27ae2274c15462cd6c00895bcbfab3521982da31f8872304f3e7e5d1be9576c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 
'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap43589933-19', 'timestamp': '2026-01-21T23:58:23.102232', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap43589933-19', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:bc:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43589933-19'}, 'message_id': '11af25a6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '3dc96d7771babaf665f3bcea84d8fe238388aae842b6b1bf0f9442d11e8f6989'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap63932621-f0', 'timestamp': '2026-01-21T23:58:23.102232', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap63932621-f0', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:c6:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63932621-f0'}, 'message_id': '11af31cc-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '24bc4a838e03b76a7ced62e5dca5e3907a5b472fa680720ba8d10bc403fbfcf4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tapda88332f-f7', 'timestamp': '2026-01-21T23:58:23.102232', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tapda88332f-f7', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:0a:be', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda88332f-f7'}, 'message_id': '11af3c80-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'f841bf08c3d2be70275c334bad9e982f2d33b93d69029a65a58441d96367fbe6'}]}, 'timestamp': '2026-01-21 23:58:23.103452', '_unique_id': '97db8fb2ca45479ab54c82b893ded38b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.104 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.105 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.105 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.105 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.106 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0572ebc-73c1-4b39-bacc-899e2ac01ad2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 'instance-00000049-337d88a9-a34b-4c90-bf0d-0418533ae52d-tap0d50f3fc-9e', 'timestamp': '2026-01-21T23:58:23.105124', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tap0d50f3fc-9e', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:24:84', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0d50f3fc-9e'}, 'message_id': '11af8a32-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.582402443, 'message_signature': '0f411e25869f27e6df08a4c49c05ffe6912538bac821cbd2d851c534d3c9c791'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap43589933-19', 'timestamp': '2026-01-21T23:58:23.105124', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap43589933-19', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:bc:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43589933-19'}, 'message_id': '11af95ea-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '6b32e5636d8dfba07601cc2f5079e97a74872c066939053997bf186487224821'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap63932621-f0', 'timestamp': '2026-01-21T23:58:23.105124', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap63932621-f0', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': 
'055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:c6:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63932621-f0'}, 'message_id': '11afa300-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '79a14161fdb8db58ef6fd1c2d05c1c8ff4630342ea10686456882273f38596ea'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tapda88332f-f7', 'timestamp': '2026-01-21T23:58:23.105124', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tapda88332f-f7', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:0a:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda88332f-f7'}, 'message_id': '11afae18-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'c8bc2137f550540c1c5d8d7c3aa02ae086c3041424430c516c9bdb761928e0a9'}]}, 'timestamp': '2026-01-21 23:58:23.106343', '_unique_id': 'e96406821b9a4fa1b31a413d28d99172'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.108 12 DEBUG ceilometer.compute.pollsters [-] 337d88a9-a34b-4c90-bf0d-0418533ae52d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.108 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.108 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.108 12 DEBUG ceilometer.compute.pollsters [-] c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78a394f3-fd7a-4b8a-84b0-efd43c333fcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd31ef0e2c7354f47adb7b7f072c28fae', 'user_name': None, 'project_id': '40a322b32cda438b83f33ec51a9007dc', 'project_name': None, 'resource_id': 'instance-00000049-337d88a9-a34b-4c90-bf0d-0418533ae52d-tap0d50f3fc-9e', 'timestamp': '2026-01-21T23:58:23.107979', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tap0d50f3fc-9e', 'instance_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'instance_type': 'm1.nano', 'host': '5249bc4dd155e3a7eb644d8de508ae6e64618295cdf79fe9051c7afd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:24:84', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0d50f3fc-9e'}, 'message_id': '11aff9ae-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.582402443, 'message_signature': 'b9b2e91a2c1d8538955fbcb62f5d62d6927e8774f1e7877b1373ca260a5807bb'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap43589933-19', 'timestamp': '2026-01-21T23:58:23.107979', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap43589933-19', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:bc:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43589933-19'}, 'message_id': '11b004c6-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'db53f85acbc014dc6922aa8f241dc45e25dca10089fbdebbb90f73d1e44e4b56'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tap63932621-f0', 'timestamp': '2026-01-21T23:58:23.107979', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tap63932621-f0', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': 
'055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:c6:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63932621-f0'}, 'message_id': '11b00fa2-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': 'e6993973df015710158d4f431375c3053fa773ab8f703de1aee7acf9f2c39739'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_name': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_name': None, 'resource_id': 'instance-00000047-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-tapda88332f-f7', 'timestamp': '2026-01-21T23:58:23.107979', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-936465965', 'name': 'tapda88332f-f7', 'instance_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'instance_type': 'm1.nano', 'host': '055250bef50e23cf2e31f6f86c8bab0b62c3741c7cced2ffb25604bb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:0a:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda88332f-f7'}, 'message_id': '11b01ec0-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4475.586735996, 'message_signature': '79df78f824f9f661e44ebae4bf864ea2b80ab98aa9b8a111c1d6c59670525165'}]}, 'timestamp': '2026-01-21 23:58:23.109232', '_unique_id': 'fa4c1115542c476e83c5f142edda4e89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-21 23:58:23.109 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:24 compute-1 podman[221426]: 2026-01-21 23:58:24.616156578 +0000 UTC m=+0.099538236 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 21 23:58:25 compute-1 nova_compute[182713]: 2026-01-21 23:58:25.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:25 compute-1 nova_compute[182713]: 2026-01-21 23:58:25.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.168 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.593 182717 DEBUG nova.network.neutron [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:26 compute-1 podman[221454]: 2026-01-21 23:58:26.603155398 +0000 UTC m=+0.087155035 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible)
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.639 182717 DEBUG oslo_concurrency.lockutils [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.639 182717 DEBUG oslo_concurrency.lockutils [req-370b2cb8-2c73-41b9-b9b5-69346d1e554c req-710a2ba9-c709-46a9-9a4c-1a30ce32c80a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.640 182717 DEBUG nova.network.neutron [req-370b2cb8-2c73-41b9-b9b5-69346d1e554c req-710a2ba9-c709-46a9-9a4c-1a30ce32c80a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Refreshing network info cache for port ebbb51e0-ecfc-404f-a578-681300a57aa8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.643 182717 DEBUG nova.virt.libvirt.vif [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.643 182717 DEBUG nova.network.os_vif_util [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.644 182717 DEBUG nova.network.os_vif_util [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:19:e5,bridge_name='br-int',has_traffic_filtering=True,id=ebbb51e0-ecfc-404f-a578-681300a57aa8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapebbb51e0-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.644 182717 DEBUG os_vif [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:19:e5,bridge_name='br-int',has_traffic_filtering=True,id=ebbb51e0-ecfc-404f-a578-681300a57aa8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapebbb51e0-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.645 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.645 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.645 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.648 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.648 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebbb51e0-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.649 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebbb51e0-ec, col_values=(('external_ids', {'iface-id': 'ebbb51e0-ecfc-404f-a578-681300a57aa8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:19:e5', 'vm-uuid': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.650 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.651 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:58:26 compute-1 NetworkManager[54952]: <info>  [1769039906.6523] manager: (tapebbb51e0-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.660 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.661 182717 INFO os_vif [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:19:e5,bridge_name='br-int',has_traffic_filtering=True,id=ebbb51e0-ecfc-404f-a578-681300a57aa8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapebbb51e0-ec')
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.662 182717 DEBUG nova.virt.libvirt.vif [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.662 182717 DEBUG nova.network.os_vif_util [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.663 182717 DEBUG nova.network.os_vif_util [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:19:e5,bridge_name='br-int',has_traffic_filtering=True,id=ebbb51e0-ecfc-404f-a578-681300a57aa8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapebbb51e0-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.665 182717 DEBUG nova.virt.libvirt.guest [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] attach device xml: <interface type="ethernet">
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:db:19:e5"/>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <target dev="tapebbb51e0-ec"/>
Jan 21 23:58:26 compute-1 nova_compute[182713]: </interface>
Jan 21 23:58:26 compute-1 nova_compute[182713]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 21 23:58:26 compute-1 kernel: tapebbb51e0-ec: entered promiscuous mode
Jan 21 23:58:26 compute-1 NetworkManager[54952]: <info>  [1769039906.6823] manager: (tapebbb51e0-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.685 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:26 compute-1 ovn_controller[94841]: 2026-01-21T23:58:26Z|00249|binding|INFO|Claiming lport ebbb51e0-ecfc-404f-a578-681300a57aa8 for this chassis.
Jan 21 23:58:26 compute-1 ovn_controller[94841]: 2026-01-21T23:58:26Z|00250|binding|INFO|ebbb51e0-ecfc-404f-a578-681300a57aa8: Claiming fa:16:3e:db:19:e5 10.100.0.5
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.696 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:19:e5 10.100.0.5'], port_security=['fa:16:3e:db:19:e5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-85120559', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-85120559', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '2', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ebbb51e0-ecfc-404f-a578-681300a57aa8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.699 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ebbb51e0-ecfc-404f-a578-681300a57aa8 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 bound to our chassis
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.702 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:58:26 compute-1 ovn_controller[94841]: 2026-01-21T23:58:26Z|00251|binding|INFO|Setting lport ebbb51e0-ecfc-404f-a578-681300a57aa8 ovn-installed in OVS
Jan 21 23:58:26 compute-1 ovn_controller[94841]: 2026-01-21T23:58:26Z|00252|binding|INFO|Setting lport ebbb51e0-ecfc-404f-a578-681300a57aa8 up in Southbound
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.713 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.720 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.732 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[cd11f50f-c6a2-4dc9-8789-8ead9182bb1a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:26 compute-1 systemd-udevd[221483]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:58:26 compute-1 NetworkManager[54952]: <info>  [1769039906.7547] device (tapebbb51e0-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:58:26 compute-1 NetworkManager[54952]: <info>  [1769039906.7557] device (tapebbb51e0-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.783 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[93491aab-d688-49a0-b203-1bfb4a719b74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.787 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[3a658243-c60e-4ddf-b905-ac2f75f4c6d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.833 182717 DEBUG nova.virt.libvirt.driver [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.833 182717 DEBUG nova.virt.libvirt.driver [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.833 182717 DEBUG nova.virt.libvirt.driver [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:3e:bc:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.833 182717 DEBUG nova.virt.libvirt.driver [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:30:c6:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.833 182717 DEBUG nova.virt.libvirt.driver [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:92:0a:be, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.833 182717 DEBUG nova.virt.libvirt.driver [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:db:19:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.833 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e4365b73-914e-4a0a-a742-a410c7a7e71a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.865 182717 DEBUG nova.virt.libvirt.guest [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:58:26</nova:creationTime>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:58:26 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:58:26 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     <nova:port uuid="63932621-f0d1-4f08-8ce5-b5fa120bcc62">
Jan 21 23:58:26 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     <nova:port uuid="da88332f-f709-4521-a7e5-faca686cf825">
Jan 21 23:58:26 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     <nova:port uuid="ebbb51e0-ecfc-404f-a578-681300a57aa8">
Jan 21 23:58:26 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 23:58:26 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:26 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:58:26 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.865 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c63cb313-3634-414b-9ace-fae156eb1d85]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441880, 'reachable_time': 27223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221490, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:26 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.887 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7d783929-709f-4978-aa8b-29402a874a86]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441895, 'tstamp': 441895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221491, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441899, 'tstamp': 441899}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221491, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.889 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.891 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.892 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.893 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.893 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.894 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:26.894 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:26 compute-1 nova_compute[182713]: 2026-01-21 23:58:26.906 182717 DEBUG oslo_concurrency.lockutils [None req-2bb425d0-9173-4c82-a3ce-4cfa847df0a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-ebbb51e0-ecfc-404f-a578-681300a57aa8" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 12.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:27 compute-1 nova_compute[182713]: 2026-01-21 23:58:27.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:27 compute-1 nova_compute[182713]: 2026-01-21 23:58:27.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:28 compute-1 nova_compute[182713]: 2026-01-21 23:58:28.000 182717 DEBUG nova.compute.manager [req-0edf634e-ad17-4f55-b1a7-ece7daeea2b7 req-c54aac88-875c-4c6a-b4be-555c0d37253d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-plugged-ebbb51e0-ecfc-404f-a578-681300a57aa8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:28 compute-1 nova_compute[182713]: 2026-01-21 23:58:28.000 182717 DEBUG oslo_concurrency.lockutils [req-0edf634e-ad17-4f55-b1a7-ece7daeea2b7 req-c54aac88-875c-4c6a-b4be-555c0d37253d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:28 compute-1 nova_compute[182713]: 2026-01-21 23:58:28.001 182717 DEBUG oslo_concurrency.lockutils [req-0edf634e-ad17-4f55-b1a7-ece7daeea2b7 req-c54aac88-875c-4c6a-b4be-555c0d37253d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:28 compute-1 nova_compute[182713]: 2026-01-21 23:58:28.001 182717 DEBUG oslo_concurrency.lockutils [req-0edf634e-ad17-4f55-b1a7-ece7daeea2b7 req-c54aac88-875c-4c6a-b4be-555c0d37253d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:28 compute-1 nova_compute[182713]: 2026-01-21 23:58:28.002 182717 DEBUG nova.compute.manager [req-0edf634e-ad17-4f55-b1a7-ece7daeea2b7 req-c54aac88-875c-4c6a-b4be-555c0d37253d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-plugged-ebbb51e0-ecfc-404f-a578-681300a57aa8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:28 compute-1 nova_compute[182713]: 2026-01-21 23:58:28.002 182717 WARNING nova.compute.manager [req-0edf634e-ad17-4f55-b1a7-ece7daeea2b7 req-c54aac88-875c-4c6a-b4be-555c0d37253d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received unexpected event network-vif-plugged-ebbb51e0-ecfc-404f-a578-681300a57aa8 for instance with vm_state active and task_state None.
Jan 21 23:58:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:28.177 104443 DEBUG eventlet.wsgi.server [-] (104443) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 21 23:58:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:28.180 104443 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0
Jan 21 23:58:28 compute-1 ovn_metadata_agent[104179]: Accept: */*
Jan 21 23:58:28 compute-1 ovn_metadata_agent[104179]: Connection: close
Jan 21 23:58:28 compute-1 ovn_metadata_agent[104179]: Content-Type: text/plain
Jan 21 23:58:28 compute-1 ovn_metadata_agent[104179]: Host: 169.254.169.254
Jan 21 23:58:28 compute-1 ovn_metadata_agent[104179]: User-Agent: curl/7.84.0
Jan 21 23:58:28 compute-1 ovn_metadata_agent[104179]: X-Forwarded-For: 10.100.0.10
Jan 21 23:58:28 compute-1 ovn_metadata_agent[104179]: X-Ovn-Network-Id: 58dabcb0-4997-48a1-816b-d257b8a0a2a6 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 21 23:58:28 compute-1 nova_compute[182713]: 2026-01-21 23:58:28.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:28 compute-1 ovn_controller[94841]: 2026-01-21T23:58:28Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:db:19:e5 10.100.0.5
Jan 21 23:58:28 compute-1 ovn_controller[94841]: 2026-01-21T23:58:28Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:db:19:e5 10.100.0.5
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.465 182717 DEBUG nova.network.neutron [req-370b2cb8-2c73-41b9-b9b5-69346d1e554c req-710a2ba9-c709-46a9-9a4c-1a30ce32c80a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updated VIF entry in instance network info cache for port ebbb51e0-ecfc-404f-a578-681300a57aa8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.465 182717 DEBUG nova.network.neutron [req-370b2cb8-2c73-41b9-b9b5-69346d1e554c req-710a2ba9-c709-46a9-9a4c-1a30ce32c80a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.513 182717 DEBUG oslo_concurrency.lockutils [req-370b2cb8-2c73-41b9-b9b5-69346d1e554c req-710a2ba9-c709-46a9-9a4c-1a30ce32c80a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:29.591 104443 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 21 23:58:29 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:29.592 104443 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 1673 time: 1.4128618
Jan 21 23:58:29 compute-1 haproxy-metadata-proxy-58dabcb0-4997-48a1-816b-d257b8a0a2a6[221339]: 10.100.0.10:33598 [21/Jan/2026:23:58:28.176] listener listener/metadata 0/0/0/1416/1416 200 1657 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.733 182717 DEBUG oslo_concurrency.lockutils [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-63932621-f0d1-4f08-8ce5-b5fa120bcc62" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.733 182717 DEBUG oslo_concurrency.lockutils [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-63932621-f0d1-4f08-8ce5-b5fa120bcc62" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.760 182717 DEBUG nova.objects.instance [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'flavor' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.811 182717 DEBUG nova.virt.libvirt.vif [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.812 182717 DEBUG nova.network.os_vif_util [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.813 182717 DEBUG nova.network.os_vif_util [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.818 182717 DEBUG nova.virt.libvirt.guest [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:30:c6:cf"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap63932621-f0"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.821 182717 DEBUG nova.virt.libvirt.guest [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:30:c6:cf"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap63932621-f0"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.824 182717 DEBUG nova.virt.libvirt.driver [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Attempting to detach device tap63932621-f0 from instance c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.825 182717 DEBUG nova.virt.libvirt.guest [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] detach device xml: <interface type="ethernet">
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:30:c6:cf"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <target dev="tap63932621-f0"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]: </interface>
Jan 21 23:58:29 compute-1 nova_compute[182713]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.840 182717 DEBUG nova.virt.libvirt.guest [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:30:c6:cf"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap63932621-f0"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.845 182717 DEBUG nova.virt.libvirt.guest [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:30:c6:cf"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap63932621-f0"/></interface>not found in domain: <domain type='kvm' id='33'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <name>instance-00000047</name>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <uuid>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</uuid>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:58:26</nova:creationTime>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:port uuid="63932621-f0d1-4f08-8ce5-b5fa120bcc62">
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:port uuid="da88332f-f709-4521-a7e5-faca686cf825">
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <nova:port uuid="ebbb51e0-ecfc-404f-a578-681300a57aa8">
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:58:29 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <resource>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </resource>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <system>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <entry name='serial'>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <entry name='uuid'>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </system>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <os>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </os>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <features>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </features>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk' index='2'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:         <backingStore/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       </backingStore>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.config' index='1'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <backingStore/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <readonly/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:3e:bc:4e'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target dev='tap43589933-19'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:30:c6:cf'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target dev='tap63932621-f0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='net1'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:92:0a:be'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target dev='tapda88332f-f7'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='net2'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:db:19:e5'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target dev='tapebbb51e0-ec'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='net3'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log' append='off'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       </target>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log' append='off'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </console>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </graphics>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <video>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </video>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </watchdog>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c342,c529</label>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c342,c529</imagelabel>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 21 23:58:29 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:58:29 compute-1 nova_compute[182713]: </domain>
Jan 21 23:58:29 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.845 182717 INFO nova.virt.libvirt.driver [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully detached device tap63932621-f0 from instance c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 from the persistent domain config.
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.846 182717 DEBUG nova.virt.libvirt.driver [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] (1/8): Attempting to detach device tap63932621-f0 with device alias net1 from instance c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.846 182717 DEBUG nova.virt.libvirt.guest [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] detach device xml: <interface type="ethernet">
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:30:c6:cf"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]:   <target dev="tap63932621-f0"/>
Jan 21 23:58:29 compute-1 nova_compute[182713]: </interface>
Jan 21 23:58:29 compute-1 nova_compute[182713]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.954 182717 DEBUG oslo_concurrency.lockutils [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Acquiring lock "337d88a9-a34b-4c90-bf0d-0418533ae52d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.954 182717 DEBUG oslo_concurrency.lockutils [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.955 182717 DEBUG oslo_concurrency.lockutils [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Acquiring lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.955 182717 DEBUG oslo_concurrency.lockutils [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.956 182717 DEBUG oslo_concurrency.lockutils [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.971 182717 INFO nova.compute.manager [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Terminating instance
Jan 21 23:58:29 compute-1 kernel: tap63932621-f0 (unregistering): left promiscuous mode
Jan 21 23:58:29 compute-1 nova_compute[182713]: 2026-01-21 23:58:29.985 182717 DEBUG nova.compute.manager [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:58:29 compute-1 NetworkManager[54952]: <info>  [1769039909.9862] device (tap63932621-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:58:30 compute-1 ovn_controller[94841]: 2026-01-21T23:58:30Z|00253|binding|INFO|Releasing lport 63932621-f0d1-4f08-8ce5-b5fa120bcc62 from this chassis (sb_readonly=0)
Jan 21 23:58:30 compute-1 ovn_controller[94841]: 2026-01-21T23:58:30Z|00254|binding|INFO|Setting lport 63932621-f0d1-4f08-8ce5-b5fa120bcc62 down in Southbound
Jan 21 23:58:30 compute-1 ovn_controller[94841]: 2026-01-21T23:58:30Z|00255|binding|INFO|Removing iface tap63932621-f0 ovn-installed in OVS
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.083 182717 DEBUG nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Received event <DeviceRemovedEvent: 1769039910.0831509, c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.086 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.088 182717 DEBUG nova.virt.libvirt.driver [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Start waiting for the detach event from libvirt for device tap63932621-f0 with device alias net1 for instance c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.089 182717 DEBUG nova.virt.libvirt.guest [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:30:c6:cf"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap63932621-f0"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.094 182717 DEBUG nova.virt.libvirt.guest [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:30:c6:cf"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap63932621-f0"/></interface>not found in domain: <domain type='kvm' id='33'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <name>instance-00000047</name>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <uuid>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</uuid>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:58:26</nova:creationTime>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:port uuid="63932621-f0d1-4f08-8ce5-b5fa120bcc62">
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:port uuid="da88332f-f709-4521-a7e5-faca686cf825">
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:port uuid="ebbb51e0-ecfc-404f-a578-681300a57aa8">
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:58:30 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <resource>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </resource>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <system>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <entry name='serial'>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <entry name='uuid'>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </system>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <os>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </os>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <features>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </features>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.095 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:c6:cf 10.100.0.14'], port_security=['fa:16:3e:30:c6:cf 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '4', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=63932621-f0d1-4f08-8ce5-b5fa120bcc62) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk' index='2'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:         <backingStore/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       </backingStore>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.config' index='1'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <backingStore/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <readonly/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:3e:bc:4e'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target dev='tap43589933-19'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:92:0a:be'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target dev='tapda88332f-f7'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='net2'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:db:19:e5'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target dev='tapebbb51e0-ec'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='net3'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log' append='off'/>
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.097 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 63932621-f0d1-4f08-8ce5-b5fa120bcc62 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 unbound from our chassis
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       </target>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log' append='off'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </console>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </graphics>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <video>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </video>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </watchdog>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c342,c529</label>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c342,c529</imagelabel>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:58:30 compute-1 nova_compute[182713]: </domain>
Jan 21 23:58:30 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.095 182717 INFO nova.virt.libvirt.driver [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully detached device tap63932621-f0 from instance c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 from the live domain config.
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.095 182717 DEBUG nova.virt.libvirt.vif [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.096 182717 DEBUG nova.network.os_vif_util [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.098 182717 DEBUG nova.network.os_vif_util [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.098 182717 DEBUG os_vif [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.098 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.102 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.102 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63932621-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.103 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.105 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.105 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.110 182717 INFO os_vif [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0')
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.111 182717 DEBUG nova.virt.libvirt.guest [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:58:30</nova:creationTime>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:port uuid="da88332f-f709-4521-a7e5-faca686cf825">
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     <nova:port uuid="ebbb51e0-ecfc-404f-a578-681300a57aa8">
Jan 21 23:58:30 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 23:58:30 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:30 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:58:30 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:58:30 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.119 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[57a3fc72-049c-4f95-8384-61f160a2bbb5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.142 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[1478ef96-b45d-433b-9c2b-24c4ef130322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.145 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c3819163-25b7-4ea2-9519-87e60160eff3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.170 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[52b9ae1e-da60-48c4-a5af-e6258cbf9ab2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.186 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[af4bb706-2c35-4334-ab05-6eb726b92f50]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441880, 'reachable_time': 27223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221504, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.199 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7e981f-4bb0-4410-90d5-89166d62a1c8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441895, 'tstamp': 441895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221505, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441899, 'tstamp': 441899}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221505, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.200 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.202 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.204 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.204 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.205 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.205 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.205 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.407 182717 DEBUG nova.compute.manager [req-350c90c1-71b2-4ce7-86e7-afed0155f717 req-ef3befb0-9663-47ff-8241-4a2e415b69b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-plugged-ebbb51e0-ecfc-404f-a578-681300a57aa8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.408 182717 DEBUG oslo_concurrency.lockutils [req-350c90c1-71b2-4ce7-86e7-afed0155f717 req-ef3befb0-9663-47ff-8241-4a2e415b69b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.409 182717 DEBUG oslo_concurrency.lockutils [req-350c90c1-71b2-4ce7-86e7-afed0155f717 req-ef3befb0-9663-47ff-8241-4a2e415b69b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.410 182717 DEBUG oslo_concurrency.lockutils [req-350c90c1-71b2-4ce7-86e7-afed0155f717 req-ef3befb0-9663-47ff-8241-4a2e415b69b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.410 182717 DEBUG nova.compute.manager [req-350c90c1-71b2-4ce7-86e7-afed0155f717 req-ef3befb0-9663-47ff-8241-4a2e415b69b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-plugged-ebbb51e0-ecfc-404f-a578-681300a57aa8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.411 182717 WARNING nova.compute.manager [req-350c90c1-71b2-4ce7-86e7-afed0155f717 req-ef3befb0-9663-47ff-8241-4a2e415b69b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received unexpected event network-vif-plugged-ebbb51e0-ecfc-404f-a578-681300a57aa8 for instance with vm_state active and task_state None.
Jan 21 23:58:30 compute-1 kernel: tap0d50f3fc-9e (unregistering): left promiscuous mode
Jan 21 23:58:30 compute-1 NetworkManager[54952]: <info>  [1769039910.6940] device (tap0d50f3fc-9e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:58:30 compute-1 ovn_controller[94841]: 2026-01-21T23:58:30Z|00256|binding|INFO|Releasing lport 0d50f3fc-9e12-4e56-ba52-98ff14988caa from this chassis (sb_readonly=0)
Jan 21 23:58:30 compute-1 ovn_controller[94841]: 2026-01-21T23:58:30Z|00257|binding|INFO|Setting lport 0d50f3fc-9e12-4e56-ba52-98ff14988caa down in Southbound
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.700 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 ovn_controller[94841]: 2026-01-21T23:58:30Z|00258|binding|INFO|Removing iface tap0d50f3fc-9e ovn-installed in OVS
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.704 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.713 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:24:84 10.100.0.10'], port_security=['fa:16:3e:38:24:84 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '337d88a9-a34b-4c90-bf0d-0418533ae52d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58dabcb0-4997-48a1-816b-d257b8a0a2a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40a322b32cda438b83f33ec51a9007dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '992f086f-c626-4197-b1cd-eb7d61dff758', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4acf22b-ed26-4bae-bf49-1694ec7b3765, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=0d50f3fc-9e12-4e56-ba52-98ff14988caa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.714 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.716 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 0d50f3fc-9e12-4e56-ba52-98ff14988caa in datapath 58dabcb0-4997-48a1-816b-d257b8a0a2a6 unbound from our chassis
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.719 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58dabcb0-4997-48a1-816b-d257b8a0a2a6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.720 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3cf9fd-8bd5-41e2-8b37-07bebb315e7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:30 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:30.721 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6 namespace which is not needed anymore
Jan 21 23:58:30 compute-1 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000049.scope: Deactivated successfully.
Jan 21 23:58:30 compute-1 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000049.scope: Consumed 13.859s CPU time.
Jan 21 23:58:30 compute-1 systemd-machined[153970]: Machine qemu-34-instance-00000049 terminated.
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.866 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.875 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.920 182717 INFO nova.virt.libvirt.driver [-] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Instance destroyed successfully.
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.921 182717 DEBUG nova.objects.instance [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lazy-loading 'resources' on Instance uuid 337d88a9-a34b-4c90-bf0d-0418533ae52d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:30 compute-1 neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6[221333]: [NOTICE]   (221337) : haproxy version is 2.8.14-c23fe91
Jan 21 23:58:30 compute-1 neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6[221333]: [NOTICE]   (221337) : path to executable is /usr/sbin/haproxy
Jan 21 23:58:30 compute-1 neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6[221333]: [WARNING]  (221337) : Exiting Master process...
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.933 182717 DEBUG oslo_concurrency.lockutils [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.934 182717 DEBUG oslo_concurrency.lockutils [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:30 compute-1 neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6[221333]: [ALERT]    (221337) : Current worker (221339) exited with code 143 (Terminated)
Jan 21 23:58:30 compute-1 neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6[221333]: [WARNING]  (221337) : All workers exited. Exiting... (0)
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.934 182717 DEBUG nova.network.neutron [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:58:30 compute-1 systemd[1]: libpod-0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01.scope: Deactivated successfully.
Jan 21 23:58:30 compute-1 podman[221527]: 2026-01-21 23:58:30.945472751 +0000 UTC m=+0.071081829 container died 0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.951 182717 DEBUG nova.virt.libvirt.vif [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=73,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJmcUUCRNBDC7pEUz/tpS2M2Bh7QKoMuCvSGq0OlhfgBB2gHHm0vZwJ3C6s56OZFiBK7PSn211goCBxjEIAa7rURlcZpYO0vnVohce5WvMJCsd1LTVs/VNJbZ3GaVOQFkg==',key_name='tempest-keypair-2098695703',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:58:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='40a322b32cda438b83f33ec51a9007dc',ramdisk_id='',reservation_id='r-mpbb2z0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersV294TestFqdnHostnames-36171568',owner_user_name='tempest-ServersV294TestFqdnHostnames-36171568-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d31ef0e2c7354f47adb7b7f072c28fae',uuid=337d88a9-a34b-4c90-bf0d-0418533ae52d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "address": "fa:16:3e:38:24:84", "network": {"id": "58dabcb0-4997-48a1-816b-d257b8a0a2a6", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-94429612-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40a322b32cda438b83f33ec51a9007dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d50f3fc-9e", "ovs_interfaceid": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.952 182717 DEBUG nova.network.os_vif_util [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Converting VIF {"id": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "address": "fa:16:3e:38:24:84", "network": {"id": "58dabcb0-4997-48a1-816b-d257b8a0a2a6", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-94429612-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40a322b32cda438b83f33ec51a9007dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d50f3fc-9e", "ovs_interfaceid": "0d50f3fc-9e12-4e56-ba52-98ff14988caa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.954 182717 DEBUG nova.network.os_vif_util [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:24:84,bridge_name='br-int',has_traffic_filtering=True,id=0d50f3fc-9e12-4e56-ba52-98ff14988caa,network=Network(58dabcb0-4997-48a1-816b-d257b8a0a2a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d50f3fc-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.954 182717 DEBUG os_vif [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:24:84,bridge_name='br-int',has_traffic_filtering=True,id=0d50f3fc-9e12-4e56-ba52-98ff14988caa,network=Network(58dabcb0-4997-48a1-816b-d257b8a0a2a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d50f3fc-9e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.958 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.959 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d50f3fc-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.962 182717 DEBUG nova.compute.manager [req-cc488ddc-d89a-4d66-a65b-6a6a1ac7d121 req-04d41213-95ea-444e-821b-fa5eb745a031 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-unplugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.962 182717 DEBUG oslo_concurrency.lockutils [req-cc488ddc-d89a-4d66-a65b-6a6a1ac7d121 req-04d41213-95ea-444e-821b-fa5eb745a031 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.962 182717 DEBUG oslo_concurrency.lockutils [req-cc488ddc-d89a-4d66-a65b-6a6a1ac7d121 req-04d41213-95ea-444e-821b-fa5eb745a031 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.963 182717 DEBUG oslo_concurrency.lockutils [req-cc488ddc-d89a-4d66-a65b-6a6a1ac7d121 req-04d41213-95ea-444e-821b-fa5eb745a031 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.963 182717 DEBUG nova.compute.manager [req-cc488ddc-d89a-4d66-a65b-6a6a1ac7d121 req-04d41213-95ea-444e-821b-fa5eb745a031 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-unplugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.963 182717 WARNING nova.compute.manager [req-cc488ddc-d89a-4d66-a65b-6a6a1ac7d121 req-04d41213-95ea-444e-821b-fa5eb745a031 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received unexpected event network-vif-unplugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 for instance with vm_state active and task_state None.
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.968 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.970 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.973 182717 INFO os_vif [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:24:84,bridge_name='br-int',has_traffic_filtering=True,id=0d50f3fc-9e12-4e56-ba52-98ff14988caa,network=Network(58dabcb0-4997-48a1-816b-d257b8a0a2a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d50f3fc-9e')
Jan 21 23:58:30 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01-userdata-shm.mount: Deactivated successfully.
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.974 182717 INFO nova.virt.libvirt.driver [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Deleting instance files /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d_del
Jan 21 23:58:30 compute-1 nova_compute[182713]: 2026-01-21 23:58:30.975 182717 INFO nova.virt.libvirt.driver [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Deletion of /var/lib/nova/instances/337d88a9-a34b-4c90-bf0d-0418533ae52d_del complete
Jan 21 23:58:30 compute-1 systemd[1]: var-lib-containers-storage-overlay-3aa7ce989211752149f65cf59feadfa20c51f09212ade832ca01b72728ee9a03-merged.mount: Deactivated successfully.
Jan 21 23:58:30 compute-1 podman[221527]: 2026-01-21 23:58:30.987681171 +0000 UTC m=+0.113290269 container cleanup 0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:58:31 compute-1 systemd[1]: libpod-conmon-0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01.scope: Deactivated successfully.
Jan 21 23:58:31 compute-1 podman[221570]: 2026-01-21 23:58:31.067100667 +0000 UTC m=+0.053751467 container remove 0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.072 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[85361acf-53ce-45eb-81f3-d73e2a12a043]: (4, ('Wed Jan 21 11:58:30 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6 (0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01)\n0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01\nWed Jan 21 11:58:30 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6 (0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01)\n0bb55bb0aceaf3fcc5dbecea6bbca74b1e2f52ccdc1a4da8ef344600e2130d01\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.074 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[829d0df5-546b-4df6-bc53-5b35eadc3cb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.075 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58dabcb0-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.078 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:31 compute-1 kernel: tap58dabcb0-40: left promiscuous mode
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.151 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.153 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[89b270a8-a253-4e3e-be7d-43bd5fc9c0d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.155 182717 INFO nova.compute.manager [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Took 1.17 seconds to destroy the instance on the hypervisor.
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.156 182717 DEBUG oslo.service.loopingcall [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.156 182717 DEBUG nova.compute.manager [-] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.156 182717 DEBUG nova.network.neutron [-] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.170 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.177 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[35fe8c78-c809-4de9-83a1-20aa4c118103]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.178 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b87786-dd3e-4d2d-967d-bd6c69422979]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.202 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[957c88d7-bf1c-475f-9b9d-e98a9cba3c28]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445857, 'reachable_time': 39840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221584, 'error': None, 'target': 'ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.205 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58dabcb0-4997-48a1-816b-d257b8a0a2a6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.205 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc93959-e972-42be-a86e-9de53466ea8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:31 compute-1 systemd[1]: run-netns-ovnmeta\x2d58dabcb0\x2d4997\x2d48a1\x2d816b\x2dd257b8a0a2a6.mount: Deactivated successfully.
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.787 182717 DEBUG nova.compute.manager [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-deleted-63932621-f0d1-4f08-8ce5-b5fa120bcc62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.788 182717 INFO nova.compute.manager [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Neutron deleted interface 63932621-f0d1-4f08-8ce5-b5fa120bcc62; detaching it from the instance and deleting it from the info cache
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.788 182717 DEBUG nova.network.neutron [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.822 182717 DEBUG nova.objects.instance [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lazy-loading 'system_metadata' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.961 104184 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 3de59374-0851-4393-9f03-0d827e0b8d7c with type ""
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.964 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:19:e5 10.100.0.5'], port_security=['fa:16:3e:db:19:e5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-85120559', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-85120559', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '4', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ebbb51e0-ecfc-404f-a578-681300a57aa8) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.966 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ebbb51e0-ecfc-404f-a578-681300a57aa8 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 unbound from our chassis
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.971 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.977 182717 DEBUG nova.objects.instance [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lazy-loading 'flavor' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.981 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.982 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.982 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.982 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:58:31 compute-1 ovn_controller[94841]: 2026-01-21T23:58:31Z|00259|binding|INFO|Removing iface tapebbb51e0-ec ovn-installed in OVS
Jan 21 23:58:31 compute-1 ovn_controller[94841]: 2026-01-21T23:58:31Z|00260|binding|INFO|Removing lport ebbb51e0-ecfc-404f-a578-681300a57aa8 ovn-installed in OVS
Jan 21 23:58:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:31.990 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1bb72949-4b3a-4c01-a3ce-7fe4e04bf4b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:31 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.994 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:31.999 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.020 182717 DEBUG nova.virt.libvirt.vif [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.021 182717 DEBUG nova.network.os_vif_util [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converting VIF {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.022 182717 DEBUG nova.network.os_vif_util [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.027 182717 DEBUG nova.virt.libvirt.guest [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:30:c6:cf"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap63932621-f0"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.028 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[736da4e2-cf33-46bc-985f-cae8668319e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.030 182717 DEBUG nova.virt.libvirt.guest [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:30:c6:cf"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap63932621-f0"/></interface>not found in domain: <domain type='kvm' id='33'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <name>instance-00000047</name>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <uuid>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</uuid>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:58:30</nova:creationTime>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:port uuid="da88332f-f709-4521-a7e5-faca686cf825">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:port uuid="ebbb51e0-ecfc-404f-a578-681300a57aa8">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:58:32 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <resource>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </resource>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <system>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='serial'>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='uuid'>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </system>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <os>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </os>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <features>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </features>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk' index='2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <backingStore/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       </backingStore>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.030 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[043eef7e-12a0-4117-adca-5600b279f6b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.config' index='1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <backingStore/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <readonly/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:3e:bc:4e'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='tap43589933-19'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:92:0a:be'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='tapda88332f-f7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='net2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:db:19:e5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='tapebbb51e0-ec'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='net3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log' append='off'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       </target>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log' append='off'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </console>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </graphics>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <video>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </video>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </watchdog>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c342,c529</label>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c342,c529</imagelabel>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:58:32 compute-1 nova_compute[182713]: </domain>
Jan 21 23:58:32 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.032 182717 DEBUG nova.virt.libvirt.guest [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:30:c6:cf"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap63932621-f0"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.037 182717 DEBUG nova.virt.libvirt.guest [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:30:c6:cf"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap63932621-f0"/></interface>not found in domain: <domain type='kvm' id='33'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <name>instance-00000047</name>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <uuid>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</uuid>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:58:30</nova:creationTime>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:port uuid="da88332f-f709-4521-a7e5-faca686cf825">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:port uuid="ebbb51e0-ecfc-404f-a578-681300a57aa8">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:58:32 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <resource>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </resource>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <system>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='serial'>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='uuid'>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </system>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <os>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </os>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <features>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </features>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk' index='2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <backingStore/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       </backingStore>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.config' index='1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <backingStore/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <readonly/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:3e:bc:4e'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='tap43589933-19'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:92:0a:be'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='tapda88332f-f7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='net2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:db:19:e5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='tapebbb51e0-ec'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='net3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log' append='off'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       </target>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log' append='off'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </console>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </graphics>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <video>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </video>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </watchdog>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c342,c529</label>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c342,c529</imagelabel>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:58:32 compute-1 nova_compute[182713]: </domain>
Jan 21 23:58:32 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.037 182717 WARNING nova.virt.libvirt.driver [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Detaching interface fa:16:3e:30:c6:cf failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap63932621-f0' not found.
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.038 182717 DEBUG nova.virt.libvirt.vif [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.039 182717 DEBUG nova.network.os_vif_util [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converting VIF {"id": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "address": "fa:16:3e:30:c6:cf", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63932621-f0", "ovs_interfaceid": "63932621-f0d1-4f08-8ce5-b5fa120bcc62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.040 182717 DEBUG nova.network.os_vif_util [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.041 182717 DEBUG os_vif [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.043 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.043 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63932621-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.043 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.046 182717 INFO os_vif [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:c6:cf,bridge_name='br-int',has_traffic_filtering=True,id=63932621-f0d1-4f08-8ce5-b5fa120bcc62,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63932621-f0')
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.048 182717 DEBUG nova.virt.libvirt.guest [req-93aaa9c1-f0d5-45d3-a91b-8936241ad934 req-5f0a4c0b-cc1d-4ae0-8828-4a2f22088035 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:58:32</nova:creationTime>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:port uuid="da88332f-f709-4521-a7e5-faca686cf825">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:port uuid="ebbb51e0-ecfc-404f-a578-681300a57aa8">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:58:32 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:58:32 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.060 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[554052f1-c0f4-4d1b-b9ec-e812dba7c7f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.074 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.077 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[79c9ee5f-ab1f-46c8-8755-9e16e4ae8c07]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441880, 'reachable_time': 27223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221591, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.094 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[32f15088-4fd1-4e55-be93-acd0440a3b85]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441895, 'tstamp': 441895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221592, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441899, 'tstamp': 441899}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221592, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.095 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.099 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.099 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.099 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.100 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.109 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.173 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.174 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.232 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.262 182717 DEBUG oslo_concurrency.lockutils [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.262 182717 DEBUG oslo_concurrency.lockutils [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.263 182717 DEBUG oslo_concurrency.lockutils [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.263 182717 DEBUG oslo_concurrency.lockutils [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.263 182717 DEBUG oslo_concurrency.lockutils [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.279 182717 INFO nova.compute.manager [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Terminating instance
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.291 182717 DEBUG nova.compute.manager [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:58:32 compute-1 kernel: tap43589933-19 (unregistering): left promiscuous mode
Jan 21 23:58:32 compute-1 NetworkManager[54952]: <info>  [1769039912.3160] device (tap43589933-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.327 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 ovn_controller[94841]: 2026-01-21T23:58:32Z|00261|binding|INFO|Releasing lport 43589933-1997-41c6-9aa3-54f71a1330b8 from this chassis (sb_readonly=0)
Jan 21 23:58:32 compute-1 ovn_controller[94841]: 2026-01-21T23:58:32Z|00262|binding|INFO|Setting lport 43589933-1997-41c6-9aa3-54f71a1330b8 down in Southbound
Jan 21 23:58:32 compute-1 ovn_controller[94841]: 2026-01-21T23:58:32Z|00263|binding|INFO|Removing iface tap43589933-19 ovn-installed in OVS
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.330 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 kernel: tapda88332f-f7 (unregistering): left promiscuous mode
Jan 21 23:58:32 compute-1 ovn_controller[94841]: 2026-01-21T23:58:32Z|00264|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.344 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:bc:4e 10.100.0.9'], port_security=['fa:16:3e:3e:bc:4e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3de5207d-5e5a-404a-9582-14ba2d715885', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=43589933-1997-41c6-9aa3-54f71a1330b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:32 compute-1 NetworkManager[54952]: <info>  [1769039912.3462] device (tapda88332f-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.346 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 43589933-1997-41c6-9aa3-54f71a1330b8 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 unbound from our chassis
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.347 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.355 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 kernel: tapebbb51e0-ec (unregistering): left promiscuous mode
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.359 182717 DEBUG nova.network.neutron [-] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:32 compute-1 ovn_controller[94841]: 2026-01-21T23:58:32Z|00265|binding|INFO|Releasing lport da88332f-f709-4521-a7e5-faca686cf825 from this chassis (sb_readonly=0)
Jan 21 23:58:32 compute-1 ovn_controller[94841]: 2026-01-21T23:58:32Z|00266|binding|INFO|Setting lport da88332f-f709-4521-a7e5-faca686cf825 down in Southbound
Jan 21 23:58:32 compute-1 ovn_controller[94841]: 2026-01-21T23:58:32Z|00267|binding|INFO|Removing iface tapda88332f-f7 ovn-installed in OVS
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.363 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 NetworkManager[54952]: <info>  [1769039912.3667] device (tapebbb51e0-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.371 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:0a:be 10.100.0.6'], port_security=['fa:16:3e:92:0a:be 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '4', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=da88332f-f709-4521-a7e5-faca686cf825) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.372 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8e3b3839-5fb3-400b-b455-b6b795e70e52]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.383 182717 INFO nova.compute.manager [-] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Took 1.23 seconds to deallocate network for instance.
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.402 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.415 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.419 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.423 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf2d1e4-326d-4fde-8f30-74060adccf96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.427 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5afa42-78a7-40b6-808e-bce97c246c2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000047.scope: Deactivated successfully.
Jan 21 23:58:32 compute-1 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000047.scope: Consumed 15.868s CPU time.
Jan 21 23:58:32 compute-1 systemd-machined[153970]: Machine qemu-33-instance-00000047 terminated.
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.459 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ff146588-256b-4ca0-9933-96914ecde621]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.467 182717 DEBUG oslo_concurrency.lockutils [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.467 182717 DEBUG oslo_concurrency.lockutils [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.478 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[af1d343f-e7c8-4fce-8186-725e31f52d73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441880, 'reachable_time': 27223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221620, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.498 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[729c8565-fb99-4300-a773-0b4827f19ae2]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441895, 'tstamp': 441895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221621, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441899, 'tstamp': 441899}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221621, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.499 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.501 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.509 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.509 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.510 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.510 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.511 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.512 104184 INFO neutron.agent.ovn.metadata.agent [-] Port da88332f-f709-4521-a7e5-faca686cf825 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 unbound from our chassis
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.513 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1995baab-0f8d-4658-a4fc-2d21868dc592, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:58:32 compute-1 NetworkManager[54952]: <info>  [1769039912.5137] manager: (tap43589933-19): new Tun device (/org/freedesktop/NetworkManager/Devices/126)
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.514 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b994af29-411a-4c57-a1a2-5782f62f1c54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.515 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 namespace which is not needed anymore
Jan 21 23:58:32 compute-1 NetworkManager[54952]: <info>  [1769039912.5242] manager: (tapda88332f-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/127)
Jan 21 23:58:32 compute-1 NetworkManager[54952]: <info>  [1769039912.5375] manager: (tapebbb51e0-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/128)
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.544 182717 DEBUG nova.compute.manager [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-deleted-ebbb51e0-ecfc-404f-a578-681300a57aa8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.545 182717 INFO nova.compute.manager [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Neutron deleted interface ebbb51e0-ecfc-404f-a578-681300a57aa8; detaching it from the instance and deleting it from the info cache
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.545 182717 DEBUG nova.network.neutron [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.564 182717 DEBUG nova.compute.provider_tree [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.573 182717 DEBUG nova.objects.instance [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lazy-loading 'system_metadata' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.577 182717 INFO nova.virt.libvirt.driver [-] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Instance destroyed successfully.
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.578 182717 DEBUG nova.objects.instance [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'resources' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.580 182717 DEBUG nova.scheduler.client.report [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.605 182717 DEBUG nova.objects.instance [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lazy-loading 'flavor' on Instance uuid c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.608 182717 DEBUG nova.virt.libvirt.vif [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.608 182717 DEBUG nova.network.os_vif_util [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.609 182717 DEBUG nova.network.os_vif_util [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:bc:4e,bridge_name='br-int',has_traffic_filtering=True,id=43589933-1997-41c6-9aa3-54f71a1330b8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43589933-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.609 182717 DEBUG os_vif [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:bc:4e,bridge_name='br-int',has_traffic_filtering=True,id=43589933-1997-41c6-9aa3-54f71a1330b8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43589933-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.610 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.611 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43589933-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.612 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.614 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.618 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.620 182717 INFO os_vif [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:bc:4e,bridge_name='br-int',has_traffic_filtering=True,id=43589933-1997-41c6-9aa3-54f71a1330b8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43589933-19')
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.621 182717 DEBUG nova.virt.libvirt.vif [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.621 182717 DEBUG nova.network.os_vif_util [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.622 182717 DEBUG nova.network.os_vif_util [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:92:0a:be,bridge_name='br-int',has_traffic_filtering=True,id=da88332f-f709-4521-a7e5-faca686cf825,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda88332f-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.622 182717 DEBUG os_vif [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:0a:be,bridge_name='br-int',has_traffic_filtering=True,id=da88332f-f709-4521-a7e5-faca686cf825,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda88332f-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.623 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.623 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda88332f-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.624 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.626 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.627 182717 DEBUG oslo_concurrency.lockutils [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.629 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.632 182717 DEBUG nova.virt.libvirt.vif [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.633 182717 DEBUG nova.network.os_vif_util [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converting VIF {"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.633 182717 DEBUG nova.network.os_vif_util [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:19:e5,bridge_name='br-int',has_traffic_filtering=True,id=ebbb51e0-ecfc-404f-a578-681300a57aa8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapebbb51e0-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.634 182717 INFO os_vif [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:0a:be,bridge_name='br-int',has_traffic_filtering=True,id=da88332f-f709-4521-a7e5-faca686cf825,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda88332f-f7')
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.635 182717 DEBUG nova.virt.libvirt.vif [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.635 182717 DEBUG nova.network.os_vif_util [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.635 182717 DEBUG nova.network.os_vif_util [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:19:e5,bridge_name='br-int',has_traffic_filtering=True,id=ebbb51e0-ecfc-404f-a578-681300a57aa8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapebbb51e0-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.636 182717 DEBUG os_vif [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:19:e5,bridge_name='br-int',has_traffic_filtering=True,id=ebbb51e0-ecfc-404f-a578-681300a57aa8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapebbb51e0-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.637 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.637 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebbb51e0-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.639 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.639 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.640 182717 DEBUG nova.virt.libvirt.guest [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:db:19:e5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapebbb51e0-ec"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.644 182717 INFO os_vif [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:19:e5,bridge_name='br-int',has_traffic_filtering=True,id=ebbb51e0-ecfc-404f-a578-681300a57aa8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapebbb51e0-ec')
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.644 182717 INFO nova.virt.libvirt.driver [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Deleting instance files /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9_del
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.645 182717 INFO nova.virt.libvirt.driver [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Deletion of /var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9_del complete
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.651 182717 DEBUG nova.virt.libvirt.driver [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Attempting to detach device tapebbb51e0-ec from instance c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.651 182717 DEBUG nova.virt.libvirt.guest [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] detach device xml: <interface type="ethernet">
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:db:19:e5"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <target dev="tapebbb51e0-ec"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]: </interface>
Jan 21 23:58:32 compute-1 nova_compute[182713]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 21 23:58:32 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[220984]: [NOTICE]   (220988) : haproxy version is 2.8.14-c23fe91
Jan 21 23:58:32 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[220984]: [NOTICE]   (220988) : path to executable is /usr/sbin/haproxy
Jan 21 23:58:32 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[220984]: [WARNING]  (220988) : Exiting Master process...
Jan 21 23:58:32 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[220984]: [ALERT]    (220988) : Current worker (220990) exited with code 143 (Terminated)
Jan 21 23:58:32 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[220984]: [WARNING]  (220988) : All workers exited. Exiting... (0)
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.660 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.661 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5452MB free_disk=73.27441024780273GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.661 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.661 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:32 compute-1 systemd[1]: libpod-dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5.scope: Deactivated successfully.
Jan 21 23:58:32 compute-1 conmon[220984]: conmon dc9ee907f3258a2af738 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5.scope/container/memory.events
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.666 182717 DEBUG nova.virt.libvirt.guest [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:db:19:e5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapebbb51e0-ec"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:58:32 compute-1 podman[221686]: 2026-01-21 23:58:32.669098632 +0000 UTC m=+0.053934673 container died dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.669 182717 DEBUG nova.virt.libvirt.guest [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:db:19:e5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapebbb51e0-ec"/></interface>not found in domain: <domain type='kvm'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <name>instance-00000047</name>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <uuid>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</uuid>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:57:25</nova:creationTime>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:58:32 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <system>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='serial'>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='uuid'>c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </system>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <os>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </os>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <features>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </features>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='partial'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <model fallback='allow'>Nehalem</model>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/disk.config'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <readonly/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:3e:bc:4e'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='tap43589933-19'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:92:0a:be'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target dev='tapda88332f-f7'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log' append='off'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       </target>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <console type='pty'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9/console.log' append='off'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </console>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </input>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <graphics type='vnc' port='-1' autoport='yes' listen='::0'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </graphics>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <video>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </video>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:58:32 compute-1 nova_compute[182713]: </domain>
Jan 21 23:58:32 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.669 182717 INFO nova.virt.libvirt.driver [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Successfully detached device tapebbb51e0-ec from instance c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 from the persistent domain config.
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.670 182717 DEBUG nova.virt.libvirt.vif [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-936465965',display_name='tempest-AttachInterfacesTestJSON-server-936465965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-936465965',id=71,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8uUXqPwbsFq/YiTWyOW9dXr7d73wRm6448mROH/nejo5yLwAs6kN5js+eg5mRrAu9wdd3Q77g1T/xRTktMBrGTEhLl3+r8EWHZg0auWjRb9BMWnMu9DqjgoX1NYmMOKg==',key_name='tempest-keypair-1559391228',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-fh2clcvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.670 182717 DEBUG nova.network.os_vif_util [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converting VIF {"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.670 182717 DEBUG nova.network.os_vif_util [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:19:e5,bridge_name='br-int',has_traffic_filtering=True,id=ebbb51e0-ecfc-404f-a578-681300a57aa8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapebbb51e0-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.671 182717 DEBUG os_vif [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:19:e5,bridge_name='br-int',has_traffic_filtering=True,id=ebbb51e0-ecfc-404f-a578-681300a57aa8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapebbb51e0-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.673 182717 INFO nova.scheduler.client.report [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Deleted allocations for instance 337d88a9-a34b-4c90-bf0d-0418533ae52d
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.674 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.675 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebbb51e0-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.675 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.677 182717 INFO os_vif [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:19:e5,bridge_name='br-int',has_traffic_filtering=True,id=ebbb51e0-ecfc-404f-a578-681300a57aa8,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapebbb51e0-ec')
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.678 182717 DEBUG nova.virt.libvirt.guest [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:name>tempest-AttachInterfacesTestJSON-server-936465965</nova:name>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:58:32</nova:creationTime>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:port uuid="43589933-1997-41c6-9aa3-54f71a1330b8">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     <nova:port uuid="da88332f-f709-4521-a7e5-faca686cf825">
Jan 21 23:58:32 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:58:32 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:58:32 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:58:32 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:58:32 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.682 182717 DEBUG nova.compute.manager [req-7c04a3ae-19b3-46d0-96f4-580531a5ea47 req-22d4a720-90f2-46da-b370-13467f5ab6e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Received event network-vif-deleted-0d50f3fc-9e12-4e56-ba52-98ff14988caa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:32 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5-userdata-shm.mount: Deactivated successfully.
Jan 21 23:58:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-31ef042e5cdf851c57dd883f14c43712598f53809934173ed133db205e2c95b3-merged.mount: Deactivated successfully.
Jan 21 23:58:32 compute-1 podman[221686]: 2026-01-21 23:58:32.711288681 +0000 UTC m=+0.096124722 container cleanup dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:58:32 compute-1 systemd[1]: libpod-conmon-dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5.scope: Deactivated successfully.
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.759 182717 INFO nova.compute.manager [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.760 182717 DEBUG oslo.service.loopingcall [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.760 182717 DEBUG nova.compute.manager [-] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.761 182717 DEBUG nova.network.neutron [-] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.783 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.783 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.784 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.804 182717 DEBUG oslo_concurrency.lockutils [None req-9292a1aa-694c-4e1a-bc79-2654a7b5a37b d31ef0e2c7354f47adb7b7f072c28fae 40a322b32cda438b83f33ec51a9007dc - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.849s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:32 compute-1 podman[221721]: 2026-01-21 23:58:32.808940018 +0000 UTC m=+0.067736067 container remove dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.815 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ccc00613-1d9f-46b7-bf9b-a58b1d894cde]: (4, ('Wed Jan 21 11:58:32 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 (dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5)\ndc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5\nWed Jan 21 11:58:32 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 (dc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5)\ndc9ee907f3258a2af738ff6ab6c0b3595430226990f49383637dfee773c11cb5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.817 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1e7ab279-4315-4c01-99c9-9d689f942970]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.818 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:32 compute-1 kernel: tap1995baab-00: left promiscuous mode
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.820 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.835 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.839 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[66393c4c-cd78-4c12-9457-2d07d56ef3fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.856 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.864 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7e6a2a75-d7c7-4029-ace0-0ef2f2b2394d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.866 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[597a7685-6e8a-4c64-8a55-d20c3429333c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.873 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.884 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[60fef77d-4cc9-4586-b2ea-8041efdc3853]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441872, 'reachable_time': 38336, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221735, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.887 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:58:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:32.887 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[0d88a581-8c45-42cc-aa8f-41e9e4c1b6f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:32 compute-1 systemd[1]: run-netns-ovnmeta\x2d1995baab\x2d0f8d\x2d4658\x2da4fc\x2d2d21868dc592.mount: Deactivated successfully.
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.902 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:58:32 compute-1 nova_compute[182713]: 2026-01-21 23:58:32.903 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.028 182717 INFO nova.network.neutron [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Port 63932621-f0d1-4f08-8ce5-b5fa120bcc62 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.082 182717 DEBUG nova.compute.manager [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-plugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.083 182717 DEBUG oslo_concurrency.lockutils [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.084 182717 DEBUG oslo_concurrency.lockutils [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.084 182717 DEBUG oslo_concurrency.lockutils [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.085 182717 DEBUG nova.compute.manager [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-plugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.085 182717 WARNING nova.compute.manager [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received unexpected event network-vif-plugged-63932621-f0d1-4f08-8ce5-b5fa120bcc62 for instance with vm_state active and task_state deleting.
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.085 182717 DEBUG nova.compute.manager [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Received event network-vif-unplugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.086 182717 DEBUG oslo_concurrency.lockutils [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.086 182717 DEBUG oslo_concurrency.lockutils [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.087 182717 DEBUG oslo_concurrency.lockutils [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.087 182717 DEBUG nova.compute.manager [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] No waiting events found dispatching network-vif-unplugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.088 182717 WARNING nova.compute.manager [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Received unexpected event network-vif-unplugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa for instance with vm_state deleted and task_state None.
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.088 182717 DEBUG nova.compute.manager [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Received event network-vif-plugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.088 182717 DEBUG oslo_concurrency.lockutils [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.089 182717 DEBUG oslo_concurrency.lockutils [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.089 182717 DEBUG oslo_concurrency.lockutils [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "337d88a9-a34b-4c90-bf0d-0418533ae52d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.090 182717 DEBUG nova.compute.manager [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] No waiting events found dispatching network-vif-plugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.090 182717 WARNING nova.compute.manager [req-946dca1c-52b2-4bf3-b69f-ec93b663315b req-5ddf6bb9-b139-439b-aa32-20c641437452 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Received unexpected event network-vif-plugged-0d50f3fc-9e12-4e56-ba52-98ff14988caa for instance with vm_state deleted and task_state None.
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.904 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.905 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.905 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.905 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.942 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.943 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.946 182717 DEBUG nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-unplugged-43589933-1997-41c6-9aa3-54f71a1330b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.947 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.947 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.947 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.947 182717 DEBUG nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-unplugged-43589933-1997-41c6-9aa3-54f71a1330b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.947 182717 DEBUG nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-unplugged-43589933-1997-41c6-9aa3-54f71a1330b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.948 182717 DEBUG nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-plugged-43589933-1997-41c6-9aa3-54f71a1330b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.948 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.948 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.948 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.948 182717 DEBUG nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-plugged-43589933-1997-41c6-9aa3-54f71a1330b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.948 182717 WARNING nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received unexpected event network-vif-plugged-43589933-1997-41c6-9aa3-54f71a1330b8 for instance with vm_state active and task_state deleting.
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.949 182717 DEBUG nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-unplugged-da88332f-f709-4521-a7e5-faca686cf825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.949 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.949 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.949 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.949 182717 DEBUG nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-unplugged-da88332f-f709-4521-a7e5-faca686cf825 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.950 182717 DEBUG nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-unplugged-da88332f-f709-4521-a7e5-faca686cf825 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.950 182717 DEBUG nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-plugged-da88332f-f709-4521-a7e5-faca686cf825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.950 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.950 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.950 182717 DEBUG oslo_concurrency.lockutils [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.951 182717 DEBUG nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] No waiting events found dispatching network-vif-plugged-da88332f-f709-4521-a7e5-faca686cf825 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:33 compute-1 nova_compute[182713]: 2026-01-21 23:58:33.951 182717 WARNING nova.compute.manager [req-baf4b082-4827-43ef-8a20-61d79c24c4a1 req-bd82ff50-2885-43ac-a782-411265500beb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received unexpected event network-vif-plugged-da88332f-f709-4521-a7e5-faca686cf825 for instance with vm_state active and task_state deleting.
Jan 21 23:58:34 compute-1 nova_compute[182713]: 2026-01-21 23:58:34.774 182717 DEBUG nova.compute.manager [req-ef9302a0-be7e-4a0e-935d-26ce63ad983a req-e8e3c914-f4d4-49ac-b1fe-ea8017b6a75e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-deleted-da88332f-f709-4521-a7e5-faca686cf825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:34 compute-1 nova_compute[182713]: 2026-01-21 23:58:34.774 182717 INFO nova.compute.manager [req-ef9302a0-be7e-4a0e-935d-26ce63ad983a req-e8e3c914-f4d4-49ac-b1fe-ea8017b6a75e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Neutron deleted interface da88332f-f709-4521-a7e5-faca686cf825; detaching it from the instance and deleting it from the info cache
Jan 21 23:58:34 compute-1 nova_compute[182713]: 2026-01-21 23:58:34.775 182717 DEBUG nova.network.neutron [req-ef9302a0-be7e-4a0e-935d-26ce63ad983a req-e8e3c914-f4d4-49ac-b1fe-ea8017b6a75e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:34 compute-1 nova_compute[182713]: 2026-01-21 23:58:34.778 182717 DEBUG nova.network.neutron [-] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:34 compute-1 nova_compute[182713]: 2026-01-21 23:58:34.827 182717 DEBUG nova.compute.manager [req-ef9302a0-be7e-4a0e-935d-26ce63ad983a req-e8e3c914-f4d4-49ac-b1fe-ea8017b6a75e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Detach interface failed, port_id=da88332f-f709-4521-a7e5-faca686cf825, reason: Instance c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 21 23:58:34 compute-1 nova_compute[182713]: 2026-01-21 23:58:34.831 182717 INFO nova.compute.manager [-] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Took 2.07 seconds to deallocate network for instance.
Jan 21 23:58:34 compute-1 nova_compute[182713]: 2026-01-21 23:58:34.951 182717 DEBUG oslo_concurrency.lockutils [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:34 compute-1 nova_compute[182713]: 2026-01-21 23:58:34.952 182717 DEBUG oslo_concurrency.lockutils [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:35 compute-1 nova_compute[182713]: 2026-01-21 23:58:35.043 182717 DEBUG nova.compute.provider_tree [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:58:35 compute-1 nova_compute[182713]: 2026-01-21 23:58:35.065 182717 DEBUG nova.scheduler.client.report [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:58:35 compute-1 nova_compute[182713]: 2026-01-21 23:58:35.094 182717 DEBUG oslo_concurrency.lockutils [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:35 compute-1 nova_compute[182713]: 2026-01-21 23:58:35.165 182717 INFO nova.scheduler.client.report [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Deleted allocations for instance c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9
Jan 21 23:58:35 compute-1 nova_compute[182713]: 2026-01-21 23:58:35.309 182717 DEBUG oslo_concurrency.lockutils [None req-ab05fc8b-074b-4881-a880-fb3a4d2a03ab 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:35 compute-1 nova_compute[182713]: 2026-01-21 23:58:35.767 182717 DEBUG nova.network.neutron [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Updating instance_info_cache with network_info: [{"id": "43589933-1997-41c6-9aa3-54f71a1330b8", "address": "fa:16:3e:3e:bc:4e", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43589933-19", "ovs_interfaceid": "43589933-1997-41c6-9aa3-54f71a1330b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da88332f-f709-4521-a7e5-faca686cf825", "address": "fa:16:3e:92:0a:be", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda88332f-f7", "ovs_interfaceid": "da88332f-f709-4521-a7e5-faca686cf825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "address": "fa:16:3e:db:19:e5", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbb51e0-ec", "ovs_interfaceid": "ebbb51e0-ecfc-404f-a578-681300a57aa8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:35 compute-1 nova_compute[182713]: 2026-01-21 23:58:35.790 182717 DEBUG oslo_concurrency.lockutils [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:35 compute-1 nova_compute[182713]: 2026-01-21 23:58:35.836 182717 DEBUG oslo_concurrency.lockutils [None req-f8213aff-4f1c-4f00-aeaa-898fb01fcf42 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9-63932621-f0d1-4f08-8ce5-b5fa120bcc62" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 6.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:36 compute-1 nova_compute[182713]: 2026-01-21 23:58:36.172 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:36 compute-1 podman[221737]: 2026-01-21 23:58:36.614387387 +0000 UTC m=+0.086669130 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:58:36 compute-1 podman[221736]: 2026-01-21 23:58:36.664176981 +0000 UTC m=+0.141521909 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 23:58:36 compute-1 nova_compute[182713]: 2026-01-21 23:58:36.887 182717 DEBUG nova.compute.manager [req-2125c2c9-989b-4a01-8f32-69e51e2fd2b1 req-5920b9a8-eaff-49d9-b968-f72eae5d6666 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Received event network-vif-deleted-43589933-1997-41c6-9aa3-54f71a1330b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:37 compute-1 nova_compute[182713]: 2026-01-21 23:58:37.640 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:39 compute-1 nova_compute[182713]: 2026-01-21 23:58:39.181 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:39 compute-1 nova_compute[182713]: 2026-01-21 23:58:39.397 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:41 compute-1 nova_compute[182713]: 2026-01-21 23:58:41.175 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:42 compute-1 podman[221788]: 2026-01-21 23:58:42.595275241 +0000 UTC m=+0.073671933 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:58:42 compute-1 podman[221787]: 2026-01-21 23:58:42.602222086 +0000 UTC m=+0.084585510 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 23:58:42 compute-1 nova_compute[182713]: 2026-01-21 23:58:42.643 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:44 compute-1 nova_compute[182713]: 2026-01-21 23:58:44.980 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:44 compute-1 nova_compute[182713]: 2026-01-21 23:58:44.981 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.003 182717 DEBUG nova.compute.manager [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.172 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.173 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.184 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.184 182717 INFO nova.compute.claims [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.324 182717 DEBUG nova.compute.provider_tree [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.340 182717 DEBUG nova.scheduler.client.report [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.379 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.380 182717 DEBUG nova.compute.manager [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.457 182717 DEBUG nova.compute.manager [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.458 182717 DEBUG nova.network.neutron [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.484 182717 INFO nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.524 182717 DEBUG nova.compute.manager [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.685 182717 DEBUG nova.compute.manager [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.688 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.688 182717 INFO nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Creating image(s)
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.689 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.690 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.691 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.716 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.802 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.803 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.804 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.819 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.847 182717 DEBUG nova.policy [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.908 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.910 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.933 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039910.9174674, 337d88a9-a34b-4c90-bf0d-0418533ae52d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.934 182717 INFO nova.compute.manager [-] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] VM Stopped (Lifecycle Event)
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.954 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.956 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.958 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:45 compute-1 nova_compute[182713]: 2026-01-21 23:58:45.988 182717 DEBUG nova.compute.manager [None req-bc1c6ca8-427b-45d7-a834-e58180628578 - - - - - -] [instance: 337d88a9-a34b-4c90-bf0d-0418533ae52d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.025 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.026 182717 DEBUG nova.virt.disk.api [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Checking if we can resize image /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.026 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.081 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.083 182717 DEBUG nova.virt.disk.api [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Cannot resize image /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.084 182717 DEBUG nova.objects.instance [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'migration_context' on Instance uuid 30c7da24-de00-4067-a5d2-f36ad21391c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.101 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.102 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Ensure instance console log exists: /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.103 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.103 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.104 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.177 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:46 compute-1 nova_compute[182713]: 2026-01-21 23:58:46.973 182717 DEBUG nova.network.neutron [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Successfully created port: 917524f9-5334-4b3d-b16f-b9686b1c3528 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:58:47 compute-1 nova_compute[182713]: 2026-01-21 23:58:47.576 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039912.5733252, c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:47 compute-1 nova_compute[182713]: 2026-01-21 23:58:47.576 182717 INFO nova.compute.manager [-] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] VM Stopped (Lifecycle Event)
Jan 21 23:58:47 compute-1 nova_compute[182713]: 2026-01-21 23:58:47.630 182717 DEBUG nova.compute.manager [None req-9832e238-37ca-4d03-b094-9fd6836754f9 - - - - - -] [instance: c1a1b0d8-6b53-486a-bb03-6ecd2eb4a2a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:47 compute-1 nova_compute[182713]: 2026-01-21 23:58:47.646 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:48 compute-1 nova_compute[182713]: 2026-01-21 23:58:48.584 182717 DEBUG nova.network.neutron [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Successfully updated port: 917524f9-5334-4b3d-b16f-b9686b1c3528 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:58:48 compute-1 nova_compute[182713]: 2026-01-21 23:58:48.609 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:48 compute-1 nova_compute[182713]: 2026-01-21 23:58:48.610 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:48 compute-1 nova_compute[182713]: 2026-01-21 23:58:48.610 182717 DEBUG nova.network.neutron [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:58:48 compute-1 nova_compute[182713]: 2026-01-21 23:58:48.686 182717 DEBUG nova.compute.manager [req-bd67ccd8-6f13-497a-8e4c-5dcbd477bf32 req-ef7bcbc1-56df-4b44-be4e-995e1211c739 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-changed-917524f9-5334-4b3d-b16f-b9686b1c3528 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:48 compute-1 nova_compute[182713]: 2026-01-21 23:58:48.687 182717 DEBUG nova.compute.manager [req-bd67ccd8-6f13-497a-8e4c-5dcbd477bf32 req-ef7bcbc1-56df-4b44-be4e-995e1211c739 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing instance network info cache due to event network-changed-917524f9-5334-4b3d-b16f-b9686b1c3528. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:58:48 compute-1 nova_compute[182713]: 2026-01-21 23:58:48.687 182717 DEBUG oslo_concurrency.lockutils [req-bd67ccd8-6f13-497a-8e4c-5dcbd477bf32 req-ef7bcbc1-56df-4b44-be4e-995e1211c739 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:48 compute-1 nova_compute[182713]: 2026-01-21 23:58:48.794 182717 DEBUG nova.network.neutron [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.487 182717 DEBUG nova.network.neutron [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updating instance_info_cache with network_info: [{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.514 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.515 182717 DEBUG nova.compute.manager [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Instance network_info: |[{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.516 182717 DEBUG oslo_concurrency.lockutils [req-bd67ccd8-6f13-497a-8e4c-5dcbd477bf32 req-ef7bcbc1-56df-4b44-be4e-995e1211c739 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.517 182717 DEBUG nova.network.neutron [req-bd67ccd8-6f13-497a-8e4c-5dcbd477bf32 req-ef7bcbc1-56df-4b44-be4e-995e1211c739 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing network info cache for port 917524f9-5334-4b3d-b16f-b9686b1c3528 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.522 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Start _get_guest_xml network_info=[{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.532 182717 WARNING nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.544 182717 DEBUG nova.virt.libvirt.host [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.545 182717 DEBUG nova.virt.libvirt.host [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.552 182717 DEBUG nova.virt.libvirt.host [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.553 182717 DEBUG nova.virt.libvirt.host [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.555 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.555 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.556 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.556 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.556 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.557 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.557 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.557 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.557 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.557 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.558 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.558 182717 DEBUG nova.virt.hardware [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.562 182717 DEBUG nova.virt.libvirt.vif [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-821479065',display_name='tempest-tempest.common.compute-instance-821479065',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-821479065',id=75,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-tqp0ax95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:58:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30c7da24-de00-4067-a5d2-f36ad21391c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.563 182717 DEBUG nova.network.os_vif_util [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.564 182717 DEBUG nova.network.os_vif_util [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:88:c9,bridge_name='br-int',has_traffic_filtering=True,id=917524f9-5334-4b3d-b16f-b9686b1c3528,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap917524f9-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.565 182717 DEBUG nova.objects.instance [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'pci_devices' on Instance uuid 30c7da24-de00-4067-a5d2-f36ad21391c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.585 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:58:50 compute-1 nova_compute[182713]:   <uuid>30c7da24-de00-4067-a5d2-f36ad21391c5</uuid>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   <name>instance-0000004b</name>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <nova:name>tempest-tempest.common.compute-instance-821479065</nova:name>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:58:50</nova:creationTime>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:58:50 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:58:50 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:58:50 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:58:50 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:50 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:58:50 compute-1 nova_compute[182713]:         <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:58:50 compute-1 nova_compute[182713]:         <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:58:50 compute-1 nova_compute[182713]:         <nova:port uuid="917524f9-5334-4b3d-b16f-b9686b1c3528">
Jan 21 23:58:50 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <system>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <entry name="serial">30c7da24-de00-4067-a5d2-f36ad21391c5</entry>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <entry name="uuid">30c7da24-de00-4067-a5d2-f36ad21391c5</entry>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     </system>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   <os>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   </os>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   <features>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   </features>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk.config"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:80:88:c9"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <target dev="tap917524f9-53"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/console.log" append="off"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <video>
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     </video>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:58:50 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:58:50 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:58:50 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:58:50 compute-1 nova_compute[182713]: </domain>
Jan 21 23:58:50 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.587 182717 DEBUG nova.compute.manager [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Preparing to wait for external event network-vif-plugged-917524f9-5334-4b3d-b16f-b9686b1c3528 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.588 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.589 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.589 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.591 182717 DEBUG nova.virt.libvirt.vif [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-821479065',display_name='tempest-tempest.common.compute-instance-821479065',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-821479065',id=75,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-tqp0ax95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:58:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30c7da24-de00-4067-a5d2-f36ad21391c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.591 182717 DEBUG nova.network.os_vif_util [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.592 182717 DEBUG nova.network.os_vif_util [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:88:c9,bridge_name='br-int',has_traffic_filtering=True,id=917524f9-5334-4b3d-b16f-b9686b1c3528,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap917524f9-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.593 182717 DEBUG os_vif [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:88:c9,bridge_name='br-int',has_traffic_filtering=True,id=917524f9-5334-4b3d-b16f-b9686b1c3528,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap917524f9-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.594 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.595 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.596 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.605 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.605 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap917524f9-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.606 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap917524f9-53, col_values=(('external_ids', {'iface-id': '917524f9-5334-4b3d-b16f-b9686b1c3528', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:88:c9', 'vm-uuid': '30c7da24-de00-4067-a5d2-f36ad21391c5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:50 compute-1 NetworkManager[54952]: <info>  [1769039930.6087] manager: (tap917524f9-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.609 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.614 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.615 182717 INFO os_vif [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:88:c9,bridge_name='br-int',has_traffic_filtering=True,id=917524f9-5334-4b3d-b16f-b9686b1c3528,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap917524f9-53')
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.674 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.675 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.675 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:80:88:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:58:50 compute-1 nova_compute[182713]: 2026-01-21 23:58:50.676 182717 INFO nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Using config drive
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.179 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.328 182717 INFO nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Creating config drive at /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk.config
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.339 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx9d75qb5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.484 182717 DEBUG oslo_concurrency.processutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx9d75qb5" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:51 compute-1 kernel: tap917524f9-53: entered promiscuous mode
Jan 21 23:58:51 compute-1 NetworkManager[54952]: <info>  [1769039931.5744] manager: (tap917524f9-53): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Jan 21 23:58:51 compute-1 ovn_controller[94841]: 2026-01-21T23:58:51Z|00268|binding|INFO|Claiming lport 917524f9-5334-4b3d-b16f-b9686b1c3528 for this chassis.
Jan 21 23:58:51 compute-1 ovn_controller[94841]: 2026-01-21T23:58:51Z|00269|binding|INFO|917524f9-5334-4b3d-b16f-b9686b1c3528: Claiming fa:16:3e:80:88:c9 10.100.0.6
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.581 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.587 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.602 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:88:c9 10.100.0.6'], port_security=['fa:16:3e:80:88:c9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '30c7da24-de00-4067-a5d2-f36ad21391c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e3b7d6e-99c3-4bed-a6db-24cc4d63ab1a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=917524f9-5334-4b3d-b16f-b9686b1c3528) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.603 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 917524f9-5334-4b3d-b16f-b9686b1c3528 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 bound to our chassis
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.605 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:58:51 compute-1 systemd-machined[153970]: New machine qemu-35-instance-0000004b.
Jan 21 23:58:51 compute-1 systemd-udevd[221865]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.624 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[245462a4-4533-4633-b666-9e192ae71687]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.626 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1995baab-01 in ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.628 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1995baab-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.629 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9ecbbc15-13af-49b7-a528-407ccbcc2193]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.630 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3d83bbd2-3cad-477f-89ed-7581edf58770]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 NetworkManager[54952]: <info>  [1769039931.6436] device (tap917524f9-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:58:51 compute-1 NetworkManager[54952]: <info>  [1769039931.6443] device (tap917524f9-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.646 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[f0887a9e-1eb3-4903-9301-296f6539e222]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_controller[94841]: 2026-01-21T23:58:51Z|00270|binding|INFO|Setting lport 917524f9-5334-4b3d-b16f-b9686b1c3528 ovn-installed in OVS
Jan 21 23:58:51 compute-1 ovn_controller[94841]: 2026-01-21T23:58:51Z|00271|binding|INFO|Setting lport 917524f9-5334-4b3d-b16f-b9686b1c3528 up in Southbound
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.655 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:51 compute-1 systemd[1]: Started Virtual Machine qemu-35-instance-0000004b.
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.672 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ff428c-3e6c-43ea-9126-f5bf78c62dba]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.713 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[5848347a-cb6f-4475-81c0-565dcca26027]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.720 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[64261c31-de02-44df-97c6-64df0bcef73a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 NetworkManager[54952]: <info>  [1769039931.7223] manager: (tap1995baab-00): new Veth device (/org/freedesktop/NetworkManager/Devices/131)
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.770 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[42edb3de-04e0-4cfd-8fb2-6a6570fe7b93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.774 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ef11159e-ace2-4a6e-943f-dbaffb3757cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 NetworkManager[54952]: <info>  [1769039931.7933] device (tap1995baab-00): carrier: link connected
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.798 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[69a2b886-02a5-4f33-afd3-a34333e3d5b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.817 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb1606e-6ec1-4af8-83c3-bb37a9ffbc2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450444, 'reachable_time': 24854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221897, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.834 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[71529871-6c74-4f1e-aff3-f9df2adefc2a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4b:ff2f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450444, 'tstamp': 450444}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221898, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.852 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[dabcea3f-bec1-4191-8596-05a9d94d77d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450444, 'reachable_time': 24854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221899, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.885 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[476841fd-bcb7-43c0-8bfc-cd3a71e8c980]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.954 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5210c8bb-4d2b-4221-b7d4-5aaeff1f10cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.956 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.956 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.957 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.959 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:51 compute-1 NetworkManager[54952]: <info>  [1769039931.9601] manager: (tap1995baab-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Jan 21 23:58:51 compute-1 kernel: tap1995baab-00: entered promiscuous mode
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.962 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.963 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.965 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:51 compute-1 ovn_controller[94841]: 2026-01-21T23:58:51Z|00272|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.966 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.967 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.968 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a73d06b4-bdda-479c-b3fb-9abb725dbbf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.969 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:58:51 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:58:51.970 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'env', 'PROCESS_TAG=haproxy-1995baab-0f8d-4658-a4fc-2d21868dc592', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1995baab-0f8d-4658-a4fc-2d21868dc592.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:58:51 compute-1 nova_compute[182713]: 2026-01-21 23:58:51.981 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.096 182717 DEBUG nova.compute.manager [req-34ba3b27-1989-4fcd-9526-f7503518ea43 req-dd957d80-c8e3-4441-a4cc-cb1c7bd529d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-vif-plugged-917524f9-5334-4b3d-b16f-b9686b1c3528 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.098 182717 DEBUG oslo_concurrency.lockutils [req-34ba3b27-1989-4fcd-9526-f7503518ea43 req-dd957d80-c8e3-4441-a4cc-cb1c7bd529d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.098 182717 DEBUG oslo_concurrency.lockutils [req-34ba3b27-1989-4fcd-9526-f7503518ea43 req-dd957d80-c8e3-4441-a4cc-cb1c7bd529d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.099 182717 DEBUG oslo_concurrency.lockutils [req-34ba3b27-1989-4fcd-9526-f7503518ea43 req-dd957d80-c8e3-4441-a4cc-cb1c7bd529d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.100 182717 DEBUG nova.compute.manager [req-34ba3b27-1989-4fcd-9526-f7503518ea43 req-dd957d80-c8e3-4441-a4cc-cb1c7bd529d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Processing event network-vif-plugged-917524f9-5334-4b3d-b16f-b9686b1c3528 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:58:52 compute-1 podman[221929]: 2026-01-21 23:58:52.409614499 +0000 UTC m=+0.082017230 container create 0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:58:52 compute-1 systemd[1]: Started libpod-conmon-0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049.scope.
Jan 21 23:58:52 compute-1 podman[221929]: 2026-01-21 23:58:52.372049171 +0000 UTC m=+0.044451982 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:58:52 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:58:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc46e15cd0b2f931f3a7e8961d5de61db6bceb98197b9360ca68c09e4862b55a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:58:52 compute-1 podman[221929]: 2026-01-21 23:58:52.516584588 +0000 UTC m=+0.188987409 container init 0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 23:58:52 compute-1 podman[221929]: 2026-01-21 23:58:52.527294788 +0000 UTC m=+0.199697549 container start 0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 23:58:52 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[221945]: [NOTICE]   (221955) : New worker (221958) forked
Jan 21 23:58:52 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[221945]: [NOTICE]   (221955) : Loading success.
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.633 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039932.6333075, 30c7da24-de00-4067-a5d2-f36ad21391c5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.635 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] VM Started (Lifecycle Event)
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.638 182717 DEBUG nova.compute.manager [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.643 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.648 182717 INFO nova.virt.libvirt.driver [-] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Instance spawned successfully.
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.649 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.671 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.683 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.690 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.691 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.692 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.693 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.694 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.695 182717 DEBUG nova.virt.libvirt.driver [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.732 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.733 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039932.6347804, 30c7da24-de00-4067-a5d2-f36ad21391c5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.734 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] VM Paused (Lifecycle Event)
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.958 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.963 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039932.6420834, 30c7da24-de00-4067-a5d2-f36ad21391c5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:52 compute-1 nova_compute[182713]: 2026-01-21 23:58:52.964 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] VM Resumed (Lifecycle Event)
Jan 21 23:58:53 compute-1 nova_compute[182713]: 2026-01-21 23:58:53.072 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:53 compute-1 nova_compute[182713]: 2026-01-21 23:58:53.076 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:58:53 compute-1 nova_compute[182713]: 2026-01-21 23:58:53.104 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:58:53 compute-1 nova_compute[182713]: 2026-01-21 23:58:53.133 182717 INFO nova.compute.manager [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Took 7.45 seconds to spawn the instance on the hypervisor.
Jan 21 23:58:53 compute-1 nova_compute[182713]: 2026-01-21 23:58:53.133 182717 DEBUG nova.compute.manager [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:53 compute-1 nova_compute[182713]: 2026-01-21 23:58:53.379 182717 INFO nova.compute.manager [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Took 8.26 seconds to build instance.
Jan 21 23:58:53 compute-1 nova_compute[182713]: 2026-01-21 23:58:53.415 182717 DEBUG nova.network.neutron [req-bd67ccd8-6f13-497a-8e4c-5dcbd477bf32 req-ef7bcbc1-56df-4b44-be4e-995e1211c739 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updated VIF entry in instance network info cache for port 917524f9-5334-4b3d-b16f-b9686b1c3528. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:58:53 compute-1 nova_compute[182713]: 2026-01-21 23:58:53.416 182717 DEBUG nova.network.neutron [req-bd67ccd8-6f13-497a-8e4c-5dcbd477bf32 req-ef7bcbc1-56df-4b44-be4e-995e1211c739 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updating instance_info_cache with network_info: [{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:53 compute-1 nova_compute[182713]: 2026-01-21 23:58:53.420 182717 DEBUG oslo_concurrency.lockutils [None req-29d9fd01-6866-41c4-8530-cace66e53345 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.440s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:53 compute-1 nova_compute[182713]: 2026-01-21 23:58:53.432 182717 DEBUG oslo_concurrency.lockutils [req-bd67ccd8-6f13-497a-8e4c-5dcbd477bf32 req-ef7bcbc1-56df-4b44-be4e-995e1211c739 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:54 compute-1 nova_compute[182713]: 2026-01-21 23:58:54.240 182717 DEBUG nova.compute.manager [req-9f794cf3-a0c6-497e-9d17-8c4a314c89ea req-0e82509e-a72c-44b6-ace1-ea88d5141fd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-vif-plugged-917524f9-5334-4b3d-b16f-b9686b1c3528 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:54 compute-1 nova_compute[182713]: 2026-01-21 23:58:54.242 182717 DEBUG oslo_concurrency.lockutils [req-9f794cf3-a0c6-497e-9d17-8c4a314c89ea req-0e82509e-a72c-44b6-ace1-ea88d5141fd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:54 compute-1 nova_compute[182713]: 2026-01-21 23:58:54.242 182717 DEBUG oslo_concurrency.lockutils [req-9f794cf3-a0c6-497e-9d17-8c4a314c89ea req-0e82509e-a72c-44b6-ace1-ea88d5141fd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:54 compute-1 nova_compute[182713]: 2026-01-21 23:58:54.243 182717 DEBUG oslo_concurrency.lockutils [req-9f794cf3-a0c6-497e-9d17-8c4a314c89ea req-0e82509e-a72c-44b6-ace1-ea88d5141fd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:54 compute-1 nova_compute[182713]: 2026-01-21 23:58:54.244 182717 DEBUG nova.compute.manager [req-9f794cf3-a0c6-497e-9d17-8c4a314c89ea req-0e82509e-a72c-44b6-ace1-ea88d5141fd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] No waiting events found dispatching network-vif-plugged-917524f9-5334-4b3d-b16f-b9686b1c3528 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:54 compute-1 nova_compute[182713]: 2026-01-21 23:58:54.244 182717 WARNING nova.compute.manager [req-9f794cf3-a0c6-497e-9d17-8c4a314c89ea req-0e82509e-a72c-44b6-ace1-ea88d5141fd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received unexpected event network-vif-plugged-917524f9-5334-4b3d-b16f-b9686b1c3528 for instance with vm_state active and task_state None.
Jan 21 23:58:55 compute-1 podman[221967]: 2026-01-21 23:58:55.592605804 +0000 UTC m=+0.081592906 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:58:55 compute-1 nova_compute[182713]: 2026-01-21 23:58:55.608 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:56 compute-1 nova_compute[182713]: 2026-01-21 23:58:56.182 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:57 compute-1 NetworkManager[54952]: <info>  [1769039937.3942] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Jan 21 23:58:57 compute-1 nova_compute[182713]: 2026-01-21 23:58:57.391 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:57 compute-1 NetworkManager[54952]: <info>  [1769039937.3955] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Jan 21 23:58:57 compute-1 nova_compute[182713]: 2026-01-21 23:58:57.565 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:57 compute-1 ovn_controller[94841]: 2026-01-21T23:58:57Z|00273|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 23:58:57 compute-1 nova_compute[182713]: 2026-01-21 23:58:57.595 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:57 compute-1 podman[221986]: 2026-01-21 23:58:57.658725294 +0000 UTC m=+0.076074016 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Jan 21 23:58:57 compute-1 nova_compute[182713]: 2026-01-21 23:58:57.851 182717 DEBUG nova.compute.manager [req-6aea26f5-628a-4e8f-a5f2-d98a1473263d req-d6f24529-82ae-4b5c-96a2-2a6781d1006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-changed-917524f9-5334-4b3d-b16f-b9686b1c3528 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:57 compute-1 nova_compute[182713]: 2026-01-21 23:58:57.852 182717 DEBUG nova.compute.manager [req-6aea26f5-628a-4e8f-a5f2-d98a1473263d req-d6f24529-82ae-4b5c-96a2-2a6781d1006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing instance network info cache due to event network-changed-917524f9-5334-4b3d-b16f-b9686b1c3528. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:58:57 compute-1 nova_compute[182713]: 2026-01-21 23:58:57.852 182717 DEBUG oslo_concurrency.lockutils [req-6aea26f5-628a-4e8f-a5f2-d98a1473263d req-d6f24529-82ae-4b5c-96a2-2a6781d1006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:57 compute-1 nova_compute[182713]: 2026-01-21 23:58:57.853 182717 DEBUG oslo_concurrency.lockutils [req-6aea26f5-628a-4e8f-a5f2-d98a1473263d req-d6f24529-82ae-4b5c-96a2-2a6781d1006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:57 compute-1 nova_compute[182713]: 2026-01-21 23:58:57.853 182717 DEBUG nova.network.neutron [req-6aea26f5-628a-4e8f-a5f2-d98a1473263d req-d6f24529-82ae-4b5c-96a2-2a6781d1006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing network info cache for port 917524f9-5334-4b3d-b16f-b9686b1c3528 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:58:59 compute-1 nova_compute[182713]: 2026-01-21 23:58:59.861 182717 DEBUG nova.network.neutron [req-6aea26f5-628a-4e8f-a5f2-d98a1473263d req-d6f24529-82ae-4b5c-96a2-2a6781d1006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updated VIF entry in instance network info cache for port 917524f9-5334-4b3d-b16f-b9686b1c3528. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:58:59 compute-1 nova_compute[182713]: 2026-01-21 23:58:59.862 182717 DEBUG nova.network.neutron [req-6aea26f5-628a-4e8f-a5f2-d98a1473263d req-d6f24529-82ae-4b5c-96a2-2a6781d1006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updating instance_info_cache with network_info: [{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:59 compute-1 nova_compute[182713]: 2026-01-21 23:58:59.883 182717 DEBUG oslo_concurrency.lockutils [req-6aea26f5-628a-4e8f-a5f2-d98a1473263d req-d6f24529-82ae-4b5c-96a2-2a6781d1006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:00 compute-1 nova_compute[182713]: 2026-01-21 23:59:00.612 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:01 compute-1 nova_compute[182713]: 2026-01-21 23:59:01.184 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:03.006 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:03.007 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:03 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:03.008 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:05 compute-1 nova_compute[182713]: 2026-01-21 23:59:05.614 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:06 compute-1 nova_compute[182713]: 2026-01-21 23:59:06.186 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:06 compute-1 ovn_controller[94841]: 2026-01-21T23:59:06Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:80:88:c9 10.100.0.6
Jan 21 23:59:06 compute-1 ovn_controller[94841]: 2026-01-21T23:59:06Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:80:88:c9 10.100.0.6
Jan 21 23:59:07 compute-1 podman[222027]: 2026-01-21 23:59:07.619641254 +0000 UTC m=+0.092118682 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:59:07 compute-1 podman[222026]: 2026-01-21 23:59:07.643120909 +0000 UTC m=+0.117429483 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.008 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.009 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.033 182717 DEBUG nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.184 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.185 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.194 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.195 182717 INFO nova.compute.claims [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.317 182717 DEBUG nova.scheduler.client.report [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.335 182717 DEBUG nova.scheduler.client.report [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.336 182717 DEBUG nova.compute.provider_tree [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.351 182717 DEBUG nova.scheduler.client.report [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.390 182717 DEBUG nova.scheduler.client.report [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.478 182717 DEBUG nova.compute.provider_tree [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.511 182717 DEBUG nova.scheduler.client.report [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.546 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.547 182717 DEBUG nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.611 182717 DEBUG nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.611 182717 DEBUG nova.network.neutron [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.618 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.630 182717 INFO nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.657 182717 DEBUG nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.854 182717 DEBUG nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.855 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.855 182717 INFO nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Creating image(s)
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.855 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "/var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.856 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "/var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.857 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "/var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.868 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.945 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.946 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.946 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:10 compute-1 nova_compute[182713]: 2026-01-21 23:59:10.959 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.041 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.044 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.090 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.092 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.093 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.189 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.191 182717 DEBUG nova.virt.disk.api [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Checking if we can resize image /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.192 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.212 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.250 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.251 182717 DEBUG nova.virt.disk.api [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Cannot resize image /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.252 182717 DEBUG nova.objects.instance [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lazy-loading 'migration_context' on Instance uuid be4dacee-6b35-4e82-ba71-d6e8b745dfa8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.271 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.272 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Ensure instance console log exists: /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.273 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.274 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.274 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:11 compute-1 nova_compute[182713]: 2026-01-21 23:59:11.537 182717 DEBUG nova.policy [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1931e691804246e3bb3ac03a95a74d93', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:59:13 compute-1 nova_compute[182713]: 2026-01-21 23:59:13.107 182717 DEBUG nova.network.neutron [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Successfully created port: 9330cb5d-3c3f-499b-9d0c-ddc0fee6838f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:59:13 compute-1 podman[222092]: 2026-01-21 23:59:13.616165039 +0000 UTC m=+0.090987306 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:59:13 compute-1 podman[222093]: 2026-01-21 23:59:13.617121039 +0000 UTC m=+0.086721865 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:59:13 compute-1 nova_compute[182713]: 2026-01-21 23:59:13.712 182717 DEBUG nova.network.neutron [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Successfully created port: da0329e1-27a9-4900-91e5-aff8efb5d066 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:59:15 compute-1 nova_compute[182713]: 2026-01-21 23:59:15.117 182717 DEBUG nova.network.neutron [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Successfully created port: a899c1d3-f433-4476-a304-705e518f0bea _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:59:15 compute-1 nova_compute[182713]: 2026-01-21 23:59:15.620 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:16 compute-1 nova_compute[182713]: 2026-01-21 23:59:16.198 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:16 compute-1 nova_compute[182713]: 2026-01-21 23:59:16.980 182717 DEBUG nova.network.neutron [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Successfully updated port: 9330cb5d-3c3f-499b-9d0c-ddc0fee6838f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:59:17 compute-1 nova_compute[182713]: 2026-01-21 23:59:17.109 182717 DEBUG nova.compute.manager [req-0b4bac51-9520-4c23-a72c-cea4ef92463a req-71555551-37b8-4027-a4c8-680adac59556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-changed-917524f9-5334-4b3d-b16f-b9686b1c3528 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:17 compute-1 nova_compute[182713]: 2026-01-21 23:59:17.110 182717 DEBUG nova.compute.manager [req-0b4bac51-9520-4c23-a72c-cea4ef92463a req-71555551-37b8-4027-a4c8-680adac59556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing instance network info cache due to event network-changed-917524f9-5334-4b3d-b16f-b9686b1c3528. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:59:17 compute-1 nova_compute[182713]: 2026-01-21 23:59:17.110 182717 DEBUG oslo_concurrency.lockutils [req-0b4bac51-9520-4c23-a72c-cea4ef92463a req-71555551-37b8-4027-a4c8-680adac59556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:17 compute-1 nova_compute[182713]: 2026-01-21 23:59:17.110 182717 DEBUG oslo_concurrency.lockutils [req-0b4bac51-9520-4c23-a72c-cea4ef92463a req-71555551-37b8-4027-a4c8-680adac59556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:17 compute-1 nova_compute[182713]: 2026-01-21 23:59:17.110 182717 DEBUG nova.network.neutron [req-0b4bac51-9520-4c23-a72c-cea4ef92463a req-71555551-37b8-4027-a4c8-680adac59556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing network info cache for port 917524f9-5334-4b3d-b16f-b9686b1c3528 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:59:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:17.767 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:17 compute-1 nova_compute[182713]: 2026-01-21 23:59:17.768 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:17 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:17.770 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:59:18 compute-1 nova_compute[182713]: 2026-01-21 23:59:18.806 182717 DEBUG nova.network.neutron [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Successfully updated port: da0329e1-27a9-4900-91e5-aff8efb5d066 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:59:19 compute-1 nova_compute[182713]: 2026-01-21 23:59:19.018 182717 DEBUG nova.network.neutron [req-0b4bac51-9520-4c23-a72c-cea4ef92463a req-71555551-37b8-4027-a4c8-680adac59556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updated VIF entry in instance network info cache for port 917524f9-5334-4b3d-b16f-b9686b1c3528. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:59:19 compute-1 nova_compute[182713]: 2026-01-21 23:59:19.019 182717 DEBUG nova.network.neutron [req-0b4bac51-9520-4c23-a72c-cea4ef92463a req-71555551-37b8-4027-a4c8-680adac59556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updating instance_info_cache with network_info: [{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:19 compute-1 nova_compute[182713]: 2026-01-21 23:59:19.051 182717 DEBUG oslo_concurrency.lockutils [req-0b4bac51-9520-4c23-a72c-cea4ef92463a req-71555551-37b8-4027-a4c8-680adac59556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:19 compute-1 nova_compute[182713]: 2026-01-21 23:59:19.227 182717 DEBUG nova.compute.manager [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-changed-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:19 compute-1 nova_compute[182713]: 2026-01-21 23:59:19.228 182717 DEBUG nova.compute.manager [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Refreshing instance network info cache due to event network-changed-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:59:19 compute-1 nova_compute[182713]: 2026-01-21 23:59:19.228 182717 DEBUG oslo_concurrency.lockutils [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:19 compute-1 nova_compute[182713]: 2026-01-21 23:59:19.229 182717 DEBUG oslo_concurrency.lockutils [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:19 compute-1 nova_compute[182713]: 2026-01-21 23:59:19.229 182717 DEBUG nova.network.neutron [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Refreshing network info cache for port 9330cb5d-3c3f-499b-9d0c-ddc0fee6838f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:59:19 compute-1 nova_compute[182713]: 2026-01-21 23:59:19.492 182717 DEBUG nova.network.neutron [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.064 182717 DEBUG nova.network.neutron [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.082 182717 DEBUG oslo_concurrency.lockutils [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.083 182717 DEBUG nova.compute.manager [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-changed-da0329e1-27a9-4900-91e5-aff8efb5d066 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.083 182717 DEBUG nova.compute.manager [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Refreshing instance network info cache due to event network-changed-da0329e1-27a9-4900-91e5-aff8efb5d066. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.083 182717 DEBUG oslo_concurrency.lockutils [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.084 182717 DEBUG oslo_concurrency.lockutils [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.084 182717 DEBUG nova.network.neutron [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Refreshing network info cache for port da0329e1-27a9-4900-91e5-aff8efb5d066 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.351 182717 DEBUG nova.network.neutron [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.356 182717 DEBUG nova.network.neutron [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Successfully updated port: a899c1d3-f433-4476-a304-705e518f0bea _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.374 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.625 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.858 182717 DEBUG nova.network.neutron [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.877 182717 DEBUG oslo_concurrency.lockutils [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.877 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquired lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:20 compute-1 nova_compute[182713]: 2026-01-21 23:59:20.878 182717 DEBUG nova.network.neutron [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:59:21 compute-1 nova_compute[182713]: 2026-01-21 23:59:21.097 182717 DEBUG nova.network.neutron [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:59:21 compute-1 nova_compute[182713]: 2026-01-21 23:59:21.201 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:21 compute-1 nova_compute[182713]: 2026-01-21 23:59:21.323 182717 DEBUG nova.compute.manager [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-changed-a899c1d3-f433-4476-a304-705e518f0bea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:21 compute-1 nova_compute[182713]: 2026-01-21 23:59:21.323 182717 DEBUG nova.compute.manager [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Refreshing instance network info cache due to event network-changed-a899c1d3-f433-4476-a304-705e518f0bea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:59:21 compute-1 nova_compute[182713]: 2026-01-21 23:59:21.324 182717 DEBUG oslo_concurrency.lockutils [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:22 compute-1 nova_compute[182713]: 2026-01-21 23:59:22.534 182717 DEBUG oslo_concurrency.lockutils [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "interface-30c7da24-de00-4067-a5d2-f36ad21391c5-c596dfbe-ce59-4ab9-8cad-bd4144812420" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:22 compute-1 nova_compute[182713]: 2026-01-21 23:59:22.535 182717 DEBUG oslo_concurrency.lockutils [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-30c7da24-de00-4067-a5d2-f36ad21391c5-c596dfbe-ce59-4ab9-8cad-bd4144812420" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:22 compute-1 nova_compute[182713]: 2026-01-21 23:59:22.536 182717 DEBUG nova.objects.instance [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'flavor' on Instance uuid 30c7da24-de00-4067-a5d2-f36ad21391c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:23 compute-1 nova_compute[182713]: 2026-01-21 23:59:23.722 182717 DEBUG nova.objects.instance [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'pci_requests' on Instance uuid 30c7da24-de00-4067-a5d2-f36ad21391c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:23 compute-1 nova_compute[182713]: 2026-01-21 23:59:23.739 182717 DEBUG nova.network.neutron [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:59:23 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:23.773 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.082 182717 DEBUG nova.policy [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.625 182717 DEBUG nova.network.neutron [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Updating instance_info_cache with network_info: [{"id": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "address": "fa:16:3e:9f:22:0c", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9330cb5d-3c", "ovs_interfaceid": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da0329e1-27a9-4900-91e5-aff8efb5d066", "address": "fa:16:3e:66:b9:f9", "network": {"id": "54a6b0a6-096e-4f61-a504-b2a9810b3844", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-335695391", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.127", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0329e1-27", "ovs_interfaceid": "da0329e1-27a9-4900-91e5-aff8efb5d066", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a899c1d3-f433-4476-a304-705e518f0bea", "address": "fa:16:3e:e4:a0:ba", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa899c1d3-f4", "ovs_interfaceid": "a899c1d3-f433-4476-a304-705e518f0bea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.669 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Releasing lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.670 182717 DEBUG nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Instance network_info: |[{"id": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "address": "fa:16:3e:9f:22:0c", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9330cb5d-3c", "ovs_interfaceid": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da0329e1-27a9-4900-91e5-aff8efb5d066", "address": "fa:16:3e:66:b9:f9", "network": {"id": "54a6b0a6-096e-4f61-a504-b2a9810b3844", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-335695391", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.127", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0329e1-27", "ovs_interfaceid": "da0329e1-27a9-4900-91e5-aff8efb5d066", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a899c1d3-f433-4476-a304-705e518f0bea", "address": "fa:16:3e:e4:a0:ba", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa899c1d3-f4", "ovs_interfaceid": "a899c1d3-f433-4476-a304-705e518f0bea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.672 182717 DEBUG oslo_concurrency.lockutils [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.673 182717 DEBUG nova.network.neutron [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Refreshing network info cache for port a899c1d3-f433-4476-a304-705e518f0bea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.680 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Start _get_guest_xml network_info=[{"id": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "address": "fa:16:3e:9f:22:0c", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9330cb5d-3c", "ovs_interfaceid": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da0329e1-27a9-4900-91e5-aff8efb5d066", "address": "fa:16:3e:66:b9:f9", "network": {"id": "54a6b0a6-096e-4f61-a504-b2a9810b3844", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-335695391", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.127", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0329e1-27", "ovs_interfaceid": "da0329e1-27a9-4900-91e5-aff8efb5d066", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a899c1d3-f433-4476-a304-705e518f0bea", "address": "fa:16:3e:e4:a0:ba", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa899c1d3-f4", "ovs_interfaceid": "a899c1d3-f433-4476-a304-705e518f0bea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.687 182717 WARNING nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.697 182717 DEBUG nova.virt.libvirt.host [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.698 182717 DEBUG nova.virt.libvirt.host [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.702 182717 DEBUG nova.virt.libvirt.host [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.703 182717 DEBUG nova.virt.libvirt.host [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.705 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.705 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.706 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.707 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.708 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.709 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.709 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.709 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.710 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.710 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.710 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.710 182717 DEBUG nova.virt.hardware [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.714 182717 DEBUG nova.virt.libvirt.vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1379253733',display_name='tempest-ServersTestMultiNic-server-1379253733',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1379253733',id=78,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-hvk60y87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-p
roject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:10Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=be4dacee-6b35-4e82-ba71-d6e8b745dfa8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "address": "fa:16:3e:9f:22:0c", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9330cb5d-3c", "ovs_interfaceid": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.715 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "address": "fa:16:3e:9f:22:0c", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9330cb5d-3c", "ovs_interfaceid": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.716 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:22:0c,bridge_name='br-int',has_traffic_filtering=True,id=9330cb5d-3c3f-499b-9d0c-ddc0fee6838f,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9330cb5d-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.717 182717 DEBUG nova.virt.libvirt.vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1379253733',display_name='tempest-ServersTestMultiNic-server-1379253733',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1379253733',id=78,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-hvk60y87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-p
roject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:10Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=be4dacee-6b35-4e82-ba71-d6e8b745dfa8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da0329e1-27a9-4900-91e5-aff8efb5d066", "address": "fa:16:3e:66:b9:f9", "network": {"id": "54a6b0a6-096e-4f61-a504-b2a9810b3844", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-335695391", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.127", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0329e1-27", "ovs_interfaceid": "da0329e1-27a9-4900-91e5-aff8efb5d066", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.717 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "da0329e1-27a9-4900-91e5-aff8efb5d066", "address": "fa:16:3e:66:b9:f9", "network": {"id": "54a6b0a6-096e-4f61-a504-b2a9810b3844", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-335695391", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.127", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0329e1-27", "ovs_interfaceid": "da0329e1-27a9-4900-91e5-aff8efb5d066", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.717 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:b9:f9,bridge_name='br-int',has_traffic_filtering=True,id=da0329e1-27a9-4900-91e5-aff8efb5d066,network=Network(54a6b0a6-096e-4f61-a504-b2a9810b3844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0329e1-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.718 182717 DEBUG nova.virt.libvirt.vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1379253733',display_name='tempest-ServersTestMultiNic-server-1379253733',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1379253733',id=78,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-hvk60y87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-p
roject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:10Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=be4dacee-6b35-4e82-ba71-d6e8b745dfa8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a899c1d3-f433-4476-a304-705e518f0bea", "address": "fa:16:3e:e4:a0:ba", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa899c1d3-f4", "ovs_interfaceid": "a899c1d3-f433-4476-a304-705e518f0bea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.718 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "a899c1d3-f433-4476-a304-705e518f0bea", "address": "fa:16:3e:e4:a0:ba", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa899c1d3-f4", "ovs_interfaceid": "a899c1d3-f433-4476-a304-705e518f0bea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.719 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a0:ba,bridge_name='br-int',has_traffic_filtering=True,id=a899c1d3-f433-4476-a304-705e518f0bea,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa899c1d3-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.720 182717 DEBUG nova.objects.instance [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid be4dacee-6b35-4e82-ba71-d6e8b745dfa8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.735 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:59:24 compute-1 nova_compute[182713]:   <uuid>be4dacee-6b35-4e82-ba71-d6e8b745dfa8</uuid>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   <name>instance-0000004e</name>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <nova:name>tempest-ServersTestMultiNic-server-1379253733</nova:name>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-21 23:59:24</nova:creationTime>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 21 23:59:24 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <nova:owner>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         <nova:user uuid="1931e691804246e3bb3ac03a95a74d93">tempest-ServersTestMultiNic-672631386-project-member</nova:user>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         <nova:project uuid="975703700f9d42c5a1daa32f5e61f6f2">tempest-ServersTestMultiNic-672631386</nova:project>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       </nova:owner>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <nova:ports>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         <nova:port uuid="9330cb5d-3c3f-499b-9d0c-ddc0fee6838f">
Jan 21 23:59:24 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.143" ipVersion="4"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         <nova:port uuid="da0329e1-27a9-4900-91e5-aff8efb5d066">
Jan 21 23:59:24 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.1.127" ipVersion="4"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         <nova:port uuid="a899c1d3-f433-4476-a304-705e518f0bea">
Jan 21 23:59:24 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.228" ipVersion="4"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:         </nova:port>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       </nova:ports>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     </nova:instance>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <system>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <entry name="serial">be4dacee-6b35-4e82-ba71-d6e8b745dfa8</entry>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <entry name="uuid">be4dacee-6b35-4e82-ba71-d6e8b745dfa8</entry>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     </system>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   <os>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   </os>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   <features>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   </features>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk.config"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:9f:22:0c"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <target dev="tap9330cb5d-3c"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:66:b9:f9"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <target dev="tapda0329e1-27"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:e4:a0:ba"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <target dev="tapa899c1d3-f4"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/console.log" append="off"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <video>
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     </video>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 21 23:59:24 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 21 23:59:24 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:59:24 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:59:24 compute-1 nova_compute[182713]: </domain>
Jan 21 23:59:24 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.736 182717 DEBUG nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Preparing to wait for external event network-vif-plugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.737 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.737 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.737 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.738 182717 DEBUG nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Preparing to wait for external event network-vif-plugged-da0329e1-27a9-4900-91e5-aff8efb5d066 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.738 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.738 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.738 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.739 182717 DEBUG nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Preparing to wait for external event network-vif-plugged-a899c1d3-f433-4476-a304-705e518f0bea prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.739 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.739 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.739 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.740 182717 DEBUG nova.virt.libvirt.vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1379253733',display_name='tempest-ServersTestMultiNic-server-1379253733',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1379253733',id=78,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-hvk60y87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:10Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=be4dacee-6b35-4e82-ba71-d6e8b745dfa8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "address": "fa:16:3e:9f:22:0c", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9330cb5d-3c", "ovs_interfaceid": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.741 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "address": "fa:16:3e:9f:22:0c", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9330cb5d-3c", "ovs_interfaceid": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.741 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:22:0c,bridge_name='br-int',has_traffic_filtering=True,id=9330cb5d-3c3f-499b-9d0c-ddc0fee6838f,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9330cb5d-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.742 182717 DEBUG os_vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:22:0c,bridge_name='br-int',has_traffic_filtering=True,id=9330cb5d-3c3f-499b-9d0c-ddc0fee6838f,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9330cb5d-3c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.742 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.743 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.743 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.749 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.749 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9330cb5d-3c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.750 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9330cb5d-3c, col_values=(('external_ids', {'iface-id': '9330cb5d-3c3f-499b-9d0c-ddc0fee6838f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:22:0c', 'vm-uuid': 'be4dacee-6b35-4e82-ba71-d6e8b745dfa8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.751 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 NetworkManager[54952]: <info>  [1769039964.7529] manager: (tap9330cb5d-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.754 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.759 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.761 182717 INFO os_vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:22:0c,bridge_name='br-int',has_traffic_filtering=True,id=9330cb5d-3c3f-499b-9d0c-ddc0fee6838f,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9330cb5d-3c')
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.762 182717 DEBUG nova.virt.libvirt.vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1379253733',display_name='tempest-ServersTestMultiNic-server-1379253733',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1379253733',id=78,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-hvk60y87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:10Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=be4dacee-6b35-4e82-ba71-d6e8b745dfa8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da0329e1-27a9-4900-91e5-aff8efb5d066", "address": "fa:16:3e:66:b9:f9", "network": {"id": "54a6b0a6-096e-4f61-a504-b2a9810b3844", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-335695391", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.127", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0329e1-27", "ovs_interfaceid": "da0329e1-27a9-4900-91e5-aff8efb5d066", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.762 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "da0329e1-27a9-4900-91e5-aff8efb5d066", "address": "fa:16:3e:66:b9:f9", "network": {"id": "54a6b0a6-096e-4f61-a504-b2a9810b3844", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-335695391", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.127", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0329e1-27", "ovs_interfaceid": "da0329e1-27a9-4900-91e5-aff8efb5d066", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.763 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:b9:f9,bridge_name='br-int',has_traffic_filtering=True,id=da0329e1-27a9-4900-91e5-aff8efb5d066,network=Network(54a6b0a6-096e-4f61-a504-b2a9810b3844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0329e1-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.763 182717 DEBUG os_vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:b9:f9,bridge_name='br-int',has_traffic_filtering=True,id=da0329e1-27a9-4900-91e5-aff8efb5d066,network=Network(54a6b0a6-096e-4f61-a504-b2a9810b3844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0329e1-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.764 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.764 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.765 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.768 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.768 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda0329e1-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.769 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapda0329e1-27, col_values=(('external_ids', {'iface-id': 'da0329e1-27a9-4900-91e5-aff8efb5d066', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:b9:f9', 'vm-uuid': 'be4dacee-6b35-4e82-ba71-d6e8b745dfa8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.770 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 NetworkManager[54952]: <info>  [1769039964.7717] manager: (tapda0329e1-27): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.772 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.782 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.784 182717 INFO os_vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:b9:f9,bridge_name='br-int',has_traffic_filtering=True,id=da0329e1-27a9-4900-91e5-aff8efb5d066,network=Network(54a6b0a6-096e-4f61-a504-b2a9810b3844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0329e1-27')
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.785 182717 DEBUG nova.virt.libvirt.vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1379253733',display_name='tempest-ServersTestMultiNic-server-1379253733',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1379253733',id=78,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-hvk60y87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:10Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=be4dacee-6b35-4e82-ba71-d6e8b745dfa8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a899c1d3-f433-4476-a304-705e518f0bea", "address": "fa:16:3e:e4:a0:ba", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa899c1d3-f4", "ovs_interfaceid": "a899c1d3-f433-4476-a304-705e518f0bea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.785 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "a899c1d3-f433-4476-a304-705e518f0bea", "address": "fa:16:3e:e4:a0:ba", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa899c1d3-f4", "ovs_interfaceid": "a899c1d3-f433-4476-a304-705e518f0bea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.786 182717 DEBUG nova.network.os_vif_util [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a0:ba,bridge_name='br-int',has_traffic_filtering=True,id=a899c1d3-f433-4476-a304-705e518f0bea,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa899c1d3-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.786 182717 DEBUG os_vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a0:ba,bridge_name='br-int',has_traffic_filtering=True,id=a899c1d3-f433-4476-a304-705e518f0bea,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa899c1d3-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.787 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.787 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.787 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.790 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.790 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa899c1d3-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.790 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa899c1d3-f4, col_values=(('external_ids', {'iface-id': 'a899c1d3-f433-4476-a304-705e518f0bea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:a0:ba', 'vm-uuid': 'be4dacee-6b35-4e82-ba71-d6e8b745dfa8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.792 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 NetworkManager[54952]: <info>  [1769039964.7933] manager: (tapa899c1d3-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.794 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.807 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.808 182717 INFO os_vif [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a0:ba,bridge_name='br-int',has_traffic_filtering=True,id=a899c1d3-f433-4476-a304-705e518f0bea,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa899c1d3-f4')
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.877 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.877 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.877 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] No VIF found with MAC fa:16:3e:9f:22:0c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.878 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] No VIF found with MAC fa:16:3e:66:b9:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.878 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] No VIF found with MAC fa:16:3e:e4:a0:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:59:24 compute-1 nova_compute[182713]: 2026-01-21 23:59:24.878 182717 INFO nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Using config drive
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.283 182717 INFO nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Creating config drive at /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk.config
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.294 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphmfvgzw0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.439 182717 DEBUG oslo_concurrency.processutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphmfvgzw0" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.466 182717 DEBUG nova.network.neutron [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Successfully updated port: c596dfbe-ce59-4ab9-8cad-bd4144812420 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.493 182717 DEBUG oslo_concurrency.lockutils [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.494 182717 DEBUG oslo_concurrency.lockutils [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.495 182717 DEBUG nova.network.neutron [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.5341] manager: (tap9330cb5d-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/138)
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.5529] manager: (tapda0329e1-27): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Jan 21 23:59:25 compute-1 kernel: tapda0329e1-27: entered promiscuous mode
Jan 21 23:59:25 compute-1 kernel: tap9330cb5d-3c: entered promiscuous mode
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00274|binding|INFO|Claiming lport 9330cb5d-3c3f-499b-9d0c-ddc0fee6838f for this chassis.
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00275|binding|INFO|9330cb5d-3c3f-499b-9d0c-ddc0fee6838f: Claiming fa:16:3e:9f:22:0c 10.100.0.143
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00276|binding|INFO|Claiming lport da0329e1-27a9-4900-91e5-aff8efb5d066 for this chassis.
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00277|binding|INFO|da0329e1-27a9-4900-91e5-aff8efb5d066: Claiming fa:16:3e:66:b9:f9 10.100.1.127
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.560 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.5761] manager: (tapa899c1d3-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/140)
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.575 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:b9:f9 10.100.1.127'], port_security=['fa:16:3e:66:b9:f9 10.100.1.127'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.127/24', 'neutron:device_id': 'be4dacee-6b35-4e82-ba71-d6e8b745dfa8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54a6b0a6-096e-4f61-a504-b2a9810b3844', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ca9fa796-afc7-4732-b32a-7e6314132071', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88c3c5ff-f5e5-415a-a727-ca4d025fff63, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=da0329e1-27a9-4900-91e5-aff8efb5d066) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.578 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:22:0c 10.100.0.143'], port_security=['fa:16:3e:9f:22:0c 10.100.0.143'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.143/24', 'neutron:device_id': 'be4dacee-6b35-4e82-ba71-d6e8b745dfa8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ca9fa796-afc7-4732-b32a-7e6314132071', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9dc661b-0eaf-42dd-bed8-0f2f7383c18d, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=9330cb5d-3c3f-499b-9d0c-ddc0fee6838f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.580 104184 INFO neutron.agent.ovn.metadata.agent [-] Port da0329e1-27a9-4900-91e5-aff8efb5d066 in datapath 54a6b0a6-096e-4f61-a504-b2a9810b3844 bound to our chassis
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.584 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 54a6b0a6-096e-4f61-a504-b2a9810b3844
Jan 21 23:59:25 compute-1 systemd-udevd[222166]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:59:25 compute-1 systemd-udevd[222165]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.587 182717 DEBUG nova.compute.manager [req-c485292f-ee64-4eca-86ef-9eb0706430a4 req-d1c0ed62-239b-4850-9c3f-742e9157ca66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-changed-c596dfbe-ce59-4ab9-8cad-bd4144812420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:25 compute-1 systemd-udevd[222167]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.587 182717 DEBUG nova.compute.manager [req-c485292f-ee64-4eca-86ef-9eb0706430a4 req-d1c0ed62-239b-4850-9c3f-742e9157ca66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing instance network info cache due to event network-changed-c596dfbe-ce59-4ab9-8cad-bd4144812420. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.588 182717 DEBUG oslo_concurrency.lockutils [req-c485292f-ee64-4eca-86ef-9eb0706430a4 req-d1c0ed62-239b-4850-9c3f-742e9157ca66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.600 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[63d7195d-9065-43c9-aa0b-8e79bdf1a16c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.602 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap54a6b0a6-01 in ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.606 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap54a6b0a6-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.606 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[efaee3f9-5258-407f-aa22-183eab6600c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.610 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3a113726-1100-4747-a42b-9c9542c740c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.6127] device (tap9330cb5d-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.6140] device (tapda0329e1-27): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.6151] device (tap9330cb5d-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.6157] device (tapda0329e1-27): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.628 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[52b2fe7a-18e6-461f-bb44-967610f68bd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 kernel: tapa899c1d3-f4: entered promiscuous mode
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.6497] device (tapa899c1d3-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.6514] device (tapa899c1d3-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.654 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00278|binding|INFO|Claiming lport a899c1d3-f433-4476-a304-705e518f0bea for this chassis.
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00279|binding|INFO|a899c1d3-f433-4476-a304-705e518f0bea: Claiming fa:16:3e:e4:a0:ba 10.100.0.228
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00280|binding|INFO|Setting lport 9330cb5d-3c3f-499b-9d0c-ddc0fee6838f ovn-installed in OVS
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00281|binding|INFO|Setting lport da0329e1-27a9-4900-91e5-aff8efb5d066 ovn-installed in OVS
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.661 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00282|binding|INFO|Setting lport 9330cb5d-3c3f-499b-9d0c-ddc0fee6838f up in Southbound
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00283|binding|INFO|Setting lport da0329e1-27a9-4900-91e5-aff8efb5d066 up in Southbound
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.663 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:a0:ba 10.100.0.228'], port_security=['fa:16:3e:e4:a0:ba 10.100.0.228'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.228/24', 'neutron:device_id': 'be4dacee-6b35-4e82-ba71-d6e8b745dfa8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ca9fa796-afc7-4732-b32a-7e6314132071', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9dc661b-0eaf-42dd-bed8-0f2f7383c18d, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=a899c1d3-f433-4476-a304-705e518f0bea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.664 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[811119dc-7a85-42f1-92e4-d87cf8569920]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 systemd[1]: Started Virtual Machine qemu-36-instance-0000004e.
Jan 21 23:59:25 compute-1 systemd-machined[153970]: New machine qemu-36-instance-0000004e.
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00284|binding|INFO|Setting lport a899c1d3-f433-4476-a304-705e518f0bea ovn-installed in OVS
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.671 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00285|binding|INFO|Setting lport a899c1d3-f433-4476-a304-705e518f0bea up in Southbound
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.699 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[04d80cc1-0398-45cf-903e-357e092ceab2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.705 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[384159ce-a8d8-4258-ada4-980991731fd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.7066] manager: (tap54a6b0a6-00): new Veth device (/org/freedesktop/NetworkManager/Devices/141)
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.738 182717 WARNING nova.network.neutron [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] 1995baab-0f8d-4658-a4fc-2d21868dc592 already exists in list: networks containing: ['1995baab-0f8d-4658-a4fc-2d21868dc592']. ignoring it
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.745 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[02b78068-6d68-44f8-a7dd-64246f06d0ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.747 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c87e010e-27d2-4145-ac21-292aff58ff0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.7710] device (tap54a6b0a6-00): carrier: link connected
Jan 21 23:59:25 compute-1 podman[222174]: 2026-01-21 23:59:25.775962643 +0000 UTC m=+0.116526144 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.778 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[4040d1b5-bf96-4c07-8154-629309f343b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.797 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[634bd2f3-b348-40da-9d85-799af0bbee24]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap54a6b0a6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:44:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453842, 'reachable_time': 34345, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222223, 'error': None, 'target': 'ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.812 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9402e697-ad86-4ef1-a656-0b5a99008c41]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2c:4472'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 453842, 'tstamp': 453842}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222224, 'error': None, 'target': 'ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.828 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2ca062-e972-4184-af3d-90662d305b35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap54a6b0a6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:44:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453842, 'reachable_time': 34345, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222225, 'error': None, 'target': 'ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.859 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6b66b8c6-1c33-4cfe-8b44-bd53da241bab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.930 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bcde6972-09a6-4219-8679-bdd27cb24795]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.931 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54a6b0a6-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.931 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.932 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54a6b0a6-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:25 compute-1 NetworkManager[54952]: <info>  [1769039965.9344] manager: (tap54a6b0a6-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.934 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:25 compute-1 kernel: tap54a6b0a6-00: entered promiscuous mode
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.936 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.937 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap54a6b0a6-00, col_values=(('external_ids', {'iface-id': 'ee4ecd44-98c4-4db8-9004-973e60d6fa58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.939 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:25 compute-1 ovn_controller[94841]: 2026-01-21T23:59:25Z|00286|binding|INFO|Releasing lport ee4ecd44-98c4-4db8-9004-973e60d6fa58 from this chassis (sb_readonly=0)
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.950 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039965.95021, be4dacee-6b35-4e82-ba71-d6e8b745dfa8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.951 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] VM Started (Lifecycle Event)
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.956 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.957 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/54a6b0a6-096e-4f61-a504-b2a9810b3844.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/54a6b0a6-096e-4f61-a504-b2a9810b3844.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.958 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6340821e-e72b-4952-beb9-8ba0fba2bf21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.959 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-54a6b0a6-096e-4f61-a504-b2a9810b3844
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/54a6b0a6-096e-4f61-a504-b2a9810b3844.pid.haproxy
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 54a6b0a6-096e-4f61-a504-b2a9810b3844
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:59:25 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:25.959 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844', 'env', 'PROCESS_TAG=haproxy-54a6b0a6-096e-4f61-a504-b2a9810b3844', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/54a6b0a6-096e-4f61-a504-b2a9810b3844.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.971 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.978 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039965.950373, be4dacee-6b35-4e82-ba71-d6e8b745dfa8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:25 compute-1 nova_compute[182713]: 2026-01-21 23:59:25.978 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] VM Paused (Lifecycle Event)
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.002 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.006 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.040 182717 DEBUG nova.compute.manager [req-a8873aca-d05b-4a95-80ce-4d967bb8c67a req-a40d0a59-e5c6-4cba-9040-fa9c8eb3b790 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-plugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.040 182717 DEBUG oslo_concurrency.lockutils [req-a8873aca-d05b-4a95-80ce-4d967bb8c67a req-a40d0a59-e5c6-4cba-9040-fa9c8eb3b790 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.040 182717 DEBUG oslo_concurrency.lockutils [req-a8873aca-d05b-4a95-80ce-4d967bb8c67a req-a40d0a59-e5c6-4cba-9040-fa9c8eb3b790 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.041 182717 DEBUG oslo_concurrency.lockutils [req-a8873aca-d05b-4a95-80ce-4d967bb8c67a req-a40d0a59-e5c6-4cba-9040-fa9c8eb3b790 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.041 182717 DEBUG nova.compute.manager [req-a8873aca-d05b-4a95-80ce-4d967bb8c67a req-a40d0a59-e5c6-4cba-9040-fa9c8eb3b790 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Processing event network-vif-plugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.043 182717 DEBUG nova.compute.manager [req-969a341e-eedd-4636-85f4-4d6a4b72260c req-86b74422-d0f0-476b-a05c-132b42795de2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-plugged-a899c1d3-f433-4476-a304-705e518f0bea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.043 182717 DEBUG oslo_concurrency.lockutils [req-969a341e-eedd-4636-85f4-4d6a4b72260c req-86b74422-d0f0-476b-a05c-132b42795de2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.043 182717 DEBUG oslo_concurrency.lockutils [req-969a341e-eedd-4636-85f4-4d6a4b72260c req-86b74422-d0f0-476b-a05c-132b42795de2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.043 182717 DEBUG oslo_concurrency.lockutils [req-969a341e-eedd-4636-85f4-4d6a4b72260c req-86b74422-d0f0-476b-a05c-132b42795de2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.043 182717 DEBUG nova.compute.manager [req-969a341e-eedd-4636-85f4-4d6a4b72260c req-86b74422-d0f0-476b-a05c-132b42795de2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Processing event network-vif-plugged-a899c1d3-f433-4476-a304-705e518f0bea _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.044 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.204 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.258 182717 DEBUG nova.network.neutron [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Updated VIF entry in instance network info cache for port a899c1d3-f433-4476-a304-705e518f0bea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.259 182717 DEBUG nova.network.neutron [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Updating instance_info_cache with network_info: [{"id": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "address": "fa:16:3e:9f:22:0c", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9330cb5d-3c", "ovs_interfaceid": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da0329e1-27a9-4900-91e5-aff8efb5d066", "address": "fa:16:3e:66:b9:f9", "network": {"id": "54a6b0a6-096e-4f61-a504-b2a9810b3844", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-335695391", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.127", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0329e1-27", "ovs_interfaceid": "da0329e1-27a9-4900-91e5-aff8efb5d066", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a899c1d3-f433-4476-a304-705e518f0bea", "address": "fa:16:3e:e4:a0:ba", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa899c1d3-f4", "ovs_interfaceid": "a899c1d3-f433-4476-a304-705e518f0bea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.294 182717 DEBUG oslo_concurrency.lockutils [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-be4dacee-6b35-4e82-ba71-d6e8b745dfa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.294 182717 DEBUG nova.compute.manager [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-changed-917524f9-5334-4b3d-b16f-b9686b1c3528 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.295 182717 DEBUG nova.compute.manager [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing instance network info cache due to event network-changed-917524f9-5334-4b3d-b16f-b9686b1c3528. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.295 182717 DEBUG oslo_concurrency.lockutils [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:26 compute-1 podman[222265]: 2026-01-21 23:59:26.440586081 +0000 UTC m=+0.066388589 container create 5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:59:26 compute-1 systemd[1]: Started libpod-conmon-5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4.scope.
Jan 21 23:59:26 compute-1 podman[222265]: 2026-01-21 23:59:26.410972327 +0000 UTC m=+0.036774825 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:59:26 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:59:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/148b8e5d36fd5e918c3cd94bf47a8fa769f8b942990cd974b23b5fb7a1dc5022/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:59:26 compute-1 podman[222265]: 2026-01-21 23:59:26.568725403 +0000 UTC m=+0.194527981 container init 5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:59:26 compute-1 podman[222265]: 2026-01-21 23:59:26.581031082 +0000 UTC m=+0.206833590 container start 5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:59:26 compute-1 neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844[222280]: [NOTICE]   (222284) : New worker (222286) forked
Jan 21 23:59:26 compute-1 neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844[222280]: [NOTICE]   (222284) : Loading success.
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.660 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 9330cb5d-3c3f-499b-9d0c-ddc0fee6838f in datapath 4ae8e9ca-350e-4d38-9fe2-01d17d47544e unbound from our chassis
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.665 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4ae8e9ca-350e-4d38-9fe2-01d17d47544e
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.685 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d0cb7bfb-9a0d-4020-a24f-de1412d9c8b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.687 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4ae8e9ca-31 in ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.693 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4ae8e9ca-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.693 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f5c86c86-faee-4fff-80dc-df61b708f612]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.694 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fc998ba4-e1bc-4d79-b4cb-996afae6ddbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.713 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[7b16b22f-8ecb-4292-8d74-cce97ca55064]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.745 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9105506a-781b-4686-93a9-962c1fa28b19]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.789 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[01897128-b89c-4d7b-8750-d0d4e2d71d5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 systemd-udevd[222213]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:59:26 compute-1 NetworkManager[54952]: <info>  [1769039966.7977] manager: (tap4ae8e9ca-30): new Veth device (/org/freedesktop/NetworkManager/Devices/143)
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.797 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7287ca8a-6066-4994-88c7-2980c0c3ac31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.851 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[54c8200f-4509-4b01-a462-e2bddb1d30c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.855 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[55353de8-de9a-4c8e-87d9-c4aec8af1e19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:26 compute-1 nova_compute[182713]: 2026-01-21 23:59:26.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:59:26 compute-1 NetworkManager[54952]: <info>  [1769039966.8785] device (tap4ae8e9ca-30): carrier: link connected
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.885 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ba3eda4b-cc3c-43f6-9b20-47872cebf2dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.913 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[682f651f-62f2-4b94-81d1-760d03f2f51b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4ae8e9ca-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:a9:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453953, 'reachable_time': 44184, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222305, 'error': None, 'target': 'ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.939 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1742c858-c77a-497a-9629-6e779fb65a96]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe11:a90c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 453953, 'tstamp': 453953}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222306, 'error': None, 'target': 'ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:26 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:26.963 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e0b63e27-95f7-41c0-960f-63f6bb5800c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4ae8e9ca-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:a9:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453953, 'reachable_time': 44184, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222307, 'error': None, 'target': 'ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.003 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b08c4b9e-bdaf-4ec3-b5e3-80f871cef8e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.093 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[351d969a-22fb-4d97-93e1-a6a227296661]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.095 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ae8e9ca-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.095 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.096 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ae8e9ca-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:27 compute-1 NetworkManager[54952]: <info>  [1769039967.0994] manager: (tap4ae8e9ca-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Jan 21 23:59:27 compute-1 kernel: tap4ae8e9ca-30: entered promiscuous mode
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.098 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.103 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4ae8e9ca-30, col_values=(('external_ids', {'iface-id': '9328bb99-12eb-4e9f-9bcb-95844b674407'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.105 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:27 compute-1 ovn_controller[94841]: 2026-01-21T23:59:27Z|00287|binding|INFO|Releasing lport 9328bb99-12eb-4e9f-9bcb-95844b674407 from this chassis (sb_readonly=0)
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.131 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.132 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4ae8e9ca-350e-4d38-9fe2-01d17d47544e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4ae8e9ca-350e-4d38-9fe2-01d17d47544e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.133 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5af8e5-ed6a-471c-af4b-461acdea21c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.134 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: global
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-4ae8e9ca-350e-4d38-9fe2-01d17d47544e
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/4ae8e9ca-350e-4d38-9fe2-01d17d47544e.pid.haproxy
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: defaults
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     log global
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 4ae8e9ca-350e-4d38-9fe2-01d17d47544e
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.135 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'env', 'PROCESS_TAG=haproxy-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4ae8e9ca-350e-4d38-9fe2-01d17d47544e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:59:27 compute-1 podman[222340]: 2026-01-21 23:59:27.578204225 +0000 UTC m=+0.073747655 container create 7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 23:59:27 compute-1 systemd[1]: Started libpod-conmon-7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe.scope.
Jan 21 23:59:27 compute-1 podman[222340]: 2026-01-21 23:59:27.54333506 +0000 UTC m=+0.038878540 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:59:27 compute-1 systemd[1]: Started libcrun container.
Jan 21 23:59:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/496f9ed9fa8bcdd8ca5ca001e6ca60188c6a1726c8a0215bf84f6f000dda1646/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.680 182717 DEBUG nova.compute.manager [req-2b7b148d-a2b5-4889-a1fc-4c600182c64b req-248f17cf-c1d8-40c7-998b-4aaea98a3de5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-plugged-da0329e1-27a9-4900-91e5-aff8efb5d066 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.680 182717 DEBUG oslo_concurrency.lockutils [req-2b7b148d-a2b5-4889-a1fc-4c600182c64b req-248f17cf-c1d8-40c7-998b-4aaea98a3de5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.681 182717 DEBUG oslo_concurrency.lockutils [req-2b7b148d-a2b5-4889-a1fc-4c600182c64b req-248f17cf-c1d8-40c7-998b-4aaea98a3de5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.681 182717 DEBUG oslo_concurrency.lockutils [req-2b7b148d-a2b5-4889-a1fc-4c600182c64b req-248f17cf-c1d8-40c7-998b-4aaea98a3de5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.682 182717 DEBUG nova.compute.manager [req-2b7b148d-a2b5-4889-a1fc-4c600182c64b req-248f17cf-c1d8-40c7-998b-4aaea98a3de5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Processing event network-vif-plugged-da0329e1-27a9-4900-91e5-aff8efb5d066 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.682 182717 DEBUG nova.compute.manager [req-2b7b148d-a2b5-4889-a1fc-4c600182c64b req-248f17cf-c1d8-40c7-998b-4aaea98a3de5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-plugged-da0329e1-27a9-4900-91e5-aff8efb5d066 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.682 182717 DEBUG oslo_concurrency.lockutils [req-2b7b148d-a2b5-4889-a1fc-4c600182c64b req-248f17cf-c1d8-40c7-998b-4aaea98a3de5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.683 182717 DEBUG oslo_concurrency.lockutils [req-2b7b148d-a2b5-4889-a1fc-4c600182c64b req-248f17cf-c1d8-40c7-998b-4aaea98a3de5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.683 182717 DEBUG oslo_concurrency.lockutils [req-2b7b148d-a2b5-4889-a1fc-4c600182c64b req-248f17cf-c1d8-40c7-998b-4aaea98a3de5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.683 182717 DEBUG nova.compute.manager [req-2b7b148d-a2b5-4889-a1fc-4c600182c64b req-248f17cf-c1d8-40c7-998b-4aaea98a3de5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] No waiting events found dispatching network-vif-plugged-da0329e1-27a9-4900-91e5-aff8efb5d066 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.684 182717 WARNING nova.compute.manager [req-2b7b148d-a2b5-4889-a1fc-4c600182c64b req-248f17cf-c1d8-40c7-998b-4aaea98a3de5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received unexpected event network-vif-plugged-da0329e1-27a9-4900-91e5-aff8efb5d066 for instance with vm_state building and task_state spawning.
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.684 182717 DEBUG nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:59:27 compute-1 podman[222340]: 2026-01-21 23:59:27.686253768 +0000 UTC m=+0.181797248 container init 7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.692 182717 DEBUG nova.network.neutron [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updating instance_info_cache with network_info: [{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:27 compute-1 podman[222340]: 2026-01-21 23:59:27.696255656 +0000 UTC m=+0.191799076 container start 7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.696 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769039967.6957366, be4dacee-6b35-4e82-ba71-d6e8b745dfa8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.696 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] VM Resumed (Lifecycle Event)
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.699 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.705 182717 INFO nova.virt.libvirt.driver [-] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Instance spawned successfully.
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.705 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.722 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.725 182717 DEBUG oslo_concurrency.lockutils [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.729 182717 DEBUG oslo_concurrency.lockutils [req-c485292f-ee64-4eca-86ef-9eb0706430a4 req-d1c0ed62-239b-4850-9c3f-742e9157ca66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.729 182717 DEBUG nova.network.neutron [req-c485292f-ee64-4eca-86ef-9eb0706430a4 req-d1c0ed62-239b-4850-9c3f-742e9157ca66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing network info cache for port c596dfbe-ce59-4ab9-8cad-bd4144812420 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:59:27 compute-1 neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e[222355]: [NOTICE]   (222359) : New worker (222361) forked
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.738 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:59:27 compute-1 neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e[222355]: [NOTICE]   (222359) : Loading success.
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.743 182717 DEBUG nova.virt.libvirt.vif [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-821479065',display_name='tempest-tempest.common.compute-instance-821479065',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-821479065',id=75,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:58:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-tqp0ax95',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30c7da24-de00-4067-a5d2-f36ad21391c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.744 182717 DEBUG nova.network.os_vif_util [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.745 182717 DEBUG nova.network.os_vif_util [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.746 182717 DEBUG os_vif [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.747 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.748 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.749 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.759 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.759 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc596dfbe-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.760 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc596dfbe-ce, col_values=(('external_ids', {'iface-id': 'c596dfbe-ce59-4ab9-8cad-bd4144812420', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:23:22', 'vm-uuid': '30c7da24-de00-4067-a5d2-f36ad21391c5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.763 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:27 compute-1 NetworkManager[54952]: <info>  [1769039967.7641] manager: (tapc596dfbe-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.769 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.769 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.770 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.770 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.771 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.771 182717 DEBUG nova.virt.libvirt.driver [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.776 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.777 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.778 182717 INFO os_vif [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce')
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.778 182717 DEBUG nova.virt.libvirt.vif [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-821479065',display_name='tempest-tempest.common.compute-instance-821479065',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-821479065',id=75,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:58:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-tqp0ax95',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30c7da24-de00-4067-a5d2-f36ad21391c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.779 182717 DEBUG nova.network.os_vif_util [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.779 104184 INFO neutron.agent.ovn.metadata.agent [-] Port a899c1d3-f433-4476-a304-705e518f0bea in datapath 4ae8e9ca-350e-4d38-9fe2-01d17d47544e unbound from our chassis
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.779 182717 DEBUG nova.network.os_vif_util [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.782 182717 DEBUG nova.virt.libvirt.guest [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] attach device xml: <interface type="ethernet">
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:24:23:22"/>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <target dev="tapc596dfbe-ce"/>
Jan 21 23:59:27 compute-1 nova_compute[182713]: </interface>
Jan 21 23:59:27 compute-1 nova_compute[182713]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.784 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4ae8e9ca-350e-4d38-9fe2-01d17d47544e
Jan 21 23:59:27 compute-1 NetworkManager[54952]: <info>  [1769039967.7970] manager: (tapc596dfbe-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/146)
Jan 21 23:59:27 compute-1 kernel: tapc596dfbe-ce: entered promiscuous mode
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.803 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:27 compute-1 ovn_controller[94841]: 2026-01-21T23:59:27Z|00288|binding|INFO|Claiming lport c596dfbe-ce59-4ab9-8cad-bd4144812420 for this chassis.
Jan 21 23:59:27 compute-1 ovn_controller[94841]: 2026-01-21T23:59:27Z|00289|binding|INFO|c596dfbe-ce59-4ab9-8cad-bd4144812420: Claiming fa:16:3e:24:23:22 10.100.0.14
Jan 21 23:59:27 compute-1 NetworkManager[54952]: <info>  [1769039967.8074] device (tapc596dfbe-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:59:27 compute-1 NetworkManager[54952]: <info>  [1769039967.8094] device (tapc596dfbe-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.813 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[213643df-dd60-4da5-8dbe-6db60b0f698e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.826 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:23:22 10.100.0.14'], port_security=['fa:16:3e:24:23:22 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1512441673', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '30c7da24-de00-4067-a5d2-f36ad21391c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1512441673', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '2', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=c596dfbe-ce59-4ab9-8cad-bd4144812420) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:27 compute-1 ovn_controller[94841]: 2026-01-21T23:59:27Z|00290|binding|INFO|Setting lport c596dfbe-ce59-4ab9-8cad-bd4144812420 ovn-installed in OVS
Jan 21 23:59:27 compute-1 ovn_controller[94841]: 2026-01-21T23:59:27Z|00291|binding|INFO|Setting lport c596dfbe-ce59-4ab9-8cad-bd4144812420 up in Southbound
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.836 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.840 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.865 182717 INFO nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Took 17.01 seconds to spawn the instance on the hypervisor.
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.866 182717 DEBUG nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.868 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[d41fd6cd-8105-405c-8b9e-eea889e854f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.872 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7acf8b9e-01af-47d8-b2d7-b60185dd8e86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.903 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[65370e45-5ef4-40ef-9750-ce351ebb3f67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.908 182717 DEBUG nova.virt.libvirt.driver [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.908 182717 DEBUG nova.virt.libvirt.driver [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.908 182717 DEBUG nova.virt.libvirt.driver [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:80:88:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.909 182717 DEBUG nova.virt.libvirt.driver [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:24:23:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.924 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6a750fc4-639c-467a-8ea6-b805402d19e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4ae8e9ca-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:a9:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453953, 'reachable_time': 44184, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222391, 'error': None, 'target': 'ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.943 182717 DEBUG nova.virt.libvirt.guest [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <nova:name>tempest-tempest.common.compute-instance-821479065</nova:name>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:59:27</nova:creationTime>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:59:27 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:59:27 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:59:27 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:59:27 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:59:27 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:59:27 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:59:27 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:59:27 compute-1 nova_compute[182713]:     <nova:port uuid="917524f9-5334-4b3d-b16f-b9686b1c3528">
Jan 21 23:59:27 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:59:27 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:59:27 compute-1 nova_compute[182713]:     <nova:port uuid="c596dfbe-ce59-4ab9-8cad-bd4144812420">
Jan 21 23:59:27 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 23:59:27 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:59:27 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:59:27 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:59:27 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.943 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4d4540-27bf-4f13-8fed-c6570853a0d7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4ae8e9ca-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 453969, 'tstamp': 453969}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222396, 'error': None, 'target': 'ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap4ae8e9ca-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 453973, 'tstamp': 453973}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222396, 'error': None, 'target': 'ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.945 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ae8e9ca-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.948 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ae8e9ca-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.948 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.948 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.949 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4ae8e9ca-30, col_values=(('external_ids', {'iface-id': '9328bb99-12eb-4e9f-9bcb-95844b674407'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.949 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.950 104184 INFO neutron.agent.ovn.metadata.agent [-] Port c596dfbe-ce59-4ab9-8cad-bd4144812420 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 unbound from our chassis
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.951 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:59:27 compute-1 podman[222376]: 2026-01-21 23:59:27.954044226 +0000 UTC m=+0.090797171 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal 
Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 23:59:27 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:27.966 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[afc65157-2d05-4ab5-8e63-e1a50e9bd8c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.983 182717 DEBUG oslo_concurrency.lockutils [None req-8cda5444-a8c8-4484-901e-917e1d85b993 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-30c7da24-de00-4067-a5d2-f36ad21391c5-c596dfbe-ce59-4ab9-8cad-bd4144812420" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:27 compute-1 nova_compute[182713]: 2026-01-21 23:59:27.984 182717 INFO nova.compute.manager [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Took 17.86 seconds to build instance.
Jan 21 23:59:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:28.001 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[4c682d52-8de3-4071-9306-0468e9cd4396]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:28.003 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[9aba50a9-1704-4f74-b2b9-0246db5635e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.012 182717 DEBUG oslo_concurrency.lockutils [None req-8f7bd3b7-8d43-4fd6-84e3-31c784ac5bda 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:28.030 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[fe113089-9c02-406c-8647-8fce81cc33fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:28.048 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[28e4560e-791d-44e4-bd31-5b223d647675]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450444, 'reachable_time': 24854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222406, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:28.067 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[882f5e28-93b0-4d4d-a61b-6d47502e2c6e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450456, 'tstamp': 450456}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222407, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450459, 'tstamp': 450459}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222407, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:28.068 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.070 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.071 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:28.071 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:28.072 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:28.072 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:28 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:28.073 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.184 182717 DEBUG nova.compute.manager [req-0eccd0a6-2251-4d37-9b09-5bff2d4d242c req-b155402e-f07b-4b78-97fe-7f9ce7101b39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-plugged-a899c1d3-f433-4476-a304-705e518f0bea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.185 182717 DEBUG oslo_concurrency.lockutils [req-0eccd0a6-2251-4d37-9b09-5bff2d4d242c req-b155402e-f07b-4b78-97fe-7f9ce7101b39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.187 182717 DEBUG oslo_concurrency.lockutils [req-0eccd0a6-2251-4d37-9b09-5bff2d4d242c req-b155402e-f07b-4b78-97fe-7f9ce7101b39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.187 182717 DEBUG oslo_concurrency.lockutils [req-0eccd0a6-2251-4d37-9b09-5bff2d4d242c req-b155402e-f07b-4b78-97fe-7f9ce7101b39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.188 182717 DEBUG nova.compute.manager [req-0eccd0a6-2251-4d37-9b09-5bff2d4d242c req-b155402e-f07b-4b78-97fe-7f9ce7101b39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] No waiting events found dispatching network-vif-plugged-a899c1d3-f433-4476-a304-705e518f0bea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.189 182717 WARNING nova.compute.manager [req-0eccd0a6-2251-4d37-9b09-5bff2d4d242c req-b155402e-f07b-4b78-97fe-7f9ce7101b39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received unexpected event network-vif-plugged-a899c1d3-f433-4476-a304-705e518f0bea for instance with vm_state active and task_state None.
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.269 182717 DEBUG nova.compute.manager [req-0f70eef7-e84f-400a-97d3-2d55c28f275f req-0c07266a-d903-4901-91ca-d1e8fc06a6c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.270 182717 DEBUG oslo_concurrency.lockutils [req-0f70eef7-e84f-400a-97d3-2d55c28f275f req-0c07266a-d903-4901-91ca-d1e8fc06a6c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.270 182717 DEBUG oslo_concurrency.lockutils [req-0f70eef7-e84f-400a-97d3-2d55c28f275f req-0c07266a-d903-4901-91ca-d1e8fc06a6c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.270 182717 DEBUG oslo_concurrency.lockutils [req-0f70eef7-e84f-400a-97d3-2d55c28f275f req-0c07266a-d903-4901-91ca-d1e8fc06a6c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.271 182717 DEBUG nova.compute.manager [req-0f70eef7-e84f-400a-97d3-2d55c28f275f req-0c07266a-d903-4901-91ca-d1e8fc06a6c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] No waiting events found dispatching network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.271 182717 WARNING nova.compute.manager [req-0f70eef7-e84f-400a-97d3-2d55c28f275f req-0c07266a-d903-4901-91ca-d1e8fc06a6c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received unexpected event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 for instance with vm_state active and task_state None.
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.447 182717 DEBUG nova.compute.manager [req-73a80df6-199d-4389-a32d-699d560e5c53 req-3f2564f0-09d7-4513-9c17-234ef0ec6874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-plugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.447 182717 DEBUG oslo_concurrency.lockutils [req-73a80df6-199d-4389-a32d-699d560e5c53 req-3f2564f0-09d7-4513-9c17-234ef0ec6874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.447 182717 DEBUG oslo_concurrency.lockutils [req-73a80df6-199d-4389-a32d-699d560e5c53 req-3f2564f0-09d7-4513-9c17-234ef0ec6874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.448 182717 DEBUG oslo_concurrency.lockutils [req-73a80df6-199d-4389-a32d-699d560e5c53 req-3f2564f0-09d7-4513-9c17-234ef0ec6874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.448 182717 DEBUG nova.compute.manager [req-73a80df6-199d-4389-a32d-699d560e5c53 req-3f2564f0-09d7-4513-9c17-234ef0ec6874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] No waiting events found dispatching network-vif-plugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:28 compute-1 nova_compute[182713]: 2026-01-21 23:59:28.448 182717 WARNING nova.compute.manager [req-73a80df6-199d-4389-a32d-699d560e5c53 req-3f2564f0-09d7-4513-9c17-234ef0ec6874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received unexpected event network-vif-plugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f for instance with vm_state active and task_state None.
Jan 21 23:59:29 compute-1 ovn_controller[94841]: 2026-01-21T23:59:29Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:23:22 10.100.0.14
Jan 21 23:59:29 compute-1 ovn_controller[94841]: 2026-01-21T23:59:29Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:23:22 10.100.0.14
Jan 21 23:59:29 compute-1 nova_compute[182713]: 2026-01-21 23:59:29.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:29 compute-1 nova_compute[182713]: 2026-01-21 23:59:29.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:29 compute-1 nova_compute[182713]: 2026-01-21 23:59:29.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:30 compute-1 nova_compute[182713]: 2026-01-21 23:59:30.380 182717 DEBUG nova.compute.manager [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:30 compute-1 nova_compute[182713]: 2026-01-21 23:59:30.380 182717 DEBUG oslo_concurrency.lockutils [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:30 compute-1 nova_compute[182713]: 2026-01-21 23:59:30.381 182717 DEBUG oslo_concurrency.lockutils [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:30 compute-1 nova_compute[182713]: 2026-01-21 23:59:30.381 182717 DEBUG oslo_concurrency.lockutils [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:30 compute-1 nova_compute[182713]: 2026-01-21 23:59:30.382 182717 DEBUG nova.compute.manager [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] No waiting events found dispatching network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:30 compute-1 nova_compute[182713]: 2026-01-21 23:59:30.382 182717 WARNING nova.compute.manager [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received unexpected event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 for instance with vm_state active and task_state None.
Jan 21 23:59:30 compute-1 nova_compute[182713]: 2026-01-21 23:59:30.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.208 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.690 182717 DEBUG oslo_concurrency.lockutils [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "interface-30c7da24-de00-4067-a5d2-f36ad21391c5-c596dfbe-ce59-4ab9-8cad-bd4144812420" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.691 182717 DEBUG oslo_concurrency.lockutils [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-30c7da24-de00-4067-a5d2-f36ad21391c5-c596dfbe-ce59-4ab9-8cad-bd4144812420" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.706 182717 DEBUG nova.objects.instance [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'flavor' on Instance uuid 30c7da24-de00-4067-a5d2-f36ad21391c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.743 182717 DEBUG nova.virt.libvirt.vif [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-821479065',display_name='tempest-tempest.common.compute-instance-821479065',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-821479065',id=75,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:58:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-tqp0ax95',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30c7da24-de00-4067-a5d2-f36ad21391c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.743 182717 DEBUG nova.network.os_vif_util [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.744 182717 DEBUG nova.network.os_vif_util [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.747 182717 DEBUG nova.virt.libvirt.guest [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.749 182717 DEBUG nova.virt.libvirt.guest [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.751 182717 DEBUG nova.virt.libvirt.driver [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Attempting to detach device tapc596dfbe-ce from instance 30c7da24-de00-4067-a5d2-f36ad21391c5 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.751 182717 DEBUG nova.virt.libvirt.guest [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] detach device xml: <interface type="ethernet">
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:24:23:22"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <target dev="tapc596dfbe-ce"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]: </interface>
Jan 21 23:59:31 compute-1 nova_compute[182713]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.763 182717 DEBUG nova.virt.libvirt.guest [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.766 182717 DEBUG nova.virt.libvirt.guest [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface>not found in domain: <domain type='kvm' id='35'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <name>instance-0000004b</name>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <uuid>30c7da24-de00-4067-a5d2-f36ad21391c5</uuid>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:name>tempest-tempest.common.compute-instance-821479065</nova:name>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:59:27</nova:creationTime>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:port uuid="917524f9-5334-4b3d-b16f-b9686b1c3528">
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:port uuid="c596dfbe-ce59-4ab9-8cad-bd4144812420">
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:59:31 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <resource>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </resource>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <system>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='serial'>30c7da24-de00-4067-a5d2-f36ad21391c5</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='uuid'>30c7da24-de00-4067-a5d2-f36ad21391c5</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </system>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <os>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </os>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <features>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </features>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk' index='2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:         <backingStore/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       </backingStore>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk.config' index='1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <backingStore/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <readonly/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:80:88:c9'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target dev='tap917524f9-53'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:24:23:22'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target dev='tapc596dfbe-ce'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='net1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/console.log' append='off'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       </target>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/console.log' append='off'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </console>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </input>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </input>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </input>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </graphics>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <video>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </video>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </watchdog>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c540,c550</label>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c540,c550</imagelabel>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:59:31 compute-1 nova_compute[182713]: </domain>
Jan 21 23:59:31 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.768 182717 INFO nova.virt.libvirt.driver [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully detached device tapc596dfbe-ce from instance 30c7da24-de00-4067-a5d2-f36ad21391c5 from the persistent domain config.
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.768 182717 DEBUG nova.virt.libvirt.driver [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] (1/8): Attempting to detach device tapc596dfbe-ce with device alias net1 from instance 30c7da24-de00-4067-a5d2-f36ad21391c5 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.768 182717 DEBUG nova.virt.libvirt.guest [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] detach device xml: <interface type="ethernet">
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:24:23:22"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <target dev="tapc596dfbe-ce"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]: </interface>
Jan 21 23:59:31 compute-1 nova_compute[182713]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 21 23:59:31 compute-1 kernel: tapc596dfbe-ce (unregistering): left promiscuous mode
Jan 21 23:59:31 compute-1 NetworkManager[54952]: <info>  [1769039971.8144] device (tapc596dfbe-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:59:31 compute-1 ovn_controller[94841]: 2026-01-21T23:59:31Z|00292|binding|INFO|Releasing lport c596dfbe-ce59-4ab9-8cad-bd4144812420 from this chassis (sb_readonly=0)
Jan 21 23:59:31 compute-1 ovn_controller[94841]: 2026-01-21T23:59:31Z|00293|binding|INFO|Setting lport c596dfbe-ce59-4ab9-8cad-bd4144812420 down in Southbound
Jan 21 23:59:31 compute-1 ovn_controller[94841]: 2026-01-21T23:59:31Z|00294|binding|INFO|Removing iface tapc596dfbe-ce ovn-installed in OVS
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.832 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:31.839 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:23:22 10.100.0.14'], port_security=['fa:16:3e:24:23:22 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1512441673', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '30c7da24-de00-4067-a5d2-f36ad21391c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1512441673', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '4', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=c596dfbe-ce59-4ab9-8cad-bd4144812420) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:31.840 104184 INFO neutron.agent.ovn.metadata.agent [-] Port c596dfbe-ce59-4ab9-8cad-bd4144812420 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 unbound from our chassis
Jan 21 23:59:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:31.843 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.864 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.865 182717 DEBUG nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Received event <DeviceRemovedEvent: 1769039971.859428, 30c7da24-de00-4067-a5d2-f36ad21391c5 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 21 23:59:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:31.865 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[196c9300-247f-4c8a-a85f-50c85e04c5fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.866 182717 DEBUG nova.virt.libvirt.driver [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Start waiting for the detach event from libvirt for device tapc596dfbe-ce with device alias net1 for instance 30c7da24-de00-4067-a5d2-f36ad21391c5 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.866 182717 DEBUG nova.virt.libvirt.guest [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.874 182717 DEBUG nova.virt.libvirt.guest [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface>not found in domain: <domain type='kvm' id='35'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <name>instance-0000004b</name>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <uuid>30c7da24-de00-4067-a5d2-f36ad21391c5</uuid>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <metadata>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:name>tempest-tempest.common.compute-instance-821479065</nova:name>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:59:27</nova:creationTime>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:port uuid="917524f9-5334-4b3d-b16f-b9686b1c3528">
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:port uuid="c596dfbe-ce59-4ab9-8cad-bd4144812420">
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:59:31 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </metadata>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <resource>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </resource>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <system>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='serial'>30c7da24-de00-4067-a5d2-f36ad21391c5</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='uuid'>30c7da24-de00-4067-a5d2-f36ad21391c5</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </system>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </sysinfo>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <os>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </os>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <features>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <acpi/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <apic/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </features>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </cpu>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </clock>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <devices>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk' index='2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:         <backingStore/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       </backingStore>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk.config' index='1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <backingStore/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <readonly/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </disk>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </controller>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:80:88:c9'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target dev='tap917524f9-53'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </interface>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/console.log' append='off'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       </target>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </serial>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/console.log' append='off'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </console>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </input>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </input>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </input>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </graphics>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <video>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </video>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </watchdog>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </memballoon>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </rng>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </devices>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c540,c550</label>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c540,c550</imagelabel>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </seclabel>
Jan 21 23:59:31 compute-1 nova_compute[182713]: </domain>
Jan 21 23:59:31 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.877 182717 INFO nova.virt.libvirt.driver [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully detached device tapc596dfbe-ce from instance 30c7da24-de00-4067-a5d2-f36ad21391c5 from the live domain config.
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.877 182717 DEBUG nova.virt.libvirt.vif [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-821479065',display_name='tempest-tempest.common.compute-instance-821479065',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-821479065',id=75,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:58:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-tqp0ax95',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30c7da24-de00-4067-a5d2-f36ad21391c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.878 182717 DEBUG nova.network.os_vif_util [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.879 182717 DEBUG nova.network.os_vif_util [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.879 182717 DEBUG os_vif [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.883 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.884 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc596dfbe-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.885 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.888 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.901 182717 INFO os_vif [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce')
Jan 21 23:59:31 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.902 182717 DEBUG nova.virt.libvirt.guest [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:name>tempest-tempest.common.compute-instance-821479065</nova:name>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-21 23:59:31</nova:creationTime>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:owner>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </nova:owner>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   <nova:ports>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     <nova:port uuid="917524f9-5334-4b3d-b16f-b9686b1c3528">
Jan 21 23:59:31 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:59:31 compute-1 nova_compute[182713]:     </nova:port>
Jan 21 23:59:31 compute-1 nova_compute[182713]:   </nova:ports>
Jan 21 23:59:31 compute-1 nova_compute[182713]: </nova:instance>
Jan 21 23:59:31 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 21 23:59:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:31.913 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a35d62d8-2c15-4657-8ff8-207545f4649f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:31.916 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a34d00eb-5176-4449-907d-c16b91527cff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:31.951 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[4b029834-8125-4ee4-bfdc-5ae600068c3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:31.969 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[accdb6d5-a0a2-4b00-a8f6-06b05d9a8ef6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450444, 'reachable_time': 24854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222419, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:31.995 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3266cf9d-08ad-45a8-9c66-fdcffd7b3248]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450456, 'tstamp': 450456}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222420, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450459, 'tstamp': 450459}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222420, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:31 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:31.998 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:32 compute-1 nova_compute[182713]: 2026-01-21 23:59:31.999 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:32 compute-1 nova_compute[182713]: 2026-01-21 23:59:32.001 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:32.003 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:32.003 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:32.004 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:32 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:32.004 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:32 compute-1 nova_compute[182713]: 2026-01-21 23:59:32.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:32 compute-1 nova_compute[182713]: 2026-01-21 23:59:32.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:32 compute-1 nova_compute[182713]: 2026-01-21 23:59:32.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:32 compute-1 nova_compute[182713]: 2026-01-21 23:59:32.884 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:32 compute-1 nova_compute[182713]: 2026-01-21 23:59:32.884 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:59:32 compute-1 nova_compute[182713]: 2026-01-21 23:59:32.956 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.032 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.033 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.103 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.108 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.179 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.180 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.232 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.413 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.415 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5350MB free_disk=73.27392196655273GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.415 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.415 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.474 182717 DEBUG nova.network.neutron [req-c485292f-ee64-4eca-86ef-9eb0706430a4 req-d1c0ed62-239b-4850-9c3f-742e9157ca66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updated VIF entry in instance network info cache for port c596dfbe-ce59-4ab9-8cad-bd4144812420. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.474 182717 DEBUG nova.network.neutron [req-c485292f-ee64-4eca-86ef-9eb0706430a4 req-d1c0ed62-239b-4850-9c3f-742e9157ca66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updating instance_info_cache with network_info: [{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.534 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 30c7da24-de00-4067-a5d2-f36ad21391c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.535 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance be4dacee-6b35-4e82-ba71-d6e8b745dfa8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.535 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.536 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.542 182717 DEBUG oslo_concurrency.lockutils [req-c485292f-ee64-4eca-86ef-9eb0706430a4 req-d1c0ed62-239b-4850-9c3f-742e9157ca66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.543 182717 DEBUG oslo_concurrency.lockutils [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.544 182717 DEBUG nova.network.neutron [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing network info cache for port 917524f9-5334-4b3d-b16f-b9686b1c3528 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.627 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.644 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.676 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:59:33 compute-1 nova_compute[182713]: 2026-01-21 23:59:33.676 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:34 compute-1 nova_compute[182713]: 2026-01-21 23:59:34.674 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:34 compute-1 nova_compute[182713]: 2026-01-21 23:59:34.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:34 compute-1 nova_compute[182713]: 2026-01-21 23:59:34.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:59:34 compute-1 nova_compute[182713]: 2026-01-21 23:59:34.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.468 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.688 182717 DEBUG oslo_concurrency.lockutils [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.689 182717 DEBUG oslo_concurrency.lockutils [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.689 182717 DEBUG oslo_concurrency.lockutils [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.689 182717 DEBUG oslo_concurrency.lockutils [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.690 182717 DEBUG oslo_concurrency.lockutils [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.701 182717 INFO nova.compute.manager [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Terminating instance
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.712 182717 DEBUG nova.compute.manager [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:59:35 compute-1 kernel: tap9330cb5d-3c (unregistering): left promiscuous mode
Jan 21 23:59:35 compute-1 NetworkManager[54952]: <info>  [1769039975.7305] device (tap9330cb5d-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:59:35 compute-1 ovn_controller[94841]: 2026-01-21T23:59:35Z|00295|binding|INFO|Releasing lport 9330cb5d-3c3f-499b-9d0c-ddc0fee6838f from this chassis (sb_readonly=0)
Jan 21 23:59:35 compute-1 ovn_controller[94841]: 2026-01-21T23:59:35Z|00296|binding|INFO|Setting lport 9330cb5d-3c3f-499b-9d0c-ddc0fee6838f down in Southbound
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.732 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:35 compute-1 ovn_controller[94841]: 2026-01-21T23:59:35Z|00297|binding|INFO|Removing iface tap9330cb5d-3c ovn-installed in OVS
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.739 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:22:0c 10.100.0.143'], port_security=['fa:16:3e:9f:22:0c 10.100.0.143'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.143/24', 'neutron:device_id': 'be4dacee-6b35-4e82-ba71-d6e8b745dfa8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ca9fa796-afc7-4732-b32a-7e6314132071', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9dc661b-0eaf-42dd-bed8-0f2f7383c18d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=9330cb5d-3c3f-499b-9d0c-ddc0fee6838f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.740 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 9330cb5d-3c3f-499b-9d0c-ddc0fee6838f in datapath 4ae8e9ca-350e-4d38-9fe2-01d17d47544e unbound from our chassis
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.741 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4ae8e9ca-350e-4d38-9fe2-01d17d47544e
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.759 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5db4dfec-3e5d-4337-b901-13f152a3bb82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.761 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:35 compute-1 kernel: tapda0329e1-27 (unregistering): left promiscuous mode
Jan 21 23:59:35 compute-1 NetworkManager[54952]: <info>  [1769039975.7711] device (tapda0329e1-27): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.779 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:35 compute-1 ovn_controller[94841]: 2026-01-21T23:59:35Z|00298|binding|INFO|Releasing lport da0329e1-27a9-4900-91e5-aff8efb5d066 from this chassis (sb_readonly=0)
Jan 21 23:59:35 compute-1 ovn_controller[94841]: 2026-01-21T23:59:35Z|00299|binding|INFO|Setting lport da0329e1-27a9-4900-91e5-aff8efb5d066 down in Southbound
Jan 21 23:59:35 compute-1 ovn_controller[94841]: 2026-01-21T23:59:35Z|00300|binding|INFO|Removing iface tapda0329e1-27 ovn-installed in OVS
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.782 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.788 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:b9:f9 10.100.1.127'], port_security=['fa:16:3e:66:b9:f9 10.100.1.127'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.127/24', 'neutron:device_id': 'be4dacee-6b35-4e82-ba71-d6e8b745dfa8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54a6b0a6-096e-4f61-a504-b2a9810b3844', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ca9fa796-afc7-4732-b32a-7e6314132071', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88c3c5ff-f5e5-415a-a727-ca4d025fff63, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=da0329e1-27a9-4900-91e5-aff8efb5d066) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:35 compute-1 kernel: tapa899c1d3-f4 (unregistering): left promiscuous mode
Jan 21 23:59:35 compute-1 NetworkManager[54952]: <info>  [1769039975.7962] device (tapa899c1d3-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.797 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[695f1c8d-215d-4b63-8d7b-1c557416bee2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.799 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[92acce01-0bd2-4e9f-98c5-b978a36246ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.804 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:35 compute-1 ovn_controller[94841]: 2026-01-21T23:59:35Z|00301|binding|INFO|Releasing lport a899c1d3-f433-4476-a304-705e518f0bea from this chassis (sb_readonly=0)
Jan 21 23:59:35 compute-1 ovn_controller[94841]: 2026-01-21T23:59:35Z|00302|binding|INFO|Setting lport a899c1d3-f433-4476-a304-705e518f0bea down in Southbound
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.815 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:35 compute-1 ovn_controller[94841]: 2026-01-21T23:59:35Z|00303|binding|INFO|Removing iface tapa899c1d3-f4 ovn-installed in OVS
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.817 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.828 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:a0:ba 10.100.0.228'], port_security=['fa:16:3e:e4:a0:ba 10.100.0.228'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.228/24', 'neutron:device_id': 'be4dacee-6b35-4e82-ba71-d6e8b745dfa8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ca9fa796-afc7-4732-b32a-7e6314132071', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9dc661b-0eaf-42dd-bed8-0f2f7383c18d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=a899c1d3-f433-4476-a304-705e518f0bea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.830 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.833 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[26652b6c-63e2-48ce-bdbb-6ea4e949c478]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:35 compute-1 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Jan 21 23:59:35 compute-1 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004e.scope: Consumed 8.445s CPU time.
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.851 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[79e9da64-7888-4f9a-a91d-9b8d036bc0e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4ae8e9ca-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:a9:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453953, 'reachable_time': 44184, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222457, 'error': None, 'target': 'ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:35 compute-1 systemd-machined[153970]: Machine qemu-36-instance-0000004e terminated.
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.872 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ef352580-6a2d-4223-9007-7956fe7e078c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4ae8e9ca-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 453969, 'tstamp': 453969}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222458, 'error': None, 'target': 'ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap4ae8e9ca-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 453973, 'tstamp': 453973}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222458, 'error': None, 'target': 'ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.874 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ae8e9ca-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.875 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:35 compute-1 nova_compute[182713]: 2026-01-21 23:59:35.887 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.887 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ae8e9ca-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.888 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.888 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4ae8e9ca-30, col_values=(('external_ids', {'iface-id': '9328bb99-12eb-4e9f-9bcb-95844b674407'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.888 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.889 104184 INFO neutron.agent.ovn.metadata.agent [-] Port da0329e1-27a9-4900-91e5-aff8efb5d066 in datapath 54a6b0a6-096e-4f61-a504-b2a9810b3844 unbound from our chassis
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.890 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 54a6b0a6-096e-4f61-a504-b2a9810b3844, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.891 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c016cf-05ee-4e8e-9225-d53cfa7af3b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:35 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:35.892 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844 namespace which is not needed anymore
Jan 21 23:59:35 compute-1 NetworkManager[54952]: <info>  [1769039975.9477] manager: (tapda0329e1-27): new Tun device (/org/freedesktop/NetworkManager/Devices/147)
Jan 21 23:59:35 compute-1 NetworkManager[54952]: <info>  [1769039975.9554] manager: (tapa899c1d3-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/148)
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.005 182717 INFO nova.virt.libvirt.driver [-] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Instance destroyed successfully.
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.005 182717 DEBUG nova.objects.instance [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lazy-loading 'resources' on Instance uuid be4dacee-6b35-4e82-ba71-d6e8b745dfa8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.013 182717 DEBUG nova.compute.manager [req-f531d668-5072-4ae9-abfe-4822d7e7d478 req-5b214448-d755-4cb4-85ac-d20f6566c774 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-unplugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.013 182717 DEBUG oslo_concurrency.lockutils [req-f531d668-5072-4ae9-abfe-4822d7e7d478 req-5b214448-d755-4cb4-85ac-d20f6566c774 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.013 182717 DEBUG oslo_concurrency.lockutils [req-f531d668-5072-4ae9-abfe-4822d7e7d478 req-5b214448-d755-4cb4-85ac-d20f6566c774 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.014 182717 DEBUG oslo_concurrency.lockutils [req-f531d668-5072-4ae9-abfe-4822d7e7d478 req-5b214448-d755-4cb4-85ac-d20f6566c774 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.014 182717 DEBUG nova.compute.manager [req-f531d668-5072-4ae9-abfe-4822d7e7d478 req-5b214448-d755-4cb4-85ac-d20f6566c774 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] No waiting events found dispatching network-vif-unplugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.015 182717 DEBUG nova.compute.manager [req-f531d668-5072-4ae9-abfe-4822d7e7d478 req-5b214448-d755-4cb4-85ac-d20f6566c774 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-unplugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.032 182717 DEBUG nova.virt.libvirt.vif [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1379253733',display_name='tempest-ServersTestMultiNic-server-1379253733',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1379253733',id=78,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-hvk60y87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:59:27Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=be4dacee-6b35-4e82-ba71-d6e8b745dfa8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "address": "fa:16:3e:9f:22:0c", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9330cb5d-3c", "ovs_interfaceid": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.033 182717 DEBUG nova.network.os_vif_util [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "address": "fa:16:3e:9f:22:0c", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9330cb5d-3c", "ovs_interfaceid": "9330cb5d-3c3f-499b-9d0c-ddc0fee6838f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.034 182717 DEBUG nova.network.os_vif_util [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:22:0c,bridge_name='br-int',has_traffic_filtering=True,id=9330cb5d-3c3f-499b-9d0c-ddc0fee6838f,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9330cb5d-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.034 182717 DEBUG os_vif [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:22:0c,bridge_name='br-int',has_traffic_filtering=True,id=9330cb5d-3c3f-499b-9d0c-ddc0fee6838f,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9330cb5d-3c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.037 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.037 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9330cb5d-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:36 compute-1 neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844[222280]: [NOTICE]   (222284) : haproxy version is 2.8.14-c23fe91
Jan 21 23:59:36 compute-1 neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844[222280]: [NOTICE]   (222284) : path to executable is /usr/sbin/haproxy
Jan 21 23:59:36 compute-1 neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844[222280]: [WARNING]  (222284) : Exiting Master process...
Jan 21 23:59:36 compute-1 neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844[222280]: [ALERT]    (222284) : Current worker (222286) exited with code 143 (Terminated)
Jan 21 23:59:36 compute-1 neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844[222280]: [WARNING]  (222284) : All workers exited. Exiting... (0)
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.044 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.046 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:59:36 compute-1 systemd[1]: libpod-5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4.scope: Deactivated successfully.
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.049 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 podman[222516]: 2026-01-21 23:59:36.053496387 +0000 UTC m=+0.047358251 container died 5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.053 182717 INFO os_vif [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:22:0c,bridge_name='br-int',has_traffic_filtering=True,id=9330cb5d-3c3f-499b-9d0c-ddc0fee6838f,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9330cb5d-3c')
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.055 182717 DEBUG nova.virt.libvirt.vif [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1379253733',display_name='tempest-ServersTestMultiNic-server-1379253733',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1379253733',id=78,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-hvk60y87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:59:27Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=be4dacee-6b35-4e82-ba71-d6e8b745dfa8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da0329e1-27a9-4900-91e5-aff8efb5d066", "address": "fa:16:3e:66:b9:f9", "network": {"id": "54a6b0a6-096e-4f61-a504-b2a9810b3844", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-335695391", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.127", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0329e1-27", "ovs_interfaceid": "da0329e1-27a9-4900-91e5-aff8efb5d066", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.055 182717 DEBUG nova.network.os_vif_util [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "da0329e1-27a9-4900-91e5-aff8efb5d066", "address": "fa:16:3e:66:b9:f9", "network": {"id": "54a6b0a6-096e-4f61-a504-b2a9810b3844", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-335695391", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.127", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0329e1-27", "ovs_interfaceid": "da0329e1-27a9-4900-91e5-aff8efb5d066", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.056 182717 DEBUG nova.network.os_vif_util [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:b9:f9,bridge_name='br-int',has_traffic_filtering=True,id=da0329e1-27a9-4900-91e5-aff8efb5d066,network=Network(54a6b0a6-096e-4f61-a504-b2a9810b3844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0329e1-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.057 182717 DEBUG os_vif [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:b9:f9,bridge_name='br-int',has_traffic_filtering=True,id=da0329e1-27a9-4900-91e5-aff8efb5d066,network=Network(54a6b0a6-096e-4f61-a504-b2a9810b3844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0329e1-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.058 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.059 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda0329e1-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.062 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.064 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.065 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.067 182717 INFO os_vif [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:b9:f9,bridge_name='br-int',has_traffic_filtering=True,id=da0329e1-27a9-4900-91e5-aff8efb5d066,network=Network(54a6b0a6-096e-4f61-a504-b2a9810b3844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0329e1-27')
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.068 182717 DEBUG nova.virt.libvirt.vif [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1379253733',display_name='tempest-ServersTestMultiNic-server-1379253733',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1379253733',id=78,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-hvk60y87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:59:27Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=be4dacee-6b35-4e82-ba71-d6e8b745dfa8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a899c1d3-f433-4476-a304-705e518f0bea", "address": "fa:16:3e:e4:a0:ba", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa899c1d3-f4", "ovs_interfaceid": "a899c1d3-f433-4476-a304-705e518f0bea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.068 182717 DEBUG nova.network.os_vif_util [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "a899c1d3-f433-4476-a304-705e518f0bea", "address": "fa:16:3e:e4:a0:ba", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa899c1d3-f4", "ovs_interfaceid": "a899c1d3-f433-4476-a304-705e518f0bea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.069 182717 DEBUG nova.network.os_vif_util [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a0:ba,bridge_name='br-int',has_traffic_filtering=True,id=a899c1d3-f433-4476-a304-705e518f0bea,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa899c1d3-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.069 182717 DEBUG os_vif [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a0:ba,bridge_name='br-int',has_traffic_filtering=True,id=a899c1d3-f433-4476-a304-705e518f0bea,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa899c1d3-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.071 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.071 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa899c1d3-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.072 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.076 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:59:36 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4-userdata-shm.mount: Deactivated successfully.
Jan 21 23:59:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-148b8e5d36fd5e918c3cd94bf47a8fa769f8b942990cd974b23b5fb7a1dc5022-merged.mount: Deactivated successfully.
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.079 182717 INFO os_vif [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a0:ba,bridge_name='br-int',has_traffic_filtering=True,id=a899c1d3-f433-4476-a304-705e518f0bea,network=Network(4ae8e9ca-350e-4d38-9fe2-01d17d47544e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa899c1d3-f4')
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.080 182717 INFO nova.virt.libvirt.driver [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Deleting instance files /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8_del
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.081 182717 INFO nova.virt.libvirt.driver [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Deletion of /var/lib/nova/instances/be4dacee-6b35-4e82-ba71-d6e8b745dfa8_del complete
Jan 21 23:59:36 compute-1 podman[222516]: 2026-01-21 23:59:36.08892269 +0000 UTC m=+0.082784554 container cleanup 5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:59:36 compute-1 systemd[1]: libpod-conmon-5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4.scope: Deactivated successfully.
Jan 21 23:59:36 compute-1 podman[222546]: 2026-01-21 23:59:36.183114075 +0000 UTC m=+0.063950914 container remove 5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.190 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb5bb92-1b86-48c4-aaee-d0dfc257f565]: (4, ('Wed Jan 21 11:59:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844 (5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4)\n5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4\nWed Jan 21 11:59:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844 (5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4)\n5f0ff92f1e526697a792fbc9cdebebab2edb0c475ddf98df071c87470ecff4a4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.192 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2314419a-d71c-4be7-911c-3e57aad92a18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.193 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54a6b0a6-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.194 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 kernel: tap54a6b0a6-00: left promiscuous mode
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.218 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.222 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[34f3e2b7-b4b3-4c1b-9663-e142f6cb6da7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.225 182717 INFO nova.compute.manager [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Took 0.51 seconds to destroy the instance on the hypervisor.
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.226 182717 DEBUG oslo.service.loopingcall [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.226 182717 DEBUG nova.compute.manager [-] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.227 182717 DEBUG nova.network.neutron [-] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.236 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b595419d-ef19-449a-b0b0-d623c917cfc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.237 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3cdb8adb-4b49-414c-8b98-e41a58ee121f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.257 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0096d4ba-aa97-42b2-a898-7606c5b492ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453834, 'reachable_time': 23420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222560, 'error': None, 'target': 'ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.260 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-54a6b0a6-096e-4f61-a504-b2a9810b3844 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.260 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[f2bf2afa-078e-4b96-a978-baae05615643]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.261 104184 INFO neutron.agent.ovn.metadata.agent [-] Port a899c1d3-f433-4476-a304-705e518f0bea in datapath 4ae8e9ca-350e-4d38-9fe2-01d17d47544e unbound from our chassis
Jan 21 23:59:36 compute-1 systemd[1]: run-netns-ovnmeta\x2d54a6b0a6\x2d096e\x2d4f61\x2da504\x2db2a9810b3844.mount: Deactivated successfully.
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.264 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4ae8e9ca-350e-4d38-9fe2-01d17d47544e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.266 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b096e5ea-afbb-45c6-a512-7792939c5003]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.267 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e namespace which is not needed anymore
Jan 21 23:59:36 compute-1 neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e[222355]: [NOTICE]   (222359) : haproxy version is 2.8.14-c23fe91
Jan 21 23:59:36 compute-1 neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e[222355]: [NOTICE]   (222359) : path to executable is /usr/sbin/haproxy
Jan 21 23:59:36 compute-1 neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e[222355]: [WARNING]  (222359) : Exiting Master process...
Jan 21 23:59:36 compute-1 neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e[222355]: [WARNING]  (222359) : Exiting Master process...
Jan 21 23:59:36 compute-1 neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e[222355]: [ALERT]    (222359) : Current worker (222361) exited with code 143 (Terminated)
Jan 21 23:59:36 compute-1 neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e[222355]: [WARNING]  (222359) : All workers exited. Exiting... (0)
Jan 21 23:59:36 compute-1 systemd[1]: libpod-7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe.scope: Deactivated successfully.
Jan 21 23:59:36 compute-1 podman[222576]: 2026-01-21 23:59:36.452491772 +0000 UTC m=+0.071176025 container died 7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 23:59:36 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe-userdata-shm.mount: Deactivated successfully.
Jan 21 23:59:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-496f9ed9fa8bcdd8ca5ca001e6ca60188c6a1726c8a0215bf84f6f000dda1646-merged.mount: Deactivated successfully.
Jan 21 23:59:36 compute-1 podman[222576]: 2026-01-21 23:59:36.500778622 +0000 UTC m=+0.119462855 container cleanup 7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:59:36 compute-1 systemd[1]: libpod-conmon-7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe.scope: Deactivated successfully.
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.570 182717 DEBUG nova.compute.manager [req-7b533c4f-edb7-4ef4-8785-2ab490930a3c req-ac5506a1-c1c0-4f16-a5a1-1573c7773391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-unplugged-a899c1d3-f433-4476-a304-705e518f0bea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.571 182717 DEBUG oslo_concurrency.lockutils [req-7b533c4f-edb7-4ef4-8785-2ab490930a3c req-ac5506a1-c1c0-4f16-a5a1-1573c7773391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.571 182717 DEBUG oslo_concurrency.lockutils [req-7b533c4f-edb7-4ef4-8785-2ab490930a3c req-ac5506a1-c1c0-4f16-a5a1-1573c7773391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.572 182717 DEBUG oslo_concurrency.lockutils [req-7b533c4f-edb7-4ef4-8785-2ab490930a3c req-ac5506a1-c1c0-4f16-a5a1-1573c7773391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.572 182717 DEBUG nova.compute.manager [req-7b533c4f-edb7-4ef4-8785-2ab490930a3c req-ac5506a1-c1c0-4f16-a5a1-1573c7773391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] No waiting events found dispatching network-vif-unplugged-a899c1d3-f433-4476-a304-705e518f0bea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.573 182717 DEBUG nova.compute.manager [req-7b533c4f-edb7-4ef4-8785-2ab490930a3c req-ac5506a1-c1c0-4f16-a5a1-1573c7773391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-unplugged-a899c1d3-f433-4476-a304-705e518f0bea for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:59:36 compute-1 podman[222602]: 2026-01-21 23:59:36.59864642 +0000 UTC m=+0.063929813 container remove 7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.606 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[97ab2261-3801-4c4b-b782-adbd914261c5]: (4, ('Wed Jan 21 11:59:36 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e (7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe)\n7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe\nWed Jan 21 11:59:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e (7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe)\n7c45f2ace3d5dbeec2335bb2491ce758b0eb7a1d24aba2b360d27a828b0262fe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.608 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[433eef73-9245-4a9a-a805-d765ee277586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.609 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ae8e9ca-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:36 compute-1 kernel: tap4ae8e9ca-30: left promiscuous mode
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.614 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.630 182717 DEBUG nova.compute.manager [req-09e7be8c-9517-4249-82ba-400b4ef4b8ec req-6af399f3-fa15-429d-9288-6252675ff667 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-unplugged-da0329e1-27a9-4900-91e5-aff8efb5d066 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.631 182717 DEBUG oslo_concurrency.lockutils [req-09e7be8c-9517-4249-82ba-400b4ef4b8ec req-6af399f3-fa15-429d-9288-6252675ff667 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.631 182717 DEBUG oslo_concurrency.lockutils [req-09e7be8c-9517-4249-82ba-400b4ef4b8ec req-6af399f3-fa15-429d-9288-6252675ff667 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.631 182717 DEBUG oslo_concurrency.lockutils [req-09e7be8c-9517-4249-82ba-400b4ef4b8ec req-6af399f3-fa15-429d-9288-6252675ff667 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.632 182717 DEBUG nova.compute.manager [req-09e7be8c-9517-4249-82ba-400b4ef4b8ec req-6af399f3-fa15-429d-9288-6252675ff667 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] No waiting events found dispatching network-vif-unplugged-da0329e1-27a9-4900-91e5-aff8efb5d066 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.632 182717 DEBUG nova.compute.manager [req-09e7be8c-9517-4249-82ba-400b4ef4b8ec req-6af399f3-fa15-429d-9288-6252675ff667 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-unplugged-da0329e1-27a9-4900-91e5-aff8efb5d066 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:59:36 compute-1 nova_compute[182713]: 2026-01-21 23:59:36.633 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.632 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5e8b19b3-4d8f-44ed-83b4-ee6c53fd2699]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.649 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[872dfccd-47ad-4529-a9cb-676a2af8715d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.650 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fc0583d4-855e-4e65-96ed-7c8f15a0a394]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.677 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd89cae-6935-486c-8c13-28e6f0edfde2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453943, 'reachable_time': 41973, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222622, 'error': None, 'target': 'ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.681 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4ae8e9ca-350e-4d38-9fe2-01d17d47544e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:59:36 compute-1 ovn_metadata_agent[104179]: 2026-01-21 23:59:36.681 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[d3241296-1034-4030-9560-8d4219b6ea67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:37 compute-1 systemd[1]: run-netns-ovnmeta\x2d4ae8e9ca\x2d350e\x2d4d38\x2d9fe2\x2d01d17d47544e.mount: Deactivated successfully.
Jan 21 23:59:37 compute-1 nova_compute[182713]: 2026-01-21 23:59:37.332 182717 DEBUG oslo_concurrency.lockutils [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:37 compute-1 nova_compute[182713]: 2026-01-21 23:59:37.932 182717 DEBUG nova.network.neutron [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updated VIF entry in instance network info cache for port 917524f9-5334-4b3d-b16f-b9686b1c3528. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:59:37 compute-1 nova_compute[182713]: 2026-01-21 23:59:37.933 182717 DEBUG nova.network.neutron [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updating instance_info_cache with network_info: [{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:37 compute-1 nova_compute[182713]: 2026-01-21 23:59:37.952 182717 DEBUG oslo_concurrency.lockutils [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:37 compute-1 nova_compute[182713]: 2026-01-21 23:59:37.953 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:37 compute-1 nova_compute[182713]: 2026-01-21 23:59:37.954 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:59:37 compute-1 nova_compute[182713]: 2026-01-21 23:59:37.955 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 30c7da24-de00-4067-a5d2-f36ad21391c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.328 182717 DEBUG nova.compute.manager [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-vif-unplugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.329 182717 DEBUG oslo_concurrency.lockutils [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.330 182717 DEBUG oslo_concurrency.lockutils [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.331 182717 DEBUG oslo_concurrency.lockutils [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.331 182717 DEBUG nova.compute.manager [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] No waiting events found dispatching network-vif-unplugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.332 182717 WARNING nova.compute.manager [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received unexpected event network-vif-unplugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 for instance with vm_state active and task_state None.
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.332 182717 DEBUG nova.compute.manager [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.333 182717 DEBUG oslo_concurrency.lockutils [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.334 182717 DEBUG oslo_concurrency.lockutils [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.335 182717 DEBUG oslo_concurrency.lockutils [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.335 182717 DEBUG nova.compute.manager [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] No waiting events found dispatching network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.336 182717 WARNING nova.compute.manager [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received unexpected event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 for instance with vm_state active and task_state None.
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.340 182717 DEBUG nova.compute.manager [req-cf35d0ef-dab0-4f4c-b4b4-ca06650b2ba0 req-65876c6a-5667-4a6c-8b87-d1f7a73b06d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-plugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.341 182717 DEBUG oslo_concurrency.lockutils [req-cf35d0ef-dab0-4f4c-b4b4-ca06650b2ba0 req-65876c6a-5667-4a6c-8b87-d1f7a73b06d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.341 182717 DEBUG oslo_concurrency.lockutils [req-cf35d0ef-dab0-4f4c-b4b4-ca06650b2ba0 req-65876c6a-5667-4a6c-8b87-d1f7a73b06d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.342 182717 DEBUG oslo_concurrency.lockutils [req-cf35d0ef-dab0-4f4c-b4b4-ca06650b2ba0 req-65876c6a-5667-4a6c-8b87-d1f7a73b06d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.342 182717 DEBUG nova.compute.manager [req-cf35d0ef-dab0-4f4c-b4b4-ca06650b2ba0 req-65876c6a-5667-4a6c-8b87-d1f7a73b06d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] No waiting events found dispatching network-vif-plugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.343 182717 WARNING nova.compute.manager [req-cf35d0ef-dab0-4f4c-b4b4-ca06650b2ba0 req-65876c6a-5667-4a6c-8b87-d1f7a73b06d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received unexpected event network-vif-plugged-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f for instance with vm_state active and task_state deleting.
Jan 21 23:59:38 compute-1 podman[222624]: 2026-01-21 23:59:38.607490054 +0000 UTC m=+0.082815476 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:59:38 compute-1 podman[222623]: 2026-01-21 23:59:38.656195196 +0000 UTC m=+0.136415448 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0)
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.682 182717 DEBUG nova.compute.manager [req-e2bac5c0-3cca-4894-a050-33274140a4fc req-a073485a-9e74-40de-aeb5-ca17181e5ef3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-plugged-a899c1d3-f433-4476-a304-705e518f0bea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.683 182717 DEBUG oslo_concurrency.lockutils [req-e2bac5c0-3cca-4894-a050-33274140a4fc req-a073485a-9e74-40de-aeb5-ca17181e5ef3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.684 182717 DEBUG oslo_concurrency.lockutils [req-e2bac5c0-3cca-4894-a050-33274140a4fc req-a073485a-9e74-40de-aeb5-ca17181e5ef3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.684 182717 DEBUG oslo_concurrency.lockutils [req-e2bac5c0-3cca-4894-a050-33274140a4fc req-a073485a-9e74-40de-aeb5-ca17181e5ef3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.684 182717 DEBUG nova.compute.manager [req-e2bac5c0-3cca-4894-a050-33274140a4fc req-a073485a-9e74-40de-aeb5-ca17181e5ef3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] No waiting events found dispatching network-vif-plugged-a899c1d3-f433-4476-a304-705e518f0bea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.685 182717 WARNING nova.compute.manager [req-e2bac5c0-3cca-4894-a050-33274140a4fc req-a073485a-9e74-40de-aeb5-ca17181e5ef3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received unexpected event network-vif-plugged-a899c1d3-f433-4476-a304-705e518f0bea for instance with vm_state active and task_state deleting.
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.756 182717 DEBUG nova.compute.manager [req-52edc7b0-4927-4834-b3d4-38fce067193e req-5fbe1133-bc97-4a3b-b322-029bd0bd4b8a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-plugged-da0329e1-27a9-4900-91e5-aff8efb5d066 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.757 182717 DEBUG oslo_concurrency.lockutils [req-52edc7b0-4927-4834-b3d4-38fce067193e req-5fbe1133-bc97-4a3b-b322-029bd0bd4b8a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.757 182717 DEBUG oslo_concurrency.lockutils [req-52edc7b0-4927-4834-b3d4-38fce067193e req-5fbe1133-bc97-4a3b-b322-029bd0bd4b8a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.758 182717 DEBUG oslo_concurrency.lockutils [req-52edc7b0-4927-4834-b3d4-38fce067193e req-5fbe1133-bc97-4a3b-b322-029bd0bd4b8a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.758 182717 DEBUG nova.compute.manager [req-52edc7b0-4927-4834-b3d4-38fce067193e req-5fbe1133-bc97-4a3b-b322-029bd0bd4b8a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] No waiting events found dispatching network-vif-plugged-da0329e1-27a9-4900-91e5-aff8efb5d066 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.759 182717 WARNING nova.compute.manager [req-52edc7b0-4927-4834-b3d4-38fce067193e req-5fbe1133-bc97-4a3b-b322-029bd0bd4b8a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received unexpected event network-vif-plugged-da0329e1-27a9-4900-91e5-aff8efb5d066 for instance with vm_state active and task_state deleting.
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.759 182717 DEBUG nova.compute.manager [req-52edc7b0-4927-4834-b3d4-38fce067193e req-5fbe1133-bc97-4a3b-b322-029bd0bd4b8a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-deleted-9330cb5d-3c3f-499b-9d0c-ddc0fee6838f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.759 182717 INFO nova.compute.manager [req-52edc7b0-4927-4834-b3d4-38fce067193e req-5fbe1133-bc97-4a3b-b322-029bd0bd4b8a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Neutron deleted interface 9330cb5d-3c3f-499b-9d0c-ddc0fee6838f; detaching it from the instance and deleting it from the info cache
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.760 182717 DEBUG nova.network.neutron [req-52edc7b0-4927-4834-b3d4-38fce067193e req-5fbe1133-bc97-4a3b-b322-029bd0bd4b8a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Updating instance_info_cache with network_info: [{"id": "da0329e1-27a9-4900-91e5-aff8efb5d066", "address": "fa:16:3e:66:b9:f9", "network": {"id": "54a6b0a6-096e-4f61-a504-b2a9810b3844", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-335695391", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.127", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0329e1-27", "ovs_interfaceid": "da0329e1-27a9-4900-91e5-aff8efb5d066", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a899c1d3-f433-4476-a304-705e518f0bea", "address": "fa:16:3e:e4:a0:ba", "network": {"id": "4ae8e9ca-350e-4d38-9fe2-01d17d47544e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1174472241", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa899c1d3-f4", "ovs_interfaceid": "a899c1d3-f433-4476-a304-705e518f0bea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:38 compute-1 nova_compute[182713]: 2026-01-21 23:59:38.785 182717 DEBUG nova.compute.manager [req-52edc7b0-4927-4834-b3d4-38fce067193e req-5fbe1133-bc97-4a3b-b322-029bd0bd4b8a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Detach interface failed, port_id=9330cb5d-3c3f-499b-9d0c-ddc0fee6838f, reason: Instance be4dacee-6b35-4e82-ba71-d6e8b745dfa8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.449 182717 DEBUG nova.network.neutron [-] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.486 182717 INFO nova.compute.manager [-] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Took 3.26 seconds to deallocate network for instance.
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.592 182717 DEBUG oslo_concurrency.lockutils [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.593 182717 DEBUG oslo_concurrency.lockutils [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.688 182717 DEBUG nova.compute.provider_tree [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.713 182717 DEBUG nova.scheduler.client.report [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.735 182717 DEBUG oslo_concurrency.lockutils [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.762 182717 INFO nova.scheduler.client.report [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Deleted allocations for instance be4dacee-6b35-4e82-ba71-d6e8b745dfa8
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.811 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updating instance_info_cache with network_info: [{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.839 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.839 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.840 182717 DEBUG oslo_concurrency.lockutils [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.840 182717 DEBUG nova.network.neutron [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:59:39 compute-1 nova_compute[182713]: 2026-01-21 23:59:39.849 182717 DEBUG oslo_concurrency.lockutils [None req-c00e7618-8e30-4150-aa47-e8bf6705559e 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "be4dacee-6b35-4e82-ba71-d6e8b745dfa8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:40 compute-1 nova_compute[182713]: 2026-01-21 23:59:40.897 182717 DEBUG nova.compute.manager [req-4ac2eac9-e312-49c1-87e6-e32a30e1bd08 req-21523be9-c211-4634-9213-eebf22b38c43 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-deleted-a899c1d3-f433-4476-a304-705e518f0bea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:40 compute-1 nova_compute[182713]: 2026-01-21 23:59:40.897 182717 DEBUG nova.compute.manager [req-4ac2eac9-e312-49c1-87e6-e32a30e1bd08 req-21523be9-c211-4634-9213-eebf22b38c43 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Received event network-vif-deleted-da0329e1-27a9-4900-91e5-aff8efb5d066 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:41 compute-1 nova_compute[182713]: 2026-01-21 23:59:41.075 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:41 compute-1 nova_compute[182713]: 2026-01-21 23:59:41.222 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:41 compute-1 nova_compute[182713]: 2026-01-21 23:59:41.236 182717 DEBUG nova.compute.manager [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-changed-917524f9-5334-4b3d-b16f-b9686b1c3528 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:41 compute-1 nova_compute[182713]: 2026-01-21 23:59:41.237 182717 DEBUG nova.compute.manager [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing instance network info cache due to event network-changed-917524f9-5334-4b3d-b16f-b9686b1c3528. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:59:41 compute-1 nova_compute[182713]: 2026-01-21 23:59:41.237 182717 DEBUG oslo_concurrency.lockutils [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:42 compute-1 nova_compute[182713]: 2026-01-21 23:59:42.693 182717 DEBUG nova.network.neutron [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updating instance_info_cache with network_info: [{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:42 compute-1 nova_compute[182713]: 2026-01-21 23:59:42.718 182717 DEBUG oslo_concurrency.lockutils [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:42 compute-1 nova_compute[182713]: 2026-01-21 23:59:42.720 182717 DEBUG oslo_concurrency.lockutils [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:42 compute-1 nova_compute[182713]: 2026-01-21 23:59:42.721 182717 DEBUG nova.network.neutron [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Refreshing network info cache for port 917524f9-5334-4b3d-b16f-b9686b1c3528 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:59:42 compute-1 nova_compute[182713]: 2026-01-21 23:59:42.766 182717 DEBUG oslo_concurrency.lockutils [None req-779919fd-72c7-4671-9e60-20c79e8542dc 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-30c7da24-de00-4067-a5d2-f36ad21391c5-c596dfbe-ce59-4ab9-8cad-bd4144812420" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 11.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:42 compute-1 nova_compute[182713]: 2026-01-21 23:59:42.837 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:43 compute-1 ovn_controller[94841]: 2026-01-21T23:59:43Z|00304|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 23:59:43 compute-1 nova_compute[182713]: 2026-01-21 23:59:43.835 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:44 compute-1 podman[222672]: 2026-01-21 23:59:44.598004544 +0000 UTC m=+0.074934592 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:59:44 compute-1 podman[222671]: 2026-01-21 23:59:44.606275559 +0000 UTC m=+0.088062607 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:59:45 compute-1 nova_compute[182713]: 2026-01-21 23:59:45.645 182717 DEBUG nova.network.neutron [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updated VIF entry in instance network info cache for port 917524f9-5334-4b3d-b16f-b9686b1c3528. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:59:45 compute-1 nova_compute[182713]: 2026-01-21 23:59:45.646 182717 DEBUG nova.network.neutron [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updating instance_info_cache with network_info: [{"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:45 compute-1 nova_compute[182713]: 2026-01-21 23:59:45.671 182717 DEBUG oslo_concurrency.lockutils [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30c7da24-de00-4067-a5d2-f36ad21391c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:46 compute-1 nova_compute[182713]: 2026-01-21 23:59:46.078 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:46 compute-1 nova_compute[182713]: 2026-01-21 23:59:46.225 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:50 compute-1 sshd-session[222715]: Invalid user ubuntu from 38.67.240.124 port 46527
Jan 21 23:59:50 compute-1 sshd-session[222715]: Received disconnect from 38.67.240.124 port 46527:11:  [preauth]
Jan 21 23:59:50 compute-1 sshd-session[222715]: Disconnected from invalid user ubuntu 38.67.240.124 port 46527 [preauth]
Jan 21 23:59:51 compute-1 nova_compute[182713]: 2026-01-21 23:59:51.004 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039976.0016046, be4dacee-6b35-4e82-ba71-d6e8b745dfa8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:51 compute-1 nova_compute[182713]: 2026-01-21 23:59:51.005 182717 INFO nova.compute.manager [-] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] VM Stopped (Lifecycle Event)
Jan 21 23:59:51 compute-1 nova_compute[182713]: 2026-01-21 23:59:51.029 182717 DEBUG nova.compute.manager [None req-aef04659-e903-40d2-b37a-0673ad657e04 - - - - - -] [instance: be4dacee-6b35-4e82-ba71-d6e8b745dfa8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:51 compute-1 nova_compute[182713]: 2026-01-21 23:59:51.081 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:51 compute-1 nova_compute[182713]: 2026-01-21 23:59:51.228 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:51 compute-1 ovn_controller[94841]: 2026-01-21T23:59:51Z|00305|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 23:59:51 compute-1 nova_compute[182713]: 2026-01-21 23:59:51.329 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:56 compute-1 nova_compute[182713]: 2026-01-21 23:59:56.084 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:56 compute-1 nova_compute[182713]: 2026-01-21 23:59:56.231 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:56 compute-1 podman[222718]: 2026-01-21 23:59:56.597779033 +0000 UTC m=+0.079992988 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.134 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.134 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.163 182717 DEBUG nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.276 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.277 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.287 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.288 182717 INFO nova.compute.claims [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Claim successful on node compute-1.ctlplane.example.com
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.437 182717 DEBUG nova.compute.provider_tree [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.462 182717 DEBUG nova.scheduler.client.report [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.491 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.492 182717 DEBUG nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.561 182717 DEBUG nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.562 182717 DEBUG nova.network.neutron [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.590 182717 INFO nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.617 182717 DEBUG nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.748 182717 DEBUG nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.750 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.751 182717 INFO nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Creating image(s)
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.752 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "/var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.752 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "/var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.753 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "/var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.780 182717 DEBUG nova.policy [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1931e691804246e3bb3ac03a95a74d93', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.784 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.880 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.881 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.882 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.897 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.965 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:57 compute-1 nova_compute[182713]: 2026-01-21 23:59:57.966 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.005 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.007 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.009 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.093 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.094 182717 DEBUG nova.virt.disk.api [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Checking if we can resize image /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.095 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.156 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.159 182717 DEBUG nova.virt.disk.api [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Cannot resize image /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.160 182717 DEBUG nova.objects.instance [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 6888ddb7-c373-48a1-bc4c-7e1c44cece29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.182 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.183 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Ensure instance console log exists: /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.183 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.184 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.185 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:58 compute-1 nova_compute[182713]: 2026-01-21 23:59:58.444 182717 DEBUG nova.network.neutron [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Successfully created port: 21117406-f212-49b3-a849-2a3d9a544b64 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:59:58 compute-1 podman[222754]: 2026-01-21 23:59:58.616229523 +0000 UTC m=+0.100785349 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 23:59:59 compute-1 nova_compute[182713]: 2026-01-21 23:59:59.028 182717 DEBUG nova.network.neutron [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Successfully created port: 5aa7b353-4b8f-49f4-b051-3e3f1b1135af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:59:59 compute-1 nova_compute[182713]: 2026-01-21 23:59:59.954 182717 DEBUG oslo_concurrency.lockutils [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:59 compute-1 nova_compute[182713]: 2026-01-21 23:59:59.955 182717 DEBUG oslo_concurrency.lockutils [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:59 compute-1 nova_compute[182713]: 2026-01-21 23:59:59.955 182717 DEBUG oslo_concurrency.lockutils [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:59 compute-1 nova_compute[182713]: 2026-01-21 23:59:59.955 182717 DEBUG oslo_concurrency.lockutils [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:59 compute-1 nova_compute[182713]: 2026-01-21 23:59:59.956 182717 DEBUG oslo_concurrency.lockutils [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:59 compute-1 nova_compute[182713]: 2026-01-21 23:59:59.967 182717 INFO nova.compute.manager [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Terminating instance
Jan 21 23:59:59 compute-1 nova_compute[182713]: 2026-01-21 23:59:59.975 182717 DEBUG nova.compute.manager [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:00:00 compute-1 kernel: tap917524f9-53 (unregistering): left promiscuous mode
Jan 22 00:00:00 compute-1 NetworkManager[54952]: <info>  [1769040000.0090] device (tap917524f9-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:00:00 compute-1 ovn_controller[94841]: 2026-01-22T00:00:00Z|00306|binding|INFO|Releasing lport 917524f9-5334-4b3d-b16f-b9686b1c3528 from this chassis (sb_readonly=0)
Jan 22 00:00:00 compute-1 ovn_controller[94841]: 2026-01-22T00:00:00Z|00307|binding|INFO|Setting lport 917524f9-5334-4b3d-b16f-b9686b1c3528 down in Southbound
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.017 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:00 compute-1 ovn_controller[94841]: 2026-01-22T00:00:00Z|00308|binding|INFO|Removing iface tap917524f9-53 ovn-installed in OVS
Jan 22 00:00:00 compute-1 systemd[1]: Starting update of the root trust anchor for DNSSEC validation in unbound...
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.029 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:88:c9 10.100.0.6'], port_security=['fa:16:3e:80:88:c9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '30c7da24-de00-4067-a5d2-f36ad21391c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e3b7d6e-99c3-4bed-a6db-24cc4d63ab1a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=917524f9-5334-4b3d-b16f-b9686b1c3528) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.031 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 917524f9-5334-4b3d-b16f-b9686b1c3528 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 unbound from our chassis
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.034 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1995baab-0f8d-4658-a4fc-2d21868dc592, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:00:00 compute-1 systemd[1]: Starting Rotate log files...
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.036 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.037 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[27cac3aa-15cd-4bf4-9cc9-d4846807f440]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.038 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 namespace which is not needed anymore
Jan 22 00:00:00 compute-1 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Jan 22 00:00:00 compute-1 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000004b.scope: Consumed 16.880s CPU time.
Jan 22 00:00:00 compute-1 systemd-machined[153970]: Machine qemu-35-instance-0000004b terminated.
Jan 22 00:00:00 compute-1 systemd[1]: unbound-anchor.service: Deactivated successfully.
Jan 22 00:00:00 compute-1 systemd[1]: Finished update of the root trust anchor for DNSSEC validation in unbound.
Jan 22 00:00:00 compute-1 systemd[1]: logrotate.service: Deactivated successfully.
Jan 22 00:00:00 compute-1 systemd[1]: Finished Rotate log files.
Jan 22 00:00:00 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[221945]: [NOTICE]   (221955) : haproxy version is 2.8.14-c23fe91
Jan 22 00:00:00 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[221945]: [NOTICE]   (221955) : path to executable is /usr/sbin/haproxy
Jan 22 00:00:00 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[221945]: [WARNING]  (221955) : Exiting Master process...
Jan 22 00:00:00 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[221945]: [ALERT]    (221955) : Current worker (221958) exited with code 143 (Terminated)
Jan 22 00:00:00 compute-1 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[221945]: [WARNING]  (221955) : All workers exited. Exiting... (0)
Jan 22 00:00:00 compute-1 systemd[1]: libpod-0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049.scope: Deactivated successfully.
Jan 22 00:00:00 compute-1 podman[222803]: 2026-01-22 00:00:00.190649549 +0000 UTC m=+0.047879268 container died 0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.207 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.211 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:00 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049-userdata-shm.mount: Deactivated successfully.
Jan 22 00:00:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-bc46e15cd0b2f931f3a7e8961d5de61db6bceb98197b9360ca68c09e4862b55a-merged.mount: Deactivated successfully.
Jan 22 00:00:00 compute-1 podman[222803]: 2026-01-22 00:00:00.241538209 +0000 UTC m=+0.098767918 container cleanup 0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:00:00 compute-1 systemd[1]: libpod-conmon-0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049.scope: Deactivated successfully.
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.251 182717 INFO nova.virt.libvirt.driver [-] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Instance destroyed successfully.
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.252 182717 DEBUG nova.objects.instance [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'resources' on Instance uuid 30c7da24-de00-4067-a5d2-f36ad21391c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.272 182717 DEBUG nova.virt.libvirt.vif [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-821479065',display_name='tempest-tempest.common.compute-instance-821479065',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-821479065',id=75,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:58:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-tqp0ax95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30c7da24-de00-4067-a5d2-f36ad21391c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.273 182717 DEBUG nova.network.os_vif_util [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "917524f9-5334-4b3d-b16f-b9686b1c3528", "address": "fa:16:3e:80:88:c9", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap917524f9-53", "ovs_interfaceid": "917524f9-5334-4b3d-b16f-b9686b1c3528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.274 182717 DEBUG nova.network.os_vif_util [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:80:88:c9,bridge_name='br-int',has_traffic_filtering=True,id=917524f9-5334-4b3d-b16f-b9686b1c3528,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap917524f9-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.274 182717 DEBUG os_vif [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:88:c9,bridge_name='br-int',has_traffic_filtering=True,id=917524f9-5334-4b3d-b16f-b9686b1c3528,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap917524f9-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.278 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.279 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap917524f9-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.282 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.283 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.291 182717 INFO os_vif [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:88:c9,bridge_name='br-int',has_traffic_filtering=True,id=917524f9-5334-4b3d-b16f-b9686b1c3528,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap917524f9-53')
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.292 182717 INFO nova.virt.libvirt.driver [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Deleting instance files /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5_del
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.294 182717 INFO nova.virt.libvirt.driver [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Deletion of /var/lib/nova/instances/30c7da24-de00-4067-a5d2-f36ad21391c5_del complete
Jan 22 00:00:00 compute-1 podman[222851]: 2026-01-22 00:00:00.312154307 +0000 UTC m=+0.045612548 container remove 0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.318 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[21a27c20-d353-4c94-999a-71789377a615]: (4, ('Thu Jan 22 12:00:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 (0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049)\n0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049\nThu Jan 22 12:00:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 (0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049)\n0529845bf52fb7ae33d1a4cc1e07df4659f72766a1b5a82fbc955311829a6049\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.320 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4932f6-db73-46b3-b74f-a96605023af0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.321 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.323 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:00 compute-1 kernel: tap1995baab-00: left promiscuous mode
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.326 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.328 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[735349b1-1ae7-4e6b-8a6c-e8766c98189f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.336 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.345 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0c00cc89-94c2-46c3-9966-2ad82db82352]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.346 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9374f4ce-3055-4a2e-8c51-ab4232b8e26c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.362 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ca8a530e-eb37-481b-a1c5-04040aaebc69]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450435, 'reachable_time': 23019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222866, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.364 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:00:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:00.365 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d02e5e-1cb4-4373-86b0-9b8c74f469fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:00 compute-1 systemd[1]: run-netns-ovnmeta\x2d1995baab\x2d0f8d\x2d4658\x2da4fc\x2d2d21868dc592.mount: Deactivated successfully.
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.393 182717 DEBUG nova.network.neutron [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Successfully updated port: 21117406-f212-49b3-a849-2a3d9a544b64 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.470 182717 INFO nova.compute.manager [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Took 0.49 seconds to destroy the instance on the hypervisor.
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.471 182717 DEBUG oslo.service.loopingcall [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.471 182717 DEBUG nova.compute.manager [-] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.472 182717 DEBUG nova.network.neutron [-] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.572 182717 DEBUG nova.compute.manager [req-2506f4cb-5cdc-448e-865c-43ef9f2a4a1c req-4738c116-77e0-4bba-b764-7ef22d127007 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-changed-21117406-f212-49b3-a849-2a3d9a544b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.572 182717 DEBUG nova.compute.manager [req-2506f4cb-5cdc-448e-865c-43ef9f2a4a1c req-4738c116-77e0-4bba-b764-7ef22d127007 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Refreshing instance network info cache due to event network-changed-21117406-f212-49b3-a849-2a3d9a544b64. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.573 182717 DEBUG oslo_concurrency.lockutils [req-2506f4cb-5cdc-448e-865c-43ef9f2a4a1c req-4738c116-77e0-4bba-b764-7ef22d127007 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6888ddb7-c373-48a1-bc4c-7e1c44cece29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.573 182717 DEBUG oslo_concurrency.lockutils [req-2506f4cb-5cdc-448e-865c-43ef9f2a4a1c req-4738c116-77e0-4bba-b764-7ef22d127007 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6888ddb7-c373-48a1-bc4c-7e1c44cece29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.573 182717 DEBUG nova.network.neutron [req-2506f4cb-5cdc-448e-865c-43ef9f2a4a1c req-4738c116-77e0-4bba-b764-7ef22d127007 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Refreshing network info cache for port 21117406-f212-49b3-a849-2a3d9a544b64 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.708 182717 DEBUG nova.compute.manager [req-b6a2266a-766c-456a-9bb0-f453980710e2 req-45a3ca84-c51c-45c7-892b-5ecba78ff231 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-vif-unplugged-917524f9-5334-4b3d-b16f-b9686b1c3528 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.709 182717 DEBUG oslo_concurrency.lockutils [req-b6a2266a-766c-456a-9bb0-f453980710e2 req-45a3ca84-c51c-45c7-892b-5ecba78ff231 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.709 182717 DEBUG oslo_concurrency.lockutils [req-b6a2266a-766c-456a-9bb0-f453980710e2 req-45a3ca84-c51c-45c7-892b-5ecba78ff231 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.709 182717 DEBUG oslo_concurrency.lockutils [req-b6a2266a-766c-456a-9bb0-f453980710e2 req-45a3ca84-c51c-45c7-892b-5ecba78ff231 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.710 182717 DEBUG nova.compute.manager [req-b6a2266a-766c-456a-9bb0-f453980710e2 req-45a3ca84-c51c-45c7-892b-5ecba78ff231 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] No waiting events found dispatching network-vif-unplugged-917524f9-5334-4b3d-b16f-b9686b1c3528 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:00 compute-1 nova_compute[182713]: 2026-01-22 00:00:00.710 182717 DEBUG nova.compute.manager [req-b6a2266a-766c-456a-9bb0-f453980710e2 req-45a3ca84-c51c-45c7-892b-5ecba78ff231 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-vif-unplugged-917524f9-5334-4b3d-b16f-b9686b1c3528 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:00:01 compute-1 nova_compute[182713]: 2026-01-22 00:00:01.233 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:01 compute-1 nova_compute[182713]: 2026-01-22 00:00:01.345 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:01 compute-1 nova_compute[182713]: 2026-01-22 00:00:01.367 182717 DEBUG nova.network.neutron [req-2506f4cb-5cdc-448e-865c-43ef9f2a4a1c req-4738c116-77e0-4bba-b764-7ef22d127007 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:00:01 compute-1 nova_compute[182713]: 2026-01-22 00:00:01.987 182717 DEBUG nova.network.neutron [req-2506f4cb-5cdc-448e-865c-43ef9f2a4a1c req-4738c116-77e0-4bba-b764-7ef22d127007 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.023 182717 DEBUG oslo_concurrency.lockutils [req-2506f4cb-5cdc-448e-865c-43ef9f2a4a1c req-4738c116-77e0-4bba-b764-7ef22d127007 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6888ddb7-c373-48a1-bc4c-7e1c44cece29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.373 182717 DEBUG nova.network.neutron [-] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.405 182717 INFO nova.compute.manager [-] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Took 1.93 seconds to deallocate network for instance.
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.466 182717 DEBUG nova.network.neutron [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Successfully updated port: 5aa7b353-4b8f-49f4-b051-3e3f1b1135af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.541 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "refresh_cache-6888ddb7-c373-48a1-bc4c-7e1c44cece29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.542 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquired lock "refresh_cache-6888ddb7-c373-48a1-bc4c-7e1c44cece29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.542 182717 DEBUG nova.network.neutron [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.553 182717 DEBUG oslo_concurrency.lockutils [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.554 182717 DEBUG oslo_concurrency.lockutils [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.709 182717 DEBUG nova.compute.provider_tree [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.803 182717 DEBUG nova.scheduler.client.report [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.850 182717 DEBUG oslo_concurrency.lockutils [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.942 182717 DEBUG nova.compute.manager [req-098484b5-bf3e-48a6-aa5b-9165ee5d1588 req-31275f2e-30e5-4712-baad-76b47b133327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-changed-5aa7b353-4b8f-49f4-b051-3e3f1b1135af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.942 182717 DEBUG nova.compute.manager [req-098484b5-bf3e-48a6-aa5b-9165ee5d1588 req-31275f2e-30e5-4712-baad-76b47b133327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Refreshing instance network info cache due to event network-changed-5aa7b353-4b8f-49f4-b051-3e3f1b1135af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.943 182717 DEBUG oslo_concurrency.lockutils [req-098484b5-bf3e-48a6-aa5b-9165ee5d1588 req-31275f2e-30e5-4712-baad-76b47b133327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6888ddb7-c373-48a1-bc4c-7e1c44cece29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:00:02 compute-1 nova_compute[182713]: 2026-01-22 00:00:02.944 182717 INFO nova.scheduler.client.report [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Deleted allocations for instance 30c7da24-de00-4067-a5d2-f36ad21391c5
Jan 22 00:00:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:03.007 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:03.007 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:03.008 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:03 compute-1 nova_compute[182713]: 2026-01-22 00:00:03.050 182717 DEBUG oslo_concurrency.lockutils [None req-dc07ab6c-5c53-40fb-9cbe-920ca2621cb0 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:03 compute-1 nova_compute[182713]: 2026-01-22 00:00:03.074 182717 DEBUG nova.network.neutron [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:00:03 compute-1 nova_compute[182713]: 2026-01-22 00:00:03.315 182717 DEBUG nova.compute.manager [req-e9cf25b8-f611-4299-b6f2-e500fbfd18a7 req-658b3464-49e3-4ab8-83d0-12547a6dff3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-vif-plugged-917524f9-5334-4b3d-b16f-b9686b1c3528 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:03 compute-1 nova_compute[182713]: 2026-01-22 00:00:03.316 182717 DEBUG oslo_concurrency.lockutils [req-e9cf25b8-f611-4299-b6f2-e500fbfd18a7 req-658b3464-49e3-4ab8-83d0-12547a6dff3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:03 compute-1 nova_compute[182713]: 2026-01-22 00:00:03.317 182717 DEBUG oslo_concurrency.lockutils [req-e9cf25b8-f611-4299-b6f2-e500fbfd18a7 req-658b3464-49e3-4ab8-83d0-12547a6dff3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:03 compute-1 nova_compute[182713]: 2026-01-22 00:00:03.317 182717 DEBUG oslo_concurrency.lockutils [req-e9cf25b8-f611-4299-b6f2-e500fbfd18a7 req-658b3464-49e3-4ab8-83d0-12547a6dff3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30c7da24-de00-4067-a5d2-f36ad21391c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:03 compute-1 nova_compute[182713]: 2026-01-22 00:00:03.318 182717 DEBUG nova.compute.manager [req-e9cf25b8-f611-4299-b6f2-e500fbfd18a7 req-658b3464-49e3-4ab8-83d0-12547a6dff3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] No waiting events found dispatching network-vif-plugged-917524f9-5334-4b3d-b16f-b9686b1c3528 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:03 compute-1 nova_compute[182713]: 2026-01-22 00:00:03.318 182717 WARNING nova.compute.manager [req-e9cf25b8-f611-4299-b6f2-e500fbfd18a7 req-658b3464-49e3-4ab8-83d0-12547a6dff3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received unexpected event network-vif-plugged-917524f9-5334-4b3d-b16f-b9686b1c3528 for instance with vm_state deleted and task_state None.
Jan 22 00:00:03 compute-1 nova_compute[182713]: 2026-01-22 00:00:03.318 182717 DEBUG nova.compute.manager [req-e9cf25b8-f611-4299-b6f2-e500fbfd18a7 req-658b3464-49e3-4ab8-83d0-12547a6dff3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Received event network-vif-deleted-917524f9-5334-4b3d-b16f-b9686b1c3528 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:05 compute-1 nova_compute[182713]: 2026-01-22 00:00:05.324 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:05 compute-1 nova_compute[182713]: 2026-01-22 00:00:05.912 182717 DEBUG nova.network.neutron [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Updating instance_info_cache with network_info: [{"id": "21117406-f212-49b3-a849-2a3d9a544b64", "address": "fa:16:3e:35:38:fd", "network": {"id": "88446c98-877a-4464-946b-7d73337856db", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1586215748", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21117406-f2", "ovs_interfaceid": "21117406-f212-49b3-a849-2a3d9a544b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "address": "fa:16:3e:ff:7a:ac", "network": {"id": "563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1652397250", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa7b353-4b", "ovs_interfaceid": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:00:05 compute-1 nova_compute[182713]: 2026-01-22 00:00:05.958 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Releasing lock "refresh_cache-6888ddb7-c373-48a1-bc4c-7e1c44cece29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:00:05 compute-1 nova_compute[182713]: 2026-01-22 00:00:05.959 182717 DEBUG nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Instance network_info: |[{"id": "21117406-f212-49b3-a849-2a3d9a544b64", "address": "fa:16:3e:35:38:fd", "network": {"id": "88446c98-877a-4464-946b-7d73337856db", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1586215748", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21117406-f2", "ovs_interfaceid": "21117406-f212-49b3-a849-2a3d9a544b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "address": "fa:16:3e:ff:7a:ac", "network": {"id": "563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1652397250", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa7b353-4b", "ovs_interfaceid": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:00:05 compute-1 nova_compute[182713]: 2026-01-22 00:00:05.960 182717 DEBUG oslo_concurrency.lockutils [req-098484b5-bf3e-48a6-aa5b-9165ee5d1588 req-31275f2e-30e5-4712-baad-76b47b133327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6888ddb7-c373-48a1-bc4c-7e1c44cece29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:00:05 compute-1 nova_compute[182713]: 2026-01-22 00:00:05.960 182717 DEBUG nova.network.neutron [req-098484b5-bf3e-48a6-aa5b-9165ee5d1588 req-31275f2e-30e5-4712-baad-76b47b133327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Refreshing network info cache for port 5aa7b353-4b8f-49f4-b051-3e3f1b1135af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:00:05 compute-1 nova_compute[182713]: 2026-01-22 00:00:05.968 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Start _get_guest_xml network_info=[{"id": "21117406-f212-49b3-a849-2a3d9a544b64", "address": "fa:16:3e:35:38:fd", "network": {"id": "88446c98-877a-4464-946b-7d73337856db", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1586215748", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21117406-f2", "ovs_interfaceid": "21117406-f212-49b3-a849-2a3d9a544b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "address": "fa:16:3e:ff:7a:ac", "network": {"id": "563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1652397250", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa7b353-4b", "ovs_interfaceid": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:00:05 compute-1 nova_compute[182713]: 2026-01-22 00:00:05.973 182717 WARNING nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:00:05 compute-1 nova_compute[182713]: 2026-01-22 00:00:05.981 182717 DEBUG nova.virt.libvirt.host [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:00:05 compute-1 nova_compute[182713]: 2026-01-22 00:00:05.983 182717 DEBUG nova.virt.libvirt.host [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.009 182717 DEBUG nova.virt.libvirt.host [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.011 182717 DEBUG nova.virt.libvirt.host [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.013 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.014 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.014 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.015 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.015 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.016 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.016 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.017 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.017 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.018 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.018 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.019 182717 DEBUG nova.virt.hardware [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.025 182717 DEBUG nova.virt.libvirt.vif [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-718536947',display_name='tempest-ServersTestMultiNic-server-718536947',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-718536947',id=80,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-kh5n56k6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:57Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=6888ddb7-c373-48a1-bc4c-7e1c44cece29,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21117406-f212-49b3-a849-2a3d9a544b64", "address": "fa:16:3e:35:38:fd", "network": {"id": "88446c98-877a-4464-946b-7d73337856db", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1586215748", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21117406-f2", "ovs_interfaceid": "21117406-f212-49b3-a849-2a3d9a544b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.025 182717 DEBUG nova.network.os_vif_util [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "21117406-f212-49b3-a849-2a3d9a544b64", "address": "fa:16:3e:35:38:fd", "network": {"id": "88446c98-877a-4464-946b-7d73337856db", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1586215748", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21117406-f2", "ovs_interfaceid": "21117406-f212-49b3-a849-2a3d9a544b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.027 182717 DEBUG nova.network.os_vif_util [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=21117406-f212-49b3-a849-2a3d9a544b64,network=Network(88446c98-877a-4464-946b-7d73337856db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21117406-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.028 182717 DEBUG nova.virt.libvirt.vif [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-718536947',display_name='tempest-ServersTestMultiNic-server-718536947',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-718536947',id=80,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-kh5n56k6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:57Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=6888ddb7-c373-48a1-bc4c-7e1c44cece29,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "address": "fa:16:3e:ff:7a:ac", "network": {"id": "563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1652397250", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa7b353-4b", "ovs_interfaceid": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.028 182717 DEBUG nova.network.os_vif_util [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "address": "fa:16:3e:ff:7a:ac", "network": {"id": "563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1652397250", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa7b353-4b", "ovs_interfaceid": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.029 182717 DEBUG nova.network.os_vif_util [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:7a:ac,bridge_name='br-int',has_traffic_filtering=True,id=5aa7b353-4b8f-49f4-b051-3e3f1b1135af,network=Network(563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa7b353-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.030 182717 DEBUG nova.objects.instance [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6888ddb7-c373-48a1-bc4c-7e1c44cece29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.059 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:00:06 compute-1 nova_compute[182713]:   <uuid>6888ddb7-c373-48a1-bc4c-7e1c44cece29</uuid>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   <name>instance-00000050</name>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <nova:name>tempest-ServersTestMultiNic-server-718536947</nova:name>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:00:05</nova:creationTime>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:00:06 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:00:06 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:00:06 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:00:06 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:00:06 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:00:06 compute-1 nova_compute[182713]:         <nova:user uuid="1931e691804246e3bb3ac03a95a74d93">tempest-ServersTestMultiNic-672631386-project-member</nova:user>
Jan 22 00:00:06 compute-1 nova_compute[182713]:         <nova:project uuid="975703700f9d42c5a1daa32f5e61f6f2">tempest-ServersTestMultiNic-672631386</nova:project>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:00:06 compute-1 nova_compute[182713]:         <nova:port uuid="21117406-f212-49b3-a849-2a3d9a544b64">
Jan 22 00:00:06 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.157" ipVersion="4"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:00:06 compute-1 nova_compute[182713]:         <nova:port uuid="5aa7b353-4b8f-49f4-b051-3e3f1b1135af">
Jan 22 00:00:06 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.1.43" ipVersion="4"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <system>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <entry name="serial">6888ddb7-c373-48a1-bc4c-7e1c44cece29</entry>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <entry name="uuid">6888ddb7-c373-48a1-bc4c-7e1c44cece29</entry>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     </system>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   <os>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   </os>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   <features>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   </features>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk.config"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:35:38:fd"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <target dev="tap21117406-f2"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:ff:7a:ac"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <target dev="tap5aa7b353-4b"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/console.log" append="off"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <video>
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     </video>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:00:06 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:00:06 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:00:06 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:00:06 compute-1 nova_compute[182713]: </domain>
Jan 22 00:00:06 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.061 182717 DEBUG nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Preparing to wait for external event network-vif-plugged-21117406-f212-49b3-a849-2a3d9a544b64 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.062 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.062 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.062 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.063 182717 DEBUG nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Preparing to wait for external event network-vif-plugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.063 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.063 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.064 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.064 182717 DEBUG nova.virt.libvirt.vif [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-718536947',display_name='tempest-ServersTestMultiNic-server-718536947',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-718536947',id=80,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-kh5n56k6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:57Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=6888ddb7-c373-48a1-bc4c-7e1c44cece29,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21117406-f212-49b3-a849-2a3d9a544b64", "address": "fa:16:3e:35:38:fd", "network": {"id": "88446c98-877a-4464-946b-7d73337856db", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1586215748", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21117406-f2", "ovs_interfaceid": "21117406-f212-49b3-a849-2a3d9a544b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.065 182717 DEBUG nova.network.os_vif_util [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "21117406-f212-49b3-a849-2a3d9a544b64", "address": "fa:16:3e:35:38:fd", "network": {"id": "88446c98-877a-4464-946b-7d73337856db", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1586215748", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21117406-f2", "ovs_interfaceid": "21117406-f212-49b3-a849-2a3d9a544b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.065 182717 DEBUG nova.network.os_vif_util [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=21117406-f212-49b3-a849-2a3d9a544b64,network=Network(88446c98-877a-4464-946b-7d73337856db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21117406-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.066 182717 DEBUG os_vif [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=21117406-f212-49b3-a849-2a3d9a544b64,network=Network(88446c98-877a-4464-946b-7d73337856db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21117406-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.066 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.067 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.067 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.070 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.071 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap21117406-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.071 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap21117406-f2, col_values=(('external_ids', {'iface-id': '21117406-f212-49b3-a849-2a3d9a544b64', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:38:fd', 'vm-uuid': '6888ddb7-c373-48a1-bc4c-7e1c44cece29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.073 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-1 NetworkManager[54952]: <info>  [1769040006.0749] manager: (tap21117406-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.076 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.080 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.081 182717 INFO os_vif [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=21117406-f212-49b3-a849-2a3d9a544b64,network=Network(88446c98-877a-4464-946b-7d73337856db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21117406-f2')
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.082 182717 DEBUG nova.virt.libvirt.vif [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-718536947',display_name='tempest-ServersTestMultiNic-server-718536947',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-718536947',id=80,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-kh5n56k6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:57Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=6888ddb7-c373-48a1-bc4c-7e1c44cece29,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "address": "fa:16:3e:ff:7a:ac", "network": {"id": "563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1652397250", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa7b353-4b", "ovs_interfaceid": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.082 182717 DEBUG nova.network.os_vif_util [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "address": "fa:16:3e:ff:7a:ac", "network": {"id": "563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1652397250", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa7b353-4b", "ovs_interfaceid": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.083 182717 DEBUG nova.network.os_vif_util [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:7a:ac,bridge_name='br-int',has_traffic_filtering=True,id=5aa7b353-4b8f-49f4-b051-3e3f1b1135af,network=Network(563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa7b353-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.083 182717 DEBUG os_vif [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:7a:ac,bridge_name='br-int',has_traffic_filtering=True,id=5aa7b353-4b8f-49f4-b051-3e3f1b1135af,network=Network(563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa7b353-4b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.084 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.084 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.085 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.087 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.088 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5aa7b353-4b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.088 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5aa7b353-4b, col_values=(('external_ids', {'iface-id': '5aa7b353-4b8f-49f4-b051-3e3f1b1135af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:7a:ac', 'vm-uuid': '6888ddb7-c373-48a1-bc4c-7e1c44cece29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.089 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-1 NetworkManager[54952]: <info>  [1769040006.0904] manager: (tap5aa7b353-4b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.092 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.097 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.098 182717 INFO os_vif [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:7a:ac,bridge_name='br-int',has_traffic_filtering=True,id=5aa7b353-4b8f-49f4-b051-3e3f1b1135af,network=Network(563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa7b353-4b')
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.235 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.247 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.248 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.248 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] No VIF found with MAC fa:16:3e:35:38:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.248 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] No VIF found with MAC fa:16:3e:ff:7a:ac, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:00:06 compute-1 nova_compute[182713]: 2026-01-22 00:00:06.249 182717 INFO nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Using config drive
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.168 182717 INFO nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Creating config drive at /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk.config
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.177 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpskfaatzb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.327 182717 DEBUG oslo_concurrency.processutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpskfaatzb" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.379 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:07 compute-1 NetworkManager[54952]: <info>  [1769040007.3888] manager: (tap21117406-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/151)
Jan 22 00:00:07 compute-1 kernel: tap21117406-f2: entered promiscuous mode
Jan 22 00:00:07 compute-1 systemd-udevd[222890]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.444 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:07 compute-1 ovn_controller[94841]: 2026-01-22T00:00:07Z|00309|binding|INFO|Claiming lport 21117406-f212-49b3-a849-2a3d9a544b64 for this chassis.
Jan 22 00:00:07 compute-1 ovn_controller[94841]: 2026-01-22T00:00:07Z|00310|binding|INFO|21117406-f212-49b3-a849-2a3d9a544b64: Claiming fa:16:3e:35:38:fd 10.100.0.157
Jan 22 00:00:07 compute-1 NetworkManager[54952]: <info>  [1769040007.4529] manager: (tap5aa7b353-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Jan 22 00:00:07 compute-1 systemd-udevd[222895]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:00:07 compute-1 NetworkManager[54952]: <info>  [1769040007.4646] device (tap21117406-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:00:07 compute-1 NetworkManager[54952]: <info>  [1769040007.4655] device (tap21117406-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.467 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:38:fd 10.100.0.157'], port_security=['fa:16:3e:35:38:fd 10.100.0.157'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.157/24', 'neutron:device_id': '6888ddb7-c373-48a1-bc4c-7e1c44cece29', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88446c98-877a-4464-946b-7d73337856db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ca9fa796-afc7-4732-b32a-7e6314132071', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80a6b551-f08c-4e74-9b7e-31150b2972b7, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=21117406-f212-49b3-a849-2a3d9a544b64) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.469 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 21117406-f212-49b3-a849-2a3d9a544b64 in datapath 88446c98-877a-4464-946b-7d73337856db bound to our chassis
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.471 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88446c98-877a-4464-946b-7d73337856db
Jan 22 00:00:07 compute-1 kernel: tap5aa7b353-4b: entered promiscuous mode
Jan 22 00:00:07 compute-1 NetworkManager[54952]: <info>  [1769040007.4794] device (tap5aa7b353-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:00:07 compute-1 NetworkManager[54952]: <info>  [1769040007.4809] device (tap5aa7b353-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:00:07 compute-1 ovn_controller[94841]: 2026-01-22T00:00:07Z|00311|binding|INFO|Claiming lport 5aa7b353-4b8f-49f4-b051-3e3f1b1135af for this chassis.
Jan 22 00:00:07 compute-1 ovn_controller[94841]: 2026-01-22T00:00:07Z|00312|binding|INFO|5aa7b353-4b8f-49f4-b051-3e3f1b1135af: Claiming fa:16:3e:ff:7a:ac 10.100.1.43
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.481 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:07 compute-1 ovn_controller[94841]: 2026-01-22T00:00:07Z|00313|binding|INFO|Setting lport 21117406-f212-49b3-a849-2a3d9a544b64 ovn-installed in OVS
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.484 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1a19aaa1-84b0-4ce7-8c2c-5e2a451b276f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.485 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88446c98-81 in ovnmeta-88446c98-877a-4464-946b-7d73337856db namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.489 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:07 compute-1 ovn_controller[94841]: 2026-01-22T00:00:07Z|00314|binding|INFO|Setting lport 21117406-f212-49b3-a849-2a3d9a544b64 up in Southbound
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.490 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:7a:ac 10.100.1.43'], port_security=['fa:16:3e:ff:7a:ac 10.100.1.43'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.43/24', 'neutron:device_id': '6888ddb7-c373-48a1-bc4c-7e1c44cece29', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ca9fa796-afc7-4732-b32a-7e6314132071', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d0cd14cf-f667-40ea-aa65-c2e1579b74f4, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=5aa7b353-4b8f-49f4-b051-3e3f1b1135af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.488 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88446c98-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.489 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[12fc008f-4153-46a3-91e7-38023f960695]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.492 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[85655c61-2eb8-4bb5-8e8b-7d8e8a69a7bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 systemd-machined[153970]: New machine qemu-37-instance-00000050.
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.508 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[b27e2283-42dd-40fc-8fad-21e2283934f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_controller[94841]: 2026-01-22T00:00:07Z|00315|binding|INFO|Setting lport 5aa7b353-4b8f-49f4-b051-3e3f1b1135af ovn-installed in OVS
Jan 22 00:00:07 compute-1 ovn_controller[94841]: 2026-01-22T00:00:07Z|00316|binding|INFO|Setting lport 5aa7b353-4b8f-49f4-b051-3e3f1b1135af up in Southbound
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.517 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:07 compute-1 systemd[1]: Started Virtual Machine qemu-37-instance-00000050.
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.540 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8db936ed-7962-4587-adf2-37a182c23856]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.577 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd20612-2f11-4237-a5f4-50ddde017879]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.583 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[40f29028-6824-4fa1-9fb8-760cefe895aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 NetworkManager[54952]: <info>  [1769040007.5843] manager: (tap88446c98-80): new Veth device (/org/freedesktop/NetworkManager/Devices/153)
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.621 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0f150e8f-3cb3-4298-a8d9-f80047b09d4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.624 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[187f1777-e1cb-4ef8-be8a-02cf9dab77e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 NetworkManager[54952]: <info>  [1769040007.6502] device (tap88446c98-80): carrier: link connected
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.659 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[68d0e4b3-a37a-4676-b591-0bf39e5610ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.678 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[25b11c8f-e569-433c-830c-22b09b442c4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88446c98-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:a2:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 458030, 'reachable_time': 17093, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222931, 'error': None, 'target': 'ovnmeta-88446c98-877a-4464-946b-7d73337856db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.695 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e71c4f9c-4ef8-4886-b344-423e94efa2ea]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:a25a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 458030, 'tstamp': 458030}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222932, 'error': None, 'target': 'ovnmeta-88446c98-877a-4464-946b-7d73337856db', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.712 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a395d659-e918-445c-bb78-0b24164a6f92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88446c98-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:a2:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 458030, 'reachable_time': 17093, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222933, 'error': None, 'target': 'ovnmeta-88446c98-877a-4464-946b-7d73337856db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.751 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f8097678-6bd3-4d42-98dd-a7bab4bdf691]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.831 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[01fd5b34-c88d-4358-adac-7c3b305908c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.832 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88446c98-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.833 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.833 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88446c98-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.835 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:07 compute-1 kernel: tap88446c98-80: entered promiscuous mode
Jan 22 00:00:07 compute-1 NetworkManager[54952]: <info>  [1769040007.8367] manager: (tap88446c98-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.839 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.842 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88446c98-80, col_values=(('external_ids', {'iface-id': 'fea32043-076f-4c8c-9cfc-fb1a5ca807c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:07 compute-1 ovn_controller[94841]: 2026-01-22T00:00:07Z|00317|binding|INFO|Releasing lport fea32043-076f-4c8c-9cfc-fb1a5ca807c2 from this chassis (sb_readonly=0)
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.844 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.846 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88446c98-877a-4464-946b-7d73337856db.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88446c98-877a-4464-946b-7d73337856db.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.847 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ae1f6e46-c6b1-49eb-965e-454812065fa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.849 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-88446c98-877a-4464-946b-7d73337856db
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/88446c98-877a-4464-946b-7d73337856db.pid.haproxy
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 88446c98-877a-4464-946b-7d73337856db
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:00:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:07.850 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88446c98-877a-4464-946b-7d73337856db', 'env', 'PROCESS_TAG=haproxy-88446c98-877a-4464-946b-7d73337856db', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88446c98-877a-4464-946b-7d73337856db.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.860 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.927 182717 DEBUG nova.compute.manager [req-837bae1d-b77f-4377-8b5b-a4264a078bc6 req-b3198b3e-1ea0-4b23-83e7-c5af884d11f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-plugged-21117406-f212-49b3-a849-2a3d9a544b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.928 182717 DEBUG oslo_concurrency.lockutils [req-837bae1d-b77f-4377-8b5b-a4264a078bc6 req-b3198b3e-1ea0-4b23-83e7-c5af884d11f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.929 182717 DEBUG oslo_concurrency.lockutils [req-837bae1d-b77f-4377-8b5b-a4264a078bc6 req-b3198b3e-1ea0-4b23-83e7-c5af884d11f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.929 182717 DEBUG oslo_concurrency.lockutils [req-837bae1d-b77f-4377-8b5b-a4264a078bc6 req-b3198b3e-1ea0-4b23-83e7-c5af884d11f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.929 182717 DEBUG nova.compute.manager [req-837bae1d-b77f-4377-8b5b-a4264a078bc6 req-b3198b3e-1ea0-4b23-83e7-c5af884d11f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Processing event network-vif-plugged-21117406-f212-49b3-a849-2a3d9a544b64 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.951 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040007.9514544, 6888ddb7-c373-48a1-bc4c-7e1c44cece29 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.952 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] VM Started (Lifecycle Event)
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.977 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.982 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040007.9516525, 6888ddb7-c373-48a1-bc4c-7e1c44cece29 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:07 compute-1 nova_compute[182713]: 2026-01-22 00:00:07.983 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] VM Paused (Lifecycle Event)
Jan 22 00:00:08 compute-1 nova_compute[182713]: 2026-01-22 00:00:08.013 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:08 compute-1 nova_compute[182713]: 2026-01-22 00:00:08.017 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:00:08 compute-1 nova_compute[182713]: 2026-01-22 00:00:08.041 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:00:08 compute-1 nova_compute[182713]: 2026-01-22 00:00:08.102 182717 DEBUG nova.network.neutron [req-098484b5-bf3e-48a6-aa5b-9165ee5d1588 req-31275f2e-30e5-4712-baad-76b47b133327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Updated VIF entry in instance network info cache for port 5aa7b353-4b8f-49f4-b051-3e3f1b1135af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:00:08 compute-1 nova_compute[182713]: 2026-01-22 00:00:08.103 182717 DEBUG nova.network.neutron [req-098484b5-bf3e-48a6-aa5b-9165ee5d1588 req-31275f2e-30e5-4712-baad-76b47b133327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Updating instance_info_cache with network_info: [{"id": "21117406-f212-49b3-a849-2a3d9a544b64", "address": "fa:16:3e:35:38:fd", "network": {"id": "88446c98-877a-4464-946b-7d73337856db", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1586215748", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21117406-f2", "ovs_interfaceid": "21117406-f212-49b3-a849-2a3d9a544b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "address": "fa:16:3e:ff:7a:ac", "network": {"id": "563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1652397250", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa7b353-4b", "ovs_interfaceid": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:00:08 compute-1 nova_compute[182713]: 2026-01-22 00:00:08.155 182717 DEBUG oslo_concurrency.lockutils [req-098484b5-bf3e-48a6-aa5b-9165ee5d1588 req-31275f2e-30e5-4712-baad-76b47b133327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6888ddb7-c373-48a1-bc4c-7e1c44cece29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:00:08 compute-1 podman[222972]: 2026-01-22 00:00:08.282550358 +0000 UTC m=+0.077714439 container create 6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 00:00:08 compute-1 systemd[1]: Started libpod-conmon-6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072.scope.
Jan 22 00:00:08 compute-1 podman[222972]: 2026-01-22 00:00:08.248008122 +0000 UTC m=+0.043172283 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:00:08 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:00:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f06098861529c33fa8177310b4a5125a6b218c1ac79b6e44171850807f25951c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:00:08 compute-1 podman[222972]: 2026-01-22 00:00:08.366692473 +0000 UTC m=+0.161856544 container init 6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:00:08 compute-1 podman[222972]: 2026-01-22 00:00:08.373460101 +0000 UTC m=+0.168624172 container start 6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:00:08 compute-1 neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db[222987]: [NOTICE]   (222991) : New worker (222993) forked
Jan 22 00:00:08 compute-1 neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db[222987]: [NOTICE]   (222991) : Loading success.
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.448 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 5aa7b353-4b8f-49f4-b051-3e3f1b1135af in datapath 563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4 unbound from our chassis
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.450 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.464 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[19f3949a-875f-411c-9bfa-1d0c4fa87cbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.465 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap563d1a6c-e1 in ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.468 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap563d1a6c-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.468 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e5512373-d83d-4139-ae2b-ef5813cfdfec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.469 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[67b41012-6130-4644-82a0-5d609e5f70e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.484 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[40dfd8a8-ce4a-489e-8c8c-a3525228c1a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.504 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[be044454-e417-4498-8112-64495976797c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.548 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b6699af2-2c93-409c-ac71-cf02fa21d429]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.554 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[03f9e42d-1b53-4e72-943c-2f796feea9a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 NetworkManager[54952]: <info>  [1769040008.5557] manager: (tap563d1a6c-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/155)
Jan 22 00:00:08 compute-1 systemd-udevd[222920]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.609 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[76218082-fcb6-4563-bd6c-a58a4ed8d2b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.613 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b90e696e-d626-491f-8f51-5d449b814e93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 NetworkManager[54952]: <info>  [1769040008.6400] device (tap563d1a6c-e0): carrier: link connected
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.648 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[8db9edd7-9afc-4131-9618-3d10d84640ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.673 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[305ec485-ff86-41e0-9605-fd5d37304e8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap563d1a6c-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d2:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 458129, 'reachable_time': 42129, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223012, 'error': None, 'target': 'ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.693 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bdbdf75f-364e-4dd9-bc6e-58d7ee7358ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecc:d225'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 458129, 'tstamp': 458129}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223013, 'error': None, 'target': 'ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.718 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[445e3f92-90c0-456d-b053-7354611d853b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap563d1a6c-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d2:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 458129, 'reachable_time': 42129, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223014, 'error': None, 'target': 'ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.762 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9a887653-8b33-4562-88f2-5e2b7392a4d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.854 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[928e6684-fab1-4a9a-b9df-2ca2df2d6f55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.856 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap563d1a6c-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:08 compute-1 kernel: tap563d1a6c-e0: entered promiscuous mode
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.858 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:00:08 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.858 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap563d1a6c-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:08 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:00:08 compute-1 nova_compute[182713]: 2026-01-22 00:00:08.860 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:08 compute-1 NetworkManager[54952]: <info>  [1769040008.8617] manager: (tap563d1a6c-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Jan 22 00:00:08 compute-1 nova_compute[182713]: 2026-01-22 00:00:08.864 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.867 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap563d1a6c-e0, col_values=(('external_ids', {'iface-id': '719ccac6-878e-4b2e-a15c-96b44cf29708'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:08 compute-1 nova_compute[182713]: 2026-01-22 00:00:08.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:08 compute-1 ovn_controller[94841]: 2026-01-22T00:00:08Z|00318|binding|INFO|Releasing lport 719ccac6-878e-4b2e-a15c-96b44cf29708 from this chassis (sb_readonly=0)
Jan 22 00:00:08 compute-1 nova_compute[182713]: 2026-01-22 00:00:08.896 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.898 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.899 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4c20cdb8-4099-424b-a7a6-9d3fe57cca61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.900 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4.pid.haproxy
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:00:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:08.901 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4', 'env', 'PROCESS_TAG=haproxy-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:00:09 compute-1 podman[223047]: 2026-01-22 00:00:09.335044317 +0000 UTC m=+0.080390081 container create a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:00:09 compute-1 podman[223047]: 2026-01-22 00:00:09.286289283 +0000 UTC m=+0.031635087 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:00:09 compute-1 systemd[1]: Started libpod-conmon-a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc.scope.
Jan 22 00:00:09 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:00:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/108a82e18556ea4336b7fff6330975e78f1b8ffb9d6e889c5c0ccaa3928dc41f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:00:09 compute-1 podman[223047]: 2026-01-22 00:00:09.447892206 +0000 UTC m=+0.193237930 container init a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 00:00:09 compute-1 podman[223047]: 2026-01-22 00:00:09.454661616 +0000 UTC m=+0.200007340 container start a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:00:09 compute-1 neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4[223069]: [NOTICE]   (223099) : New worker (223115) forked
Jan 22 00:00:09 compute-1 neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4[223069]: [NOTICE]   (223099) : Loading success.
Jan 22 00:00:09 compute-1 podman[223063]: 2026-01-22 00:00:09.4849523 +0000 UTC m=+0.072677883 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:00:09 compute-1 podman[223060]: 2026-01-22 00:00:09.515741839 +0000 UTC m=+0.129660749 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:00:11 compute-1 nova_compute[182713]: 2026-01-22 00:00:11.092 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:11 compute-1 nova_compute[182713]: 2026-01-22 00:00:11.236 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.575 182717 DEBUG nova.compute.manager [req-9a28581f-d33b-4ae9-a0e9-498ea63f40a5 req-07278237-b523-4e76-be8d-a880a6508788 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-plugged-21117406-f212-49b3-a849-2a3d9a544b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.575 182717 DEBUG oslo_concurrency.lockutils [req-9a28581f-d33b-4ae9-a0e9-498ea63f40a5 req-07278237-b523-4e76-be8d-a880a6508788 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.576 182717 DEBUG oslo_concurrency.lockutils [req-9a28581f-d33b-4ae9-a0e9-498ea63f40a5 req-07278237-b523-4e76-be8d-a880a6508788 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.576 182717 DEBUG oslo_concurrency.lockutils [req-9a28581f-d33b-4ae9-a0e9-498ea63f40a5 req-07278237-b523-4e76-be8d-a880a6508788 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.576 182717 DEBUG nova.compute.manager [req-9a28581f-d33b-4ae9-a0e9-498ea63f40a5 req-07278237-b523-4e76-be8d-a880a6508788 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] No event matching network-vif-plugged-21117406-f212-49b3-a849-2a3d9a544b64 in dict_keys([('network-vif-plugged', '5aa7b353-4b8f-49f4-b051-3e3f1b1135af')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.577 182717 WARNING nova.compute.manager [req-9a28581f-d33b-4ae9-a0e9-498ea63f40a5 req-07278237-b523-4e76-be8d-a880a6508788 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received unexpected event network-vif-plugged-21117406-f212-49b3-a849-2a3d9a544b64 for instance with vm_state building and task_state spawning.
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.665 182717 DEBUG nova.compute.manager [req-64f10eae-869d-4741-a0a0-4644306026ef req-54779c22-eeef-42a8-8a0a-ee6cbee74167 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-plugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.666 182717 DEBUG oslo_concurrency.lockutils [req-64f10eae-869d-4741-a0a0-4644306026ef req-54779c22-eeef-42a8-8a0a-ee6cbee74167 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.666 182717 DEBUG oslo_concurrency.lockutils [req-64f10eae-869d-4741-a0a0-4644306026ef req-54779c22-eeef-42a8-8a0a-ee6cbee74167 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.667 182717 DEBUG oslo_concurrency.lockutils [req-64f10eae-869d-4741-a0a0-4644306026ef req-54779c22-eeef-42a8-8a0a-ee6cbee74167 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.667 182717 DEBUG nova.compute.manager [req-64f10eae-869d-4741-a0a0-4644306026ef req-54779c22-eeef-42a8-8a0a-ee6cbee74167 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Processing event network-vif-plugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.669 182717 DEBUG nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Instance event wait completed in 5 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.675 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040013.6747057, 6888ddb7-c373-48a1-bc4c-7e1c44cece29 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.675 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] VM Resumed (Lifecycle Event)
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.678 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.683 182717 INFO nova.virt.libvirt.driver [-] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Instance spawned successfully.
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.684 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.701 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.712 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.719 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.720 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.721 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.722 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.723 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.724 182717 DEBUG nova.virt.libvirt.driver [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.734 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.841 182717 INFO nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Took 16.09 seconds to spawn the instance on the hypervisor.
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.842 182717 DEBUG nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:13 compute-1 nova_compute[182713]: 2026-01-22 00:00:13.997 182717 INFO nova.compute.manager [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Took 16.76 seconds to build instance.
Jan 22 00:00:14 compute-1 nova_compute[182713]: 2026-01-22 00:00:14.034 182717 DEBUG oslo_concurrency.lockutils [None req-dd87d052-ec21-4f35-9d2b-748f4aba8186 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:15 compute-1 nova_compute[182713]: 2026-01-22 00:00:15.250 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040000.2489328, 30c7da24-de00-4067-a5d2-f36ad21391c5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:15 compute-1 nova_compute[182713]: 2026-01-22 00:00:15.251 182717 INFO nova.compute.manager [-] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] VM Stopped (Lifecycle Event)
Jan 22 00:00:15 compute-1 nova_compute[182713]: 2026-01-22 00:00:15.291 182717 DEBUG nova.compute.manager [None req-d6a7dc77-42b7-46e6-ba43-21c13021519a - - - - - -] [instance: 30c7da24-de00-4067-a5d2-f36ad21391c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:15 compute-1 podman[223127]: 2026-01-22 00:00:15.560203304 +0000 UTC m=+0.048664651 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:00:15 compute-1 podman[223126]: 2026-01-22 00:00:15.590068035 +0000 UTC m=+0.082627469 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 22 00:00:15 compute-1 nova_compute[182713]: 2026-01-22 00:00:15.863 182717 DEBUG nova.compute.manager [req-3cbba574-9374-4246-b144-f0425e716ee5 req-1e88abfd-b831-4ac5-b8e9-96d8281584d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-plugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:15 compute-1 nova_compute[182713]: 2026-01-22 00:00:15.864 182717 DEBUG oslo_concurrency.lockutils [req-3cbba574-9374-4246-b144-f0425e716ee5 req-1e88abfd-b831-4ac5-b8e9-96d8281584d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:15 compute-1 nova_compute[182713]: 2026-01-22 00:00:15.864 182717 DEBUG oslo_concurrency.lockutils [req-3cbba574-9374-4246-b144-f0425e716ee5 req-1e88abfd-b831-4ac5-b8e9-96d8281584d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:15 compute-1 nova_compute[182713]: 2026-01-22 00:00:15.865 182717 DEBUG oslo_concurrency.lockutils [req-3cbba574-9374-4246-b144-f0425e716ee5 req-1e88abfd-b831-4ac5-b8e9-96d8281584d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:15 compute-1 nova_compute[182713]: 2026-01-22 00:00:15.865 182717 DEBUG nova.compute.manager [req-3cbba574-9374-4246-b144-f0425e716ee5 req-1e88abfd-b831-4ac5-b8e9-96d8281584d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] No waiting events found dispatching network-vif-plugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:15 compute-1 nova_compute[182713]: 2026-01-22 00:00:15.865 182717 WARNING nova.compute.manager [req-3cbba574-9374-4246-b144-f0425e716ee5 req-1e88abfd-b831-4ac5-b8e9-96d8281584d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received unexpected event network-vif-plugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af for instance with vm_state active and task_state None.
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.098 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.238 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.653 182717 DEBUG oslo_concurrency.lockutils [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.654 182717 DEBUG oslo_concurrency.lockutils [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.655 182717 DEBUG oslo_concurrency.lockutils [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.655 182717 DEBUG oslo_concurrency.lockutils [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.656 182717 DEBUG oslo_concurrency.lockutils [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.673 182717 INFO nova.compute.manager [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Terminating instance
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.691 182717 DEBUG nova.compute.manager [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:00:16 compute-1 kernel: tap21117406-f2 (unregistering): left promiscuous mode
Jan 22 00:00:16 compute-1 NetworkManager[54952]: <info>  [1769040016.7190] device (tap21117406-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:00:16 compute-1 ovn_controller[94841]: 2026-01-22T00:00:16Z|00319|binding|INFO|Releasing lport 21117406-f212-49b3-a849-2a3d9a544b64 from this chassis (sb_readonly=0)
Jan 22 00:00:16 compute-1 ovn_controller[94841]: 2026-01-22T00:00:16Z|00320|binding|INFO|Setting lport 21117406-f212-49b3-a849-2a3d9a544b64 down in Southbound
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.729 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:16 compute-1 ovn_controller[94841]: 2026-01-22T00:00:16Z|00321|binding|INFO|Removing iface tap21117406-f2 ovn-installed in OVS
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.733 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.747 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:16 compute-1 kernel: tap5aa7b353-4b (unregistering): left promiscuous mode
Jan 22 00:00:16 compute-1 NetworkManager[54952]: <info>  [1769040016.7580] device (tap5aa7b353-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:00:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:16.761 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:38:fd 10.100.0.157'], port_security=['fa:16:3e:35:38:fd 10.100.0.157'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.157/24', 'neutron:device_id': '6888ddb7-c373-48a1-bc4c-7e1c44cece29', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88446c98-877a-4464-946b-7d73337856db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ca9fa796-afc7-4732-b32a-7e6314132071', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80a6b551-f08c-4e74-9b7e-31150b2972b7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=21117406-f212-49b3-a849-2a3d9a544b64) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:16.763 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 21117406-f212-49b3-a849-2a3d9a544b64 in datapath 88446c98-877a-4464-946b-7d73337856db unbound from our chassis
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.759 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:16.764 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88446c98-877a-4464-946b-7d73337856db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:00:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:16.766 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7e87f10b-5259-4e9e-9d19-011953748629]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:16.767 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88446c98-877a-4464-946b-7d73337856db namespace which is not needed anymore
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.769 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:16 compute-1 ovn_controller[94841]: 2026-01-22T00:00:16Z|00322|binding|INFO|Releasing lport 5aa7b353-4b8f-49f4-b051-3e3f1b1135af from this chassis (sb_readonly=0)
Jan 22 00:00:16 compute-1 ovn_controller[94841]: 2026-01-22T00:00:16Z|00323|binding|INFO|Setting lport 5aa7b353-4b8f-49f4-b051-3e3f1b1135af down in Southbound
Jan 22 00:00:16 compute-1 ovn_controller[94841]: 2026-01-22T00:00:16Z|00324|binding|INFO|Removing iface tap5aa7b353-4b ovn-installed in OVS
Jan 22 00:00:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:16.775 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:7a:ac 10.100.1.43'], port_security=['fa:16:3e:ff:7a:ac 10.100.1.43'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.43/24', 'neutron:device_id': '6888ddb7-c373-48a1-bc4c-7e1c44cece29', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '975703700f9d42c5a1daa32f5e61f6f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ca9fa796-afc7-4732-b32a-7e6314132071', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d0cd14cf-f667-40ea-aa65-c2e1579b74f4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=5aa7b353-4b8f-49f4-b051-3e3f1b1135af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.793 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:16 compute-1 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000050.scope: Deactivated successfully.
Jan 22 00:00:16 compute-1 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000050.scope: Consumed 3.488s CPU time.
Jan 22 00:00:16 compute-1 systemd-machined[153970]: Machine qemu-37-instance-00000050 terminated.
Jan 22 00:00:16 compute-1 ovn_controller[94841]: 2026-01-22T00:00:16Z|00325|binding|INFO|Releasing lport fea32043-076f-4c8c-9cfc-fb1a5ca807c2 from this chassis (sb_readonly=0)
Jan 22 00:00:16 compute-1 ovn_controller[94841]: 2026-01-22T00:00:16Z|00326|binding|INFO|Releasing lport 719ccac6-878e-4b2e-a15c-96b44cf29708 from this chassis (sb_readonly=0)
Jan 22 00:00:16 compute-1 nova_compute[182713]: 2026-01-22 00:00:16.856 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:16 compute-1 neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db[222987]: [NOTICE]   (222991) : haproxy version is 2.8.14-c23fe91
Jan 22 00:00:16 compute-1 neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db[222987]: [NOTICE]   (222991) : path to executable is /usr/sbin/haproxy
Jan 22 00:00:16 compute-1 neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db[222987]: [WARNING]  (222991) : Exiting Master process...
Jan 22 00:00:16 compute-1 neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db[222987]: [WARNING]  (222991) : Exiting Master process...
Jan 22 00:00:16 compute-1 neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db[222987]: [ALERT]    (222991) : Current worker (222993) exited with code 143 (Terminated)
Jan 22 00:00:16 compute-1 neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db[222987]: [WARNING]  (222991) : All workers exited. Exiting... (0)
Jan 22 00:00:16 compute-1 systemd[1]: libpod-6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072.scope: Deactivated successfully.
Jan 22 00:00:16 compute-1 podman[223197]: 2026-01-22 00:00:16.932642771 +0000 UTC m=+0.069728672 container died 6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:00:16 compute-1 NetworkManager[54952]: <info>  [1769040016.9598] manager: (tap21117406-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/157)
Jan 22 00:00:16 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072-userdata-shm.mount: Deactivated successfully.
Jan 22 00:00:17 compute-1 NetworkManager[54952]: <info>  [1769040017.0113] manager: (tap5aa7b353-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Jan 22 00:00:17 compute-1 systemd[1]: var-lib-containers-storage-overlay-f06098861529c33fa8177310b4a5125a6b218c1ac79b6e44171850807f25951c-merged.mount: Deactivated successfully.
Jan 22 00:00:17 compute-1 podman[223197]: 2026-01-22 00:00:17.042155518 +0000 UTC m=+0.179241449 container cleanup 6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:00:17 compute-1 systemd[1]: libpod-conmon-6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072.scope: Deactivated successfully.
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.071 182717 INFO nova.virt.libvirt.driver [-] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Instance destroyed successfully.
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.071 182717 DEBUG nova.objects.instance [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lazy-loading 'resources' on Instance uuid 6888ddb7-c373-48a1-bc4c-7e1c44cece29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.091 182717 DEBUG nova.virt.libvirt.vif [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-718536947',display_name='tempest-ServersTestMultiNic-server-718536947',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-718536947',id=80,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:00:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-kh5n56k6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:00:13Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=6888ddb7-c373-48a1-bc4c-7e1c44cece29,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "21117406-f212-49b3-a849-2a3d9a544b64", "address": "fa:16:3e:35:38:fd", "network": {"id": "88446c98-877a-4464-946b-7d73337856db", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1586215748", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21117406-f2", "ovs_interfaceid": "21117406-f212-49b3-a849-2a3d9a544b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.092 182717 DEBUG nova.network.os_vif_util [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "21117406-f212-49b3-a849-2a3d9a544b64", "address": "fa:16:3e:35:38:fd", "network": {"id": "88446c98-877a-4464-946b-7d73337856db", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1586215748", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21117406-f2", "ovs_interfaceid": "21117406-f212-49b3-a849-2a3d9a544b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.093 182717 DEBUG nova.network.os_vif_util [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=21117406-f212-49b3-a849-2a3d9a544b64,network=Network(88446c98-877a-4464-946b-7d73337856db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21117406-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.093 182717 DEBUG os_vif [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=21117406-f212-49b3-a849-2a3d9a544b64,network=Network(88446c98-877a-4464-946b-7d73337856db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21117406-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.096 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.096 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap21117406-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:17 compute-1 ovn_controller[94841]: 2026-01-22T00:00:17Z|00327|binding|INFO|Releasing lport fea32043-076f-4c8c-9cfc-fb1a5ca807c2 from this chassis (sb_readonly=0)
Jan 22 00:00:17 compute-1 ovn_controller[94841]: 2026-01-22T00:00:17Z|00328|binding|INFO|Releasing lport 719ccac6-878e-4b2e-a15c-96b44cf29708 from this chassis (sb_readonly=0)
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.099 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.103 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.106 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.107 182717 INFO os_vif [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=21117406-f212-49b3-a849-2a3d9a544b64,network=Network(88446c98-877a-4464-946b-7d73337856db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21117406-f2')
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.108 182717 DEBUG nova.virt.libvirt.vif [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-718536947',display_name='tempest-ServersTestMultiNic-server-718536947',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-718536947',id=80,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:00:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='975703700f9d42c5a1daa32f5e61f6f2',ramdisk_id='',reservation_id='r-kh5n56k6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-672631386',owner_user_name='tempest-ServersTestMultiNic-672631386-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:00:13Z,user_data=None,user_id='1931e691804246e3bb3ac03a95a74d93',uuid=6888ddb7-c373-48a1-bc4c-7e1c44cece29,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "address": "fa:16:3e:ff:7a:ac", "network": {"id": "563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1652397250", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa7b353-4b", "ovs_interfaceid": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.108 182717 DEBUG nova.network.os_vif_util [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converting VIF {"id": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "address": "fa:16:3e:ff:7a:ac", "network": {"id": "563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1652397250", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa7b353-4b", "ovs_interfaceid": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.109 182717 DEBUG nova.network.os_vif_util [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:7a:ac,bridge_name='br-int',has_traffic_filtering=True,id=5aa7b353-4b8f-49f4-b051-3e3f1b1135af,network=Network(563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa7b353-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.109 182717 DEBUG os_vif [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:7a:ac,bridge_name='br-int',has_traffic_filtering=True,id=5aa7b353-4b8f-49f4-b051-3e3f1b1135af,network=Network(563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa7b353-4b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.110 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.111 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5aa7b353-4b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.114 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.116 182717 INFO os_vif [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:7a:ac,bridge_name='br-int',has_traffic_filtering=True,id=5aa7b353-4b8f-49f4-b051-3e3f1b1135af,network=Network(563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa7b353-4b')
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.117 182717 INFO nova.virt.libvirt.driver [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Deleting instance files /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29_del
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.118 182717 INFO nova.virt.libvirt.driver [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Deletion of /var/lib/nova/instances/6888ddb7-c373-48a1-bc4c-7e1c44cece29_del complete
Jan 22 00:00:17 compute-1 podman[223242]: 2026-01-22 00:00:17.121261598 +0000 UTC m=+0.047506516 container remove 6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.129 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[defec868-c4d2-4728-81a4-a9f2c1ccf18c]: (4, ('Thu Jan 22 12:00:16 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db (6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072)\n6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072\nThu Jan 22 12:00:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88446c98-877a-4464-946b-7d73337856db (6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072)\n6cc4633bb4c18ea6d43512335a6781ee394b8b92c6cb4a5d92bb1cec92037072\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.132 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ef85ff01-e566-4548-b22b-66eea9cbac95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.133 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88446c98-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.135 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-1 kernel: tap88446c98-80: left promiscuous mode
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.145 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.150 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ca1d4176-61ea-4bc2-a3ec-91dc7d63df8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.173 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5d59b555-ec66-4a17-aa80-839a26fe0475]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.174 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[818bc9e5-18c9-4fe4-bdd7-65f27dfa3010]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.198 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[165295c4-832e-4ee5-a091-45d6daf3f2dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 458022, 'reachable_time': 31422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223264, 'error': None, 'target': 'ovnmeta-88446c98-877a-4464-946b-7d73337856db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.201 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88446c98-877a-4464-946b-7d73337856db deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.201 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[0075d392-a4e5-4b23-b226-3bd98d041469]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.202 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 5aa7b353-4b8f-49f4-b051-3e3f1b1135af in datapath 563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4 unbound from our chassis
Jan 22 00:00:17 compute-1 systemd[1]: run-netns-ovnmeta\x2d88446c98\x2d877a\x2d4464\x2d946b\x2d7d73337856db.mount: Deactivated successfully.
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.204 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.205 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0a60523c-8377-4d0c-a06c-4796ef43a294]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.206 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4 namespace which is not needed anymore
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.242 182717 INFO nova.compute.manager [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Took 0.55 seconds to destroy the instance on the hypervisor.
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.243 182717 DEBUG oslo.service.loopingcall [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.243 182717 DEBUG nova.compute.manager [-] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.243 182717 DEBUG nova.network.neutron [-] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.332 182717 DEBUG nova.compute.manager [req-f46486b8-0967-4824-ae9f-e9a7990d55b6 req-c636fb4c-f385-4486-a150-18d9ba068796 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-unplugged-21117406-f212-49b3-a849-2a3d9a544b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.333 182717 DEBUG oslo_concurrency.lockutils [req-f46486b8-0967-4824-ae9f-e9a7990d55b6 req-c636fb4c-f385-4486-a150-18d9ba068796 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.334 182717 DEBUG oslo_concurrency.lockutils [req-f46486b8-0967-4824-ae9f-e9a7990d55b6 req-c636fb4c-f385-4486-a150-18d9ba068796 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.334 182717 DEBUG oslo_concurrency.lockutils [req-f46486b8-0967-4824-ae9f-e9a7990d55b6 req-c636fb4c-f385-4486-a150-18d9ba068796 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.335 182717 DEBUG nova.compute.manager [req-f46486b8-0967-4824-ae9f-e9a7990d55b6 req-c636fb4c-f385-4486-a150-18d9ba068796 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] No waiting events found dispatching network-vif-unplugged-21117406-f212-49b3-a849-2a3d9a544b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.335 182717 DEBUG nova.compute.manager [req-f46486b8-0967-4824-ae9f-e9a7990d55b6 req-c636fb4c-f385-4486-a150-18d9ba068796 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-unplugged-21117406-f212-49b3-a849-2a3d9a544b64 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:00:17 compute-1 neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4[223069]: [NOTICE]   (223099) : haproxy version is 2.8.14-c23fe91
Jan 22 00:00:17 compute-1 neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4[223069]: [NOTICE]   (223099) : path to executable is /usr/sbin/haproxy
Jan 22 00:00:17 compute-1 neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4[223069]: [WARNING]  (223099) : Exiting Master process...
Jan 22 00:00:17 compute-1 neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4[223069]: [ALERT]    (223099) : Current worker (223115) exited with code 143 (Terminated)
Jan 22 00:00:17 compute-1 neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4[223069]: [WARNING]  (223099) : All workers exited. Exiting... (0)
Jan 22 00:00:17 compute-1 systemd[1]: libpod-a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc.scope: Deactivated successfully.
Jan 22 00:00:17 compute-1 podman[223282]: 2026-01-22 00:00:17.373956261 +0000 UTC m=+0.048069514 container died a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 00:00:17 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc-userdata-shm.mount: Deactivated successfully.
Jan 22 00:00:17 compute-1 systemd[1]: var-lib-containers-storage-overlay-108a82e18556ea4336b7fff6330975e78f1b8ffb9d6e889c5c0ccaa3928dc41f-merged.mount: Deactivated successfully.
Jan 22 00:00:17 compute-1 podman[223282]: 2026-01-22 00:00:17.414210223 +0000 UTC m=+0.088323446 container cleanup a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 00:00:17 compute-1 systemd[1]: libpod-conmon-a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc.scope: Deactivated successfully.
Jan 22 00:00:17 compute-1 podman[223309]: 2026-01-22 00:00:17.487783322 +0000 UTC m=+0.052307545 container remove a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.494 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[874380e5-72c3-420c-b1a2-9c2bb8ad755c]: (4, ('Thu Jan 22 12:00:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4 (a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc)\na918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc\nThu Jan 22 12:00:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4 (a918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc)\na918a505cd5bfe45981d7a2ed11156d71ea7536e63a3e026383539c103bc9abc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.496 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2d1fe139-d42e-4d8a-b867-3c660062329f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.497 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap563d1a6c-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.499 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-1 kernel: tap563d1a6c-e0: left promiscuous mode
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.514 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.518 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[375488d8-d888-4c3f-9883-5d2f1a80cbd8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.530 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6ca7f459-ed4b-41ed-a0f4-8bad865468e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.531 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[650ed372-b0bc-4013-b8e7-c14af0464bfc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.547 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[06fcab85-8f10-4152-aed2-d4b3a91e82bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 458119, 'reachable_time': 38955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223328, 'error': None, 'target': 'ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.550 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.550 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[d7c235b9-3f37-46f2-8fbb-0268bb7e9e48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.905 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:17 compute-1 nova_compute[182713]: 2026-01-22 00:00:17.906 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:17.907 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:00:17 compute-1 systemd[1]: run-netns-ovnmeta\x2d563d1a6c\x2de81b\x2d4fe1\x2db3b3\x2daddfa9be9ba4.mount: Deactivated successfully.
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.026 182717 DEBUG nova.compute.manager [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-unplugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.027 182717 DEBUG oslo_concurrency.lockutils [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.028 182717 DEBUG oslo_concurrency.lockutils [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.029 182717 DEBUG oslo_concurrency.lockutils [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.029 182717 DEBUG nova.compute.manager [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] No waiting events found dispatching network-vif-unplugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.030 182717 DEBUG nova.compute.manager [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-unplugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.030 182717 DEBUG nova.compute.manager [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-plugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.031 182717 DEBUG oslo_concurrency.lockutils [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.031 182717 DEBUG oslo_concurrency.lockutils [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.032 182717 DEBUG oslo_concurrency.lockutils [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.032 182717 DEBUG nova.compute.manager [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] No waiting events found dispatching network-vif-plugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.033 182717 WARNING nova.compute.manager [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received unexpected event network-vif-plugged-5aa7b353-4b8f-49f4-b051-3e3f1b1135af for instance with vm_state active and task_state deleting.
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.770 182717 DEBUG nova.compute.manager [req-c00cd5e3-d9c6-4b3b-b41e-8e519f7d08f4 req-56b707b1-eca1-44af-9429-83df1b99b41f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-deleted-21117406-f212-49b3-a849-2a3d9a544b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.770 182717 INFO nova.compute.manager [req-c00cd5e3-d9c6-4b3b-b41e-8e519f7d08f4 req-56b707b1-eca1-44af-9429-83df1b99b41f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Neutron deleted interface 21117406-f212-49b3-a849-2a3d9a544b64; detaching it from the instance and deleting it from the info cache
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.771 182717 DEBUG nova.network.neutron [req-c00cd5e3-d9c6-4b3b-b41e-8e519f7d08f4 req-56b707b1-eca1-44af-9429-83df1b99b41f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Updating instance_info_cache with network_info: [{"id": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "address": "fa:16:3e:ff:7a:ac", "network": {"id": "563d1a6c-e81b-4fe1-b3b3-addfa9be9ba4", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1652397250", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "975703700f9d42c5a1daa32f5e61f6f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa7b353-4b", "ovs_interfaceid": "5aa7b353-4b8f-49f4-b051-3e3f1b1135af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.794 182717 DEBUG nova.compute.manager [req-c00cd5e3-d9c6-4b3b-b41e-8e519f7d08f4 req-56b707b1-eca1-44af-9429-83df1b99b41f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Detach interface failed, port_id=21117406-f212-49b3-a849-2a3d9a544b64, reason: Instance 6888ddb7-c373-48a1-bc4c-7e1c44cece29 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.836 182717 DEBUG nova.network.neutron [-] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.852 182717 INFO nova.compute.manager [-] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Took 1.61 seconds to deallocate network for instance.
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.934 182717 DEBUG oslo_concurrency.lockutils [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:18 compute-1 nova_compute[182713]: 2026-01-22 00:00:18.935 182717 DEBUG oslo_concurrency.lockutils [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:19 compute-1 nova_compute[182713]: 2026-01-22 00:00:19.002 182717 DEBUG nova.compute.provider_tree [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:00:19 compute-1 nova_compute[182713]: 2026-01-22 00:00:19.020 182717 DEBUG nova.scheduler.client.report [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:00:19 compute-1 nova_compute[182713]: 2026-01-22 00:00:19.047 182717 DEBUG oslo_concurrency.lockutils [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:19 compute-1 nova_compute[182713]: 2026-01-22 00:00:19.087 182717 INFO nova.scheduler.client.report [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Deleted allocations for instance 6888ddb7-c373-48a1-bc4c-7e1c44cece29
Jan 22 00:00:19 compute-1 nova_compute[182713]: 2026-01-22 00:00:19.217 182717 DEBUG oslo_concurrency.lockutils [None req-0554717d-09f7-4e60-9c8d-15fafd190e36 1931e691804246e3bb3ac03a95a74d93 975703700f9d42c5a1daa32f5e61f6f2 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:19 compute-1 nova_compute[182713]: 2026-01-22 00:00:19.445 182717 DEBUG nova.compute.manager [req-4c9c9af4-7caa-4efc-998b-3d05246cdfbb req-2c70d467-431b-4ef1-b200-c6025fbf20af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-plugged-21117406-f212-49b3-a849-2a3d9a544b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:19 compute-1 nova_compute[182713]: 2026-01-22 00:00:19.445 182717 DEBUG oslo_concurrency.lockutils [req-4c9c9af4-7caa-4efc-998b-3d05246cdfbb req-2c70d467-431b-4ef1-b200-c6025fbf20af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:19 compute-1 nova_compute[182713]: 2026-01-22 00:00:19.446 182717 DEBUG oslo_concurrency.lockutils [req-4c9c9af4-7caa-4efc-998b-3d05246cdfbb req-2c70d467-431b-4ef1-b200-c6025fbf20af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:19 compute-1 nova_compute[182713]: 2026-01-22 00:00:19.446 182717 DEBUG oslo_concurrency.lockutils [req-4c9c9af4-7caa-4efc-998b-3d05246cdfbb req-2c70d467-431b-4ef1-b200-c6025fbf20af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6888ddb7-c373-48a1-bc4c-7e1c44cece29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:19 compute-1 nova_compute[182713]: 2026-01-22 00:00:19.446 182717 DEBUG nova.compute.manager [req-4c9c9af4-7caa-4efc-998b-3d05246cdfbb req-2c70d467-431b-4ef1-b200-c6025fbf20af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] No waiting events found dispatching network-vif-plugged-21117406-f212-49b3-a849-2a3d9a544b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:19 compute-1 nova_compute[182713]: 2026-01-22 00:00:19.446 182717 WARNING nova.compute.manager [req-4c9c9af4-7caa-4efc-998b-3d05246cdfbb req-2c70d467-431b-4ef1-b200-c6025fbf20af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received unexpected event network-vif-plugged-21117406-f212-49b3-a849-2a3d9a544b64 for instance with vm_state deleted and task_state None.
Jan 22 00:00:20 compute-1 nova_compute[182713]: 2026-01-22 00:00:20.968 182717 DEBUG nova.compute.manager [req-ce1ad6e4-049a-4880-856f-40100e2e5ae1 req-ff2b7d94-52f6-4b54-a191-e443c5dbc01a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Received event network-vif-deleted-5aa7b353-4b8f-49f4-b051-3e3f1b1135af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:21 compute-1 nova_compute[182713]: 2026-01-22 00:00:21.240 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:22 compute-1 nova_compute[182713]: 2026-01-22 00:00:22.153 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.872 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.872 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.872 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.872 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:00:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:22 compute-1 nova_compute[182713]: 2026-01-22 00:00:22.948 182717 DEBUG nova.compute.manager [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.062 182717 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.062 182717 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.088 182717 DEBUG nova.objects.instance [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_requests' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.109 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.109 182717 INFO nova.compute.claims [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.110 182717 DEBUG nova.objects.instance [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.125 182717 DEBUG nova.objects.instance [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.183 182717 INFO nova.compute.resource_tracker [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating resource usage from migration 26c72328-aea0-476d-a0df-60a56cb3907f
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.183 182717 DEBUG nova.compute.resource_tracker [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Starting to track incoming migration 26c72328-aea0-476d-a0df-60a56cb3907f with flavor ff01ccba-ad51-439f-9037-926190d6dc0f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.267 182717 DEBUG nova.compute.provider_tree [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.285 182717 DEBUG nova.scheduler.client.report [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.320 182717 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:23 compute-1 nova_compute[182713]: 2026-01-22 00:00:23.321 182717 INFO nova.compute.manager [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Migrating
Jan 22 00:00:26 compute-1 sshd-session[223329]: Accepted publickey for nova from 192.168.122.102 port 47822 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:00:26 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 00:00:26 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 00:00:26 compute-1 systemd-logind[796]: New session 40 of user nova.
Jan 22 00:00:26 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 00:00:26 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 22 00:00:26 compute-1 systemd[223333]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:00:26 compute-1 nova_compute[182713]: 2026-01-22 00:00:26.242 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:26 compute-1 systemd[223333]: Queued start job for default target Main User Target.
Jan 22 00:00:26 compute-1 systemd[223333]: Created slice User Application Slice.
Jan 22 00:00:26 compute-1 systemd[223333]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:00:26 compute-1 systemd[223333]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 00:00:26 compute-1 systemd[223333]: Reached target Paths.
Jan 22 00:00:26 compute-1 systemd[223333]: Reached target Timers.
Jan 22 00:00:26 compute-1 systemd[223333]: Starting D-Bus User Message Bus Socket...
Jan 22 00:00:26 compute-1 systemd[223333]: Starting Create User's Volatile Files and Directories...
Jan 22 00:00:26 compute-1 systemd[223333]: Finished Create User's Volatile Files and Directories.
Jan 22 00:00:26 compute-1 systemd[223333]: Listening on D-Bus User Message Bus Socket.
Jan 22 00:00:26 compute-1 systemd[223333]: Reached target Sockets.
Jan 22 00:00:26 compute-1 systemd[223333]: Reached target Basic System.
Jan 22 00:00:26 compute-1 systemd[223333]: Reached target Main User Target.
Jan 22 00:00:26 compute-1 systemd[223333]: Startup finished in 138ms.
Jan 22 00:00:26 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 22 00:00:26 compute-1 systemd[1]: Started Session 40 of User nova.
Jan 22 00:00:26 compute-1 sshd-session[223329]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:00:26 compute-1 sshd-session[223348]: Received disconnect from 192.168.122.102 port 47822:11: disconnected by user
Jan 22 00:00:26 compute-1 sshd-session[223348]: Disconnected from user nova 192.168.122.102 port 47822
Jan 22 00:00:26 compute-1 sshd-session[223329]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:00:26 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Jan 22 00:00:26 compute-1 systemd-logind[796]: Session 40 logged out. Waiting for processes to exit.
Jan 22 00:00:26 compute-1 systemd-logind[796]: Removed session 40.
Jan 22 00:00:26 compute-1 sshd-session[223350]: Accepted publickey for nova from 192.168.122.102 port 47828 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:00:26 compute-1 systemd-logind[796]: New session 42 of user nova.
Jan 22 00:00:26 compute-1 systemd[1]: Started Session 42 of User nova.
Jan 22 00:00:26 compute-1 sshd-session[223350]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:00:26 compute-1 sshd-session[223353]: Received disconnect from 192.168.122.102 port 47828:11: disconnected by user
Jan 22 00:00:26 compute-1 sshd-session[223353]: Disconnected from user nova 192.168.122.102 port 47828
Jan 22 00:00:26 compute-1 sshd-session[223350]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:00:26 compute-1 systemd[1]: session-42.scope: Deactivated successfully.
Jan 22 00:00:26 compute-1 systemd-logind[796]: Session 42 logged out. Waiting for processes to exit.
Jan 22 00:00:26 compute-1 systemd-logind[796]: Removed session 42.
Jan 22 00:00:26 compute-1 podman[223355]: 2026-01-22 00:00:26.806515655 +0000 UTC m=+0.097820908 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:00:27 compute-1 nova_compute[182713]: 2026-01-22 00:00:27.155 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:27 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:27.909 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:28 compute-1 nova_compute[182713]: 2026-01-22 00:00:28.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:28 compute-1 nova_compute[182713]: 2026-01-22 00:00:28.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:00:29 compute-1 nova_compute[182713]: 2026-01-22 00:00:29.247 182717 DEBUG nova.compute.manager [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:29 compute-1 nova_compute[182713]: 2026-01-22 00:00:29.247 182717 DEBUG oslo_concurrency.lockutils [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:29 compute-1 nova_compute[182713]: 2026-01-22 00:00:29.248 182717 DEBUG oslo_concurrency.lockutils [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:29 compute-1 nova_compute[182713]: 2026-01-22 00:00:29.249 182717 DEBUG oslo_concurrency.lockutils [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:29 compute-1 nova_compute[182713]: 2026-01-22 00:00:29.249 182717 DEBUG nova.compute.manager [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:29 compute-1 nova_compute[182713]: 2026-01-22 00:00:29.250 182717 WARNING nova.compute.manager [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state resize_migrating.
Jan 22 00:00:29 compute-1 podman[223375]: 2026-01-22 00:00:29.568189657 +0000 UTC m=+0.066540025 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350)
Jan 22 00:00:30 compute-1 sshd-session[223397]: Accepted publickey for nova from 192.168.122.102 port 47830 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:00:30 compute-1 systemd-logind[796]: New session 43 of user nova.
Jan 22 00:00:30 compute-1 systemd[1]: Started Session 43 of User nova.
Jan 22 00:00:30 compute-1 sshd-session[223397]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:00:30 compute-1 sshd-session[223400]: Received disconnect from 192.168.122.102 port 47830:11: disconnected by user
Jan 22 00:00:30 compute-1 sshd-session[223400]: Disconnected from user nova 192.168.122.102 port 47830
Jan 22 00:00:30 compute-1 sshd-session[223397]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:00:30 compute-1 systemd[1]: session-43.scope: Deactivated successfully.
Jan 22 00:00:30 compute-1 systemd-logind[796]: Session 43 logged out. Waiting for processes to exit.
Jan 22 00:00:30 compute-1 systemd-logind[796]: Removed session 43.
Jan 22 00:00:30 compute-1 sshd-session[223402]: Accepted publickey for nova from 192.168.122.102 port 47842 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:00:30 compute-1 systemd-logind[796]: New session 44 of user nova.
Jan 22 00:00:30 compute-1 systemd[1]: Started Session 44 of User nova.
Jan 22 00:00:30 compute-1 sshd-session[223402]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:00:30 compute-1 sshd-session[223405]: Received disconnect from 192.168.122.102 port 47842:11: disconnected by user
Jan 22 00:00:30 compute-1 sshd-session[223405]: Disconnected from user nova 192.168.122.102 port 47842
Jan 22 00:00:30 compute-1 sshd-session[223402]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:00:30 compute-1 systemd[1]: session-44.scope: Deactivated successfully.
Jan 22 00:00:30 compute-1 systemd-logind[796]: Session 44 logged out. Waiting for processes to exit.
Jan 22 00:00:30 compute-1 systemd-logind[796]: Removed session 44.
Jan 22 00:00:30 compute-1 nova_compute[182713]: 2026-01-22 00:00:30.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:30 compute-1 sshd-session[223407]: Accepted publickey for nova from 192.168.122.102 port 47852 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:00:30 compute-1 systemd-logind[796]: New session 45 of user nova.
Jan 22 00:00:30 compute-1 systemd[1]: Started Session 45 of User nova.
Jan 22 00:00:30 compute-1 sshd-session[223407]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:00:31 compute-1 nova_compute[182713]: 2026-01-22 00:00:31.406 182717 DEBUG nova.compute.manager [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:31 compute-1 nova_compute[182713]: 2026-01-22 00:00:31.406 182717 DEBUG oslo_concurrency.lockutils [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:31 compute-1 nova_compute[182713]: 2026-01-22 00:00:31.407 182717 DEBUG oslo_concurrency.lockutils [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:31 compute-1 nova_compute[182713]: 2026-01-22 00:00:31.407 182717 DEBUG oslo_concurrency.lockutils [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:31 compute-1 nova_compute[182713]: 2026-01-22 00:00:31.408 182717 DEBUG nova.compute.manager [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:31 compute-1 nova_compute[182713]: 2026-01-22 00:00:31.408 182717 WARNING nova.compute.manager [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state resize_migrating.
Jan 22 00:00:31 compute-1 nova_compute[182713]: 2026-01-22 00:00:31.693 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:31 compute-1 sshd-session[223410]: Received disconnect from 192.168.122.102 port 47852:11: disconnected by user
Jan 22 00:00:31 compute-1 sshd-session[223410]: Disconnected from user nova 192.168.122.102 port 47852
Jan 22 00:00:31 compute-1 sshd-session[223407]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:00:31 compute-1 systemd[1]: session-45.scope: Deactivated successfully.
Jan 22 00:00:31 compute-1 systemd-logind[796]: Session 45 logged out. Waiting for processes to exit.
Jan 22 00:00:31 compute-1 systemd-logind[796]: Removed session 45.
Jan 22 00:00:31 compute-1 nova_compute[182713]: 2026-01-22 00:00:31.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:31 compute-1 nova_compute[182713]: 2026-01-22 00:00:31.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:32 compute-1 nova_compute[182713]: 2026-01-22 00:00:32.069 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040017.0681992, 6888ddb7-c373-48a1-bc4c-7e1c44cece29 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:32 compute-1 nova_compute[182713]: 2026-01-22 00:00:32.070 182717 INFO nova.compute.manager [-] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] VM Stopped (Lifecycle Event)
Jan 22 00:00:32 compute-1 nova_compute[182713]: 2026-01-22 00:00:32.097 182717 DEBUG nova.compute.manager [None req-e151fa57-293b-4cd4-ac59-6a2c57251076 - - - - - -] [instance: 6888ddb7-c373-48a1-bc4c-7e1c44cece29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:32 compute-1 nova_compute[182713]: 2026-01-22 00:00:32.196 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:32 compute-1 nova_compute[182713]: 2026-01-22 00:00:32.436 182717 INFO nova.network.neutron [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 22 00:00:32 compute-1 nova_compute[182713]: 2026-01-22 00:00:32.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:32 compute-1 nova_compute[182713]: 2026-01-22 00:00:32.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:33 compute-1 nova_compute[182713]: 2026-01-22 00:00:33.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.645 182717 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.646 182717 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.646 182717 DEBUG nova.network.neutron [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.885 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.886 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.918 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.919 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.920 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.921 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.929 182717 DEBUG nova.compute.manager [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-changed-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.929 182717 DEBUG nova.compute.manager [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Refreshing instance network info cache due to event network-changed-d96fb6bb-9793-4373-8f62-3aa3f32af6a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:00:34 compute-1 nova_compute[182713]: 2026-01-22 00:00:34.929 182717 DEBUG oslo_concurrency.lockutils [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.139 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.141 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5669MB free_disk=73.27516174316406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.141 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.141 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.220 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Applying migration context for instance 9308be91-9a92-4389-939a-8b03d37474cf as it has an incoming, in-progress migration 26c72328-aea0-476d-a0df-60a56cb3907f. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.221 182717 INFO nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating resource usage from migration 26c72328-aea0-476d-a0df-60a56cb3907f
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.278 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 9308be91-9a92-4389-939a-8b03d37474cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.279 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.279 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.362 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.385 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.411 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:00:35 compute-1 nova_compute[182713]: 2026-01-22 00:00:35.412 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:36 compute-1 nova_compute[182713]: 2026-01-22 00:00:36.696 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:37 compute-1 nova_compute[182713]: 2026-01-22 00:00:37.198 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.426 182717 DEBUG nova.network.neutron [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.454 182717 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.460 182717 DEBUG oslo_concurrency.lockutils [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.461 182717 DEBUG nova.network.neutron [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Refreshing network info cache for port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.764 182717 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.767 182717 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.768 182717 INFO nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Creating image(s)
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.770 182717 DEBUG nova.objects.instance [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.808 182717 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.903 182717 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.904 182717 DEBUG nova.virt.disk.api [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Checking if we can resize image /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.905 182717 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.981 182717 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:38 compute-1 nova_compute[182713]: 2026-01-22 00:00:38.981 182717 DEBUG nova.virt.disk.api [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Cannot resize image /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.046 182717 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.047 182717 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Ensure instance console log exists: /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.047 182717 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.048 182717 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.048 182717 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.051 182717 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Start _get_guest_xml network_info=[{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c3:44:d7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.057 182717 WARNING nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.090 182717 DEBUG nova.virt.libvirt.host [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.091 182717 DEBUG nova.virt.libvirt.host [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.171 182717 DEBUG nova.virt.libvirt.host [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.172 182717 DEBUG nova.virt.libvirt.host [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.173 182717 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.174 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ff01ccba-ad51-439f-9037-926190d6dc0f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.174 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.174 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.175 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.175 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.175 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.176 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.176 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.176 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.176 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.177 182717 DEBUG nova.virt.hardware [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.177 182717 DEBUG nova.objects.instance [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.278 182717 DEBUG nova.virt.libvirt.vif [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:00:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c3:44:d7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.279 182717 DEBUG nova.network.os_vif_util [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c3:44:d7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.281 182717 DEBUG nova.network.os_vif_util [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.285 182717 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:00:39 compute-1 nova_compute[182713]:   <uuid>9308be91-9a92-4389-939a-8b03d37474cf</uuid>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   <name>instance-00000046</name>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   <memory>196608</memory>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerActionsTestJSON-server-396111842</nova:name>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:00:39</nova:creationTime>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <nova:flavor name="m1.micro">
Jan 22 00:00:39 compute-1 nova_compute[182713]:         <nova:memory>192</nova:memory>
Jan 22 00:00:39 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:00:39 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:00:39 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:00:39 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:00:39 compute-1 nova_compute[182713]:         <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 22 00:00:39 compute-1 nova_compute[182713]:         <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:00:39 compute-1 nova_compute[182713]:         <nova:port uuid="d96fb6bb-9793-4373-8f62-3aa3f32af6a5">
Jan 22 00:00:39 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <system>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <entry name="serial">9308be91-9a92-4389-939a-8b03d37474cf</entry>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <entry name="uuid">9308be91-9a92-4389-939a-8b03d37474cf</entry>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     </system>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   <os>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   </os>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   <features>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   </features>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:c3:44:d7"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <target dev="tapd96fb6bb-97"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/console.log" append="off"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <video>
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     </video>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:00:39 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:00:39 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:00:39 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:00:39 compute-1 nova_compute[182713]: </domain>
Jan 22 00:00:39 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.288 182717 DEBUG nova.virt.libvirt.vif [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:00:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c3:44:d7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.288 182717 DEBUG nova.network.os_vif_util [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c3:44:d7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.289 182717 DEBUG nova.network.os_vif_util [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.290 182717 DEBUG os_vif [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.290 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.291 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.291 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.295 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.295 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd96fb6bb-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.296 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd96fb6bb-97, col_values=(('external_ids', {'iface-id': 'd96fb6bb-9793-4373-8f62-3aa3f32af6a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:44:d7', 'vm-uuid': '9308be91-9a92-4389-939a-8b03d37474cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.298 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.299 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:00:39 compute-1 NetworkManager[54952]: <info>  [1769040039.3005] manager: (tapd96fb6bb-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/159)
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.309 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.311 182717 INFO os_vif [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97')
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.466 182717 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.467 182717 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.467 182717 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No VIF found with MAC fa:16:3e:c3:44:d7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.468 182717 INFO nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Using config drive
Jan 22 00:00:39 compute-1 kernel: tapd96fb6bb-97: entered promiscuous mode
Jan 22 00:00:39 compute-1 NetworkManager[54952]: <info>  [1769040039.5711] manager: (tapd96fb6bb-97): new Tun device (/org/freedesktop/NetworkManager/Devices/160)
Jan 22 00:00:39 compute-1 ovn_controller[94841]: 2026-01-22T00:00:39Z|00329|binding|INFO|Claiming lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for this chassis.
Jan 22 00:00:39 compute-1 ovn_controller[94841]: 2026-01-22T00:00:39Z|00330|binding|INFO|d96fb6bb-9793-4373-8f62-3aa3f32af6a5: Claiming fa:16:3e:c3:44:d7 10.100.0.7
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.583 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:39 compute-1 NetworkManager[54952]: <info>  [1769040039.5940] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Jan 22 00:00:39 compute-1 NetworkManager[54952]: <info>  [1769040039.5950] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.593 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:39 compute-1 systemd-udevd[223463]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.645 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:44:d7 10.100.0.7'], port_security=['fa:16:3e:c3:44:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '12', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=d96fb6bb-9793-4373-8f62-3aa3f32af6a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.647 104184 INFO neutron.agent.ovn.metadata.agent [-] Port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.650 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:00:39 compute-1 NetworkManager[54952]: <info>  [1769040039.6556] device (tapd96fb6bb-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:00:39 compute-1 NetworkManager[54952]: <info>  [1769040039.6567] device (tapd96fb6bb-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:00:39 compute-1 systemd-machined[153970]: New machine qemu-38-instance-00000046.
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.668 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8941369d-1b55-4364-ae77-8e499239c88b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.669 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.671 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.672 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[547de6df-99b6-470a-98c8-f409d1118322]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.673 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[758b81da-aa97-4bc1-b9ef-0227b992ddee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.689 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[a25b12b5-e0b3-4728-a8ef-878f5702fc5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 podman[223429]: 2026-01-22 00:00:39.69169332 +0000 UTC m=+0.123612342 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.712 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:39 compute-1 systemd[1]: Started Virtual Machine qemu-38-instance-00000046.
Jan 22 00:00:39 compute-1 podman[223428]: 2026-01-22 00:00:39.717378183 +0000 UTC m=+0.158399536 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.720 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea5642d-fe52-4848-962a-586a3d4d5809]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.728 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:39 compute-1 ovn_controller[94841]: 2026-01-22T00:00:39Z|00331|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 ovn-installed in OVS
Jan 22 00:00:39 compute-1 ovn_controller[94841]: 2026-01-22T00:00:39Z|00332|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 up in Southbound
Jan 22 00:00:39 compute-1 nova_compute[182713]: 2026-01-22 00:00:39.738 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.754 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e61c3674-724d-4b4b-a063-03b9ac771d9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 NetworkManager[54952]: <info>  [1769040039.7599] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/163)
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.759 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[edb1e70c-da2a-41bf-8732-80503ff1b7cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.789 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[369c26b4-02b5-4c6d-8c8e-c6fb5741d744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.792 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7c576066-0e37-43bc-893d-1e18f52dbe5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 NetworkManager[54952]: <info>  [1769040039.8116] device (tap19c3e0c8-50): carrier: link connected
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.815 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[54eec12f-5a0d-45a7-8e11-ca4567799a8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.830 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0a9c0eb1-a651-4506-9ce6-bfd73de117bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461246, 'reachable_time': 19586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223515, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.852 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[241bdf76-c96c-4ddf-a13f-551c96ba25c1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461246, 'tstamp': 461246}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223516, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.868 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[37891200-6b34-48d0-8886-8ee61ae9f408]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461246, 'reachable_time': 19586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223517, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.904 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3c7ff904-9d8c-4e3a-87ad-5cfb743a89c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.958 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe016e6-977b-4f5a-86d1-d7a096063b34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.959 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.960 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:00:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:39.960 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:40 compute-1 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 22 00:00:40 compute-1 NetworkManager[54952]: <info>  [1769040040.0097] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.009 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.012 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:40.013 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:40 compute-1 ovn_controller[94841]: 2026-01-22T00:00:40Z|00333|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.015 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:40.015 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:40.016 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6b321a5f-adec-44e0-bee9-262cfa287114]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:40.017 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:00:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:40.017 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.025 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.107 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040040.1066751, 9308be91-9a92-4389-939a-8b03d37474cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.108 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Resumed (Lifecycle Event)
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.112 182717 DEBUG nova.compute.manager [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.116 182717 INFO nova.virt.libvirt.driver [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance running successfully.
Jan 22 00:00:40 compute-1 virtqemud[182235]: argument unsupported: QEMU guest agent is not configured
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.119 182717 DEBUG nova.virt.libvirt.guest [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.120 182717 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.144 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.148 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.221 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.221 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040040.1083999, 9308be91-9a92-4389-939a-8b03d37474cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.222 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Started (Lifecycle Event)
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.270 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.273 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:00:40 compute-1 nova_compute[182713]: 2026-01-22 00:00:40.416 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 00:00:40 compute-1 podman[223556]: 2026-01-22 00:00:40.421284722 +0000 UTC m=+0.068147673 container create 98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:00:40 compute-1 podman[223556]: 2026-01-22 00:00:40.38554819 +0000 UTC m=+0.032411151 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:00:40 compute-1 systemd[1]: Started libpod-conmon-98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74.scope.
Jan 22 00:00:40 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:00:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8029b8bfca6fa4c1a36bd48c60cf9846148d4046fb185f20da748c9b0d9c9fe3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:00:40 compute-1 podman[223556]: 2026-01-22 00:00:40.526580979 +0000 UTC m=+0.173443980 container init 98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:00:40 compute-1 podman[223556]: 2026-01-22 00:00:40.537294179 +0000 UTC m=+0.184157130 container start 98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 00:00:40 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223570]: [NOTICE]   (223574) : New worker (223576) forked
Jan 22 00:00:40 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223570]: [NOTICE]   (223574) : Loading success.
Jan 22 00:00:41 compute-1 nova_compute[182713]: 2026-01-22 00:00:41.698 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:41 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 00:00:41 compute-1 systemd[223333]: Activating special unit Exit the Session...
Jan 22 00:00:41 compute-1 systemd[223333]: Stopped target Main User Target.
Jan 22 00:00:41 compute-1 systemd[223333]: Stopped target Basic System.
Jan 22 00:00:41 compute-1 systemd[223333]: Stopped target Paths.
Jan 22 00:00:41 compute-1 systemd[223333]: Stopped target Sockets.
Jan 22 00:00:41 compute-1 systemd[223333]: Stopped target Timers.
Jan 22 00:00:41 compute-1 systemd[223333]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:00:41 compute-1 systemd[223333]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 00:00:41 compute-1 systemd[223333]: Closed D-Bus User Message Bus Socket.
Jan 22 00:00:41 compute-1 systemd[223333]: Stopped Create User's Volatile Files and Directories.
Jan 22 00:00:41 compute-1 systemd[223333]: Removed slice User Application Slice.
Jan 22 00:00:41 compute-1 systemd[223333]: Reached target Shutdown.
Jan 22 00:00:41 compute-1 systemd[223333]: Finished Exit the Session.
Jan 22 00:00:41 compute-1 systemd[223333]: Reached target Exit the Session.
Jan 22 00:00:41 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 00:00:41 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 00:00:41 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 00:00:41 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 00:00:41 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 00:00:41 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 00:00:41 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 00:00:42 compute-1 nova_compute[182713]: 2026-01-22 00:00:42.154 182717 DEBUG nova.compute.manager [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:42 compute-1 nova_compute[182713]: 2026-01-22 00:00:42.155 182717 DEBUG oslo_concurrency.lockutils [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:42 compute-1 nova_compute[182713]: 2026-01-22 00:00:42.155 182717 DEBUG oslo_concurrency.lockutils [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:42 compute-1 nova_compute[182713]: 2026-01-22 00:00:42.156 182717 DEBUG oslo_concurrency.lockutils [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:42 compute-1 nova_compute[182713]: 2026-01-22 00:00:42.156 182717 DEBUG nova.compute.manager [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:42 compute-1 nova_compute[182713]: 2026-01-22 00:00:42.157 182717 WARNING nova.compute.manager [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state resized and task_state None.
Jan 22 00:00:44 compute-1 nova_compute[182713]: 2026-01-22 00:00:44.297 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:44 compute-1 nova_compute[182713]: 2026-01-22 00:00:44.578 182717 DEBUG nova.compute.manager [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:44 compute-1 nova_compute[182713]: 2026-01-22 00:00:44.578 182717 DEBUG oslo_concurrency.lockutils [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:44 compute-1 nova_compute[182713]: 2026-01-22 00:00:44.579 182717 DEBUG oslo_concurrency.lockutils [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:44 compute-1 nova_compute[182713]: 2026-01-22 00:00:44.579 182717 DEBUG oslo_concurrency.lockutils [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:44 compute-1 nova_compute[182713]: 2026-01-22 00:00:44.579 182717 DEBUG nova.compute.manager [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:44 compute-1 nova_compute[182713]: 2026-01-22 00:00:44.580 182717 WARNING nova.compute.manager [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state resized and task_state None.
Jan 22 00:00:46 compute-1 nova_compute[182713]: 2026-01-22 00:00:46.466 182717 DEBUG nova.network.neutron [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updated VIF entry in instance network info cache for port d96fb6bb-9793-4373-8f62-3aa3f32af6a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:00:46 compute-1 nova_compute[182713]: 2026-01-22 00:00:46.467 182717 DEBUG nova.network.neutron [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:00:46 compute-1 nova_compute[182713]: 2026-01-22 00:00:46.498 182717 DEBUG oslo_concurrency.lockutils [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:00:46 compute-1 podman[223587]: 2026-01-22 00:00:46.614613808 +0000 UTC m=+0.100700137 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:00:46 compute-1 podman[223588]: 2026-01-22 00:00:46.622831801 +0000 UTC m=+0.107780115 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:00:46 compute-1 nova_compute[182713]: 2026-01-22 00:00:46.700 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:48 compute-1 nova_compute[182713]: 2026-01-22 00:00:48.822 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:49 compute-1 nova_compute[182713]: 2026-01-22 00:00:49.299 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:51 compute-1 nova_compute[182713]: 2026-01-22 00:00:51.746 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:52 compute-1 nova_compute[182713]: 2026-01-22 00:00:52.711 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:53 compute-1 ovn_controller[94841]: 2026-01-22T00:00:53Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c3:44:d7 10.100.0.7
Jan 22 00:00:54 compute-1 nova_compute[182713]: 2026-01-22 00:00:54.301 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.630 182717 DEBUG oslo_concurrency.lockutils [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.631 182717 DEBUG oslo_concurrency.lockutils [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.631 182717 DEBUG oslo_concurrency.lockutils [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.632 182717 DEBUG oslo_concurrency.lockutils [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.632 182717 DEBUG oslo_concurrency.lockutils [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.651 182717 INFO nova.compute.manager [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Terminating instance
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.669 182717 DEBUG nova.compute.manager [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:00:56 compute-1 kernel: tapd96fb6bb-97 (unregistering): left promiscuous mode
Jan 22 00:00:56 compute-1 NetworkManager[54952]: <info>  [1769040056.7037] device (tapd96fb6bb-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:00:56 compute-1 ovn_controller[94841]: 2026-01-22T00:00:56Z|00334|binding|INFO|Releasing lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 from this chassis (sb_readonly=0)
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.717 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:56 compute-1 ovn_controller[94841]: 2026-01-22T00:00:56Z|00335|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 down in Southbound
Jan 22 00:00:56 compute-1 ovn_controller[94841]: 2026-01-22T00:00:56Z|00336|binding|INFO|Removing iface tapd96fb6bb-97 ovn-installed in OVS
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.723 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:56.748 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:44:d7 10.100.0.7'], port_security=['fa:16:3e:c3:44:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '14', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=d96fb6bb-9793-4373-8f62-3aa3f32af6a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:56.751 104184 INFO neutron.agent.ovn.metadata.agent [-] Port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.752 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:56.755 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:00:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:56.758 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[83b74eca-bcb1-4eaf-803a-110bed6de2f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:56.759 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore
Jan 22 00:00:56 compute-1 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000046.scope: Deactivated successfully.
Jan 22 00:00:56 compute-1 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000046.scope: Consumed 12.669s CPU time.
Jan 22 00:00:56 compute-1 systemd-machined[153970]: Machine qemu-38-instance-00000046 terminated.
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.900 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.909 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.957 182717 INFO nova.virt.libvirt.driver [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance destroyed successfully.
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.958 182717 DEBUG nova.objects.instance [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.974 182717 DEBUG nova.virt.libvirt.vif [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:00:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:00:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.974 182717 DEBUG nova.network.os_vif_util [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.975 182717 DEBUG nova.network.os_vif_util [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.975 182717 DEBUG os_vif [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:00:56 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223570]: [NOTICE]   (223574) : haproxy version is 2.8.14-c23fe91
Jan 22 00:00:56 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223570]: [NOTICE]   (223574) : path to executable is /usr/sbin/haproxy
Jan 22 00:00:56 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223570]: [WARNING]  (223574) : Exiting Master process...
Jan 22 00:00:56 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223570]: [ALERT]    (223574) : Current worker (223576) exited with code 143 (Terminated)
Jan 22 00:00:56 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223570]: [WARNING]  (223574) : All workers exited. Exiting... (0)
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.978 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.978 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd96fb6bb-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:56 compute-1 systemd[1]: libpod-98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74.scope: Deactivated successfully.
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.979 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.981 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:56 compute-1 podman[223659]: 2026-01-22 00:00:56.987555423 +0000 UTC m=+0.092380810 container died 98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.987 182717 INFO os_vif [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97')
Jan 22 00:00:56 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.989 182717 INFO nova.virt.libvirt.driver [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Deleting instance files /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf_del
Jan 22 00:00:57 compute-1 nova_compute[182713]: 2026-01-22 00:00:56.998 182717 INFO nova.virt.libvirt.driver [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Deletion of /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf_del complete
Jan 22 00:00:57 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74-userdata-shm.mount: Deactivated successfully.
Jan 22 00:00:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-8029b8bfca6fa4c1a36bd48c60cf9846148d4046fb185f20da748c9b0d9c9fe3-merged.mount: Deactivated successfully.
Jan 22 00:00:57 compute-1 podman[223659]: 2026-01-22 00:00:57.030114926 +0000 UTC m=+0.134940273 container cleanup 98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:00:57 compute-1 systemd[1]: libpod-conmon-98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74.scope: Deactivated successfully.
Jan 22 00:00:57 compute-1 podman[223688]: 2026-01-22 00:00:57.069820531 +0000 UTC m=+0.075785579 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:00:57 compute-1 podman[223717]: 2026-01-22 00:00:57.113979272 +0000 UTC m=+0.053430329 container remove 98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 00:00:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:57.120 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ec66044c-b4f3-4f83-a927-6ead1da780f5]: (4, ('Thu Jan 22 12:00:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74)\n98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74\nThu Jan 22 12:00:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74)\n98ab25a95e91217b9355ec0085d3a89fa819135b6f623ae296a32e9ec80fee74\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:57.123 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb048c4-61ba-4d09-a980-1490376ac085]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:57.124 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:57 compute-1 nova_compute[182713]: 2026-01-22 00:00:57.127 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:57 compute-1 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 22 00:00:57 compute-1 nova_compute[182713]: 2026-01-22 00:00:57.130 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:57 compute-1 nova_compute[182713]: 2026-01-22 00:00:57.135 182717 INFO nova.compute.manager [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Took 0.46 seconds to destroy the instance on the hypervisor.
Jan 22 00:00:57 compute-1 nova_compute[182713]: 2026-01-22 00:00:57.135 182717 DEBUG oslo.service.loopingcall [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:00:57 compute-1 nova_compute[182713]: 2026-01-22 00:00:57.136 182717 DEBUG nova.compute.manager [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:00:57 compute-1 nova_compute[182713]: 2026-01-22 00:00:57.136 182717 DEBUG nova.network.neutron [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:00:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:57.135 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[064b4a88-b566-4bd9-8300-e3a4dda85197]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:57 compute-1 nova_compute[182713]: 2026-01-22 00:00:57.148 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:57.153 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[34552d46-d4fa-44e3-9c86-e5c8a90b4590]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:57.155 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[17f11cbc-611d-4e74-a8c7-97ab585199d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:57.176 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0c8c907d-c0ce-4826-9404-54f83ce8b7aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461240, 'reachable_time': 20546, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223734, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:57.180 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:00:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:00:57.180 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[5c678264-45d8-4fa0-a91b-3ba6e3dc8677]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:57 compute-1 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 22 00:00:58 compute-1 nova_compute[182713]: 2026-01-22 00:00:58.942 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:00 compute-1 nova_compute[182713]: 2026-01-22 00:01:00.017 182717 DEBUG nova.network.neutron [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:00 compute-1 nova_compute[182713]: 2026-01-22 00:01:00.078 182717 INFO nova.compute.manager [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Took 2.94 seconds to deallocate network for instance.
Jan 22 00:01:00 compute-1 nova_compute[182713]: 2026-01-22 00:01:00.322 182717 DEBUG oslo_concurrency.lockutils [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:00 compute-1 nova_compute[182713]: 2026-01-22 00:01:00.323 182717 DEBUG oslo_concurrency.lockutils [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:00 compute-1 nova_compute[182713]: 2026-01-22 00:01:00.388 182717 DEBUG nova.compute.provider_tree [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:01:00 compute-1 nova_compute[182713]: 2026-01-22 00:01:00.405 182717 DEBUG nova.scheduler.client.report [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:01:00 compute-1 nova_compute[182713]: 2026-01-22 00:01:00.415 182717 DEBUG nova.compute.manager [req-05b94bb5-245a-488f-943f-10f5acb0a286 req-f0517da1-7c44-4ba7-9f82-2e71e809dc3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-deleted-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:00 compute-1 nova_compute[182713]: 2026-01-22 00:01:00.482 182717 DEBUG oslo_concurrency.lockutils [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:00 compute-1 nova_compute[182713]: 2026-01-22 00:01:00.540 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:00 compute-1 nova_compute[182713]: 2026-01-22 00:01:00.548 182717 INFO nova.scheduler.client.report [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Deleted allocations for instance 9308be91-9a92-4389-939a-8b03d37474cf
Jan 22 00:01:00 compute-1 podman[223735]: 2026-01-22 00:01:00.600296673 +0000 UTC m=+0.091480272 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 00:01:00 compute-1 nova_compute[182713]: 2026-01-22 00:01:00.682 182717 DEBUG oslo_concurrency.lockutils [None req-c3c5d425-5a3d-4283-9dcc-3cd66c6d8905 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:01 compute-1 CROND[223759]: (root) CMD (run-parts /etc/cron.hourly)
Jan 22 00:01:01 compute-1 run-parts[223762]: (/etc/cron.hourly) starting 0anacron
Jan 22 00:01:01 compute-1 anacron[223770]: Anacron started on 2026-01-22
Jan 22 00:01:01 compute-1 anacron[223770]: Job `cron.monthly' locked by another anacron - skipping
Jan 22 00:01:01 compute-1 anacron[223770]: Normal exit (0 jobs run)
Jan 22 00:01:01 compute-1 run-parts[223772]: (/etc/cron.hourly) finished 0anacron
Jan 22 00:01:01 compute-1 CROND[223758]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 22 00:01:01 compute-1 nova_compute[182713]: 2026-01-22 00:01:01.760 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:01 compute-1 nova_compute[182713]: 2026-01-22 00:01:01.980 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:03.008 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:03.009 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:03.009 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:05 compute-1 nova_compute[182713]: 2026-01-22 00:01:05.441 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:06 compute-1 nova_compute[182713]: 2026-01-22 00:01:06.763 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:06 compute-1 nova_compute[182713]: 2026-01-22 00:01:06.982 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.247 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.248 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.297 182717 DEBUG nova.compute.manager [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.348 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.349 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.380 182717 DEBUG nova.compute.manager [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.440 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.441 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.450 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.451 182717 INFO nova.compute.claims [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.665 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.759 182717 DEBUG nova.compute.provider_tree [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.791 182717 DEBUG nova.scheduler.client.report [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.824 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.826 182717 DEBUG nova.compute.manager [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.829 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.842 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.843 182717 INFO nova.compute.claims [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.965 182717 DEBUG nova.compute.manager [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.965 182717 DEBUG nova.network.neutron [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:01:07 compute-1 nova_compute[182713]: 2026-01-22 00:01:07.997 182717 INFO nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.021 182717 DEBUG nova.compute.manager [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.136 182717 DEBUG nova.compute.provider_tree [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.176 182717 DEBUG nova.scheduler.client.report [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.198 182717 DEBUG nova.compute.manager [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.200 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.201 182717 INFO nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Creating image(s)
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.201 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.202 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.203 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.228 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.229 182717 DEBUG nova.compute.manager [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.234 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.322 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.324 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.325 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.349 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.388 182717 DEBUG nova.compute.manager [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.389 182717 DEBUG nova.network.neutron [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.415 182717 DEBUG nova.policy [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.434 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.435 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.473 182717 INFO nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.505 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk 1073741824" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.506 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.507 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.538 182717 DEBUG nova.compute.manager [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.600 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.601 182717 DEBUG nova.virt.disk.api [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Checking if we can resize image /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.601 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.673 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.675 182717 DEBUG nova.virt.disk.api [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Cannot resize image /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.676 182717 DEBUG nova.objects.instance [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.721 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.721 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Ensure instance console log exists: /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.722 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.723 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.723 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.736 182717 DEBUG nova.compute.manager [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.738 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.739 182717 INFO nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Creating image(s)
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.740 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.740 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.742 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.766 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.862 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.864 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.865 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.891 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.963 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.965 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:08 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.999 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:08.999 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.000 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.089 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.090 182717 DEBUG nova.virt.disk.api [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Checking if we can resize image /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.090 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.181 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.183 182717 DEBUG nova.virt.disk.api [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Cannot resize image /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.184 182717 DEBUG nova.objects.instance [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'migration_context' on Instance uuid 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.217 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.218 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Ensure instance console log exists: /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.218 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.218 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.219 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:09 compute-1 nova_compute[182713]: 2026-01-22 00:01:09.748 182717 DEBUG nova.policy [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:01:10 compute-1 podman[223806]: 2026-01-22 00:01:10.615291331 +0000 UTC m=+0.095563869 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:01:10 compute-1 podman[223805]: 2026-01-22 00:01:10.667300034 +0000 UTC m=+0.147574642 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 00:01:11 compute-1 nova_compute[182713]: 2026-01-22 00:01:11.764 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:11 compute-1 nova_compute[182713]: 2026-01-22 00:01:11.954 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040056.9531507, 9308be91-9a92-4389-939a-8b03d37474cf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:01:11 compute-1 nova_compute[182713]: 2026-01-22 00:01:11.955 182717 INFO nova.compute.manager [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Stopped (Lifecycle Event)
Jan 22 00:01:11 compute-1 nova_compute[182713]: 2026-01-22 00:01:11.984 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:12 compute-1 nova_compute[182713]: 2026-01-22 00:01:12.068 182717 DEBUG nova.compute.manager [None req-ee0f44fd-3b48-4dea-8082-909bdb86437a - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:01:12 compute-1 nova_compute[182713]: 2026-01-22 00:01:12.321 182717 DEBUG nova.network.neutron [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Successfully created port: 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:01:13 compute-1 nova_compute[182713]: 2026-01-22 00:01:13.520 182717 DEBUG nova.network.neutron [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Successfully created port: 5965ccd1-7d75-4079-ade6-e1859a860162 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:01:16 compute-1 nova_compute[182713]: 2026-01-22 00:01:16.766 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:16 compute-1 nova_compute[182713]: 2026-01-22 00:01:16.987 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.409 182717 DEBUG nova.network.neutron [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Successfully updated port: 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.435 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.436 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquired lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.436 182717 DEBUG nova.network.neutron [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.507 182717 DEBUG nova.network.neutron [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Successfully updated port: 5965ccd1-7d75-4079-ade6-e1859a860162 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.537 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.537 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.537 182717 DEBUG nova.network.neutron [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:01:17 compute-1 podman[223853]: 2026-01-22 00:01:17.583776501 +0000 UTC m=+0.073469956 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:01:17 compute-1 podman[223854]: 2026-01-22 00:01:17.595942017 +0000 UTC m=+0.081898168 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.941 182717 DEBUG nova.compute.manager [req-318dd4c7-154d-4516-a3e6-f80e1040390d req-73b00727-4f70-4d45-9983-0cb131077981 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.942 182717 DEBUG nova.compute.manager [req-318dd4c7-154d-4516-a3e6-f80e1040390d req-73b00727-4f70-4d45-9983-0cb131077981 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing instance network info cache due to event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.942 182717 DEBUG oslo_concurrency.lockutils [req-318dd4c7-154d-4516-a3e6-f80e1040390d req-73b00727-4f70-4d45-9983-0cb131077981 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.972 182717 DEBUG nova.compute.manager [req-661e2e4f-b091-4a41-9296-d842ed2b63c3 req-9ab1f846-b31a-4321-a36d-4a7304cdc04a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-changed-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.972 182717 DEBUG nova.compute.manager [req-661e2e4f-b091-4a41-9296-d842ed2b63c3 req-9ab1f846-b31a-4321-a36d-4a7304cdc04a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Refreshing instance network info cache due to event network-changed-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:01:17 compute-1 nova_compute[182713]: 2026-01-22 00:01:17.973 182717 DEBUG oslo_concurrency.lockutils [req-661e2e4f-b091-4a41-9296-d842ed2b63c3 req-9ab1f846-b31a-4321-a36d-4a7304cdc04a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:18 compute-1 nova_compute[182713]: 2026-01-22 00:01:18.703 182717 DEBUG nova.network.neutron [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:01:18 compute-1 nova_compute[182713]: 2026-01-22 00:01:18.708 182717 DEBUG nova.network.neutron [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:01:19 compute-1 nova_compute[182713]: 2026-01-22 00:01:19.151 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:19 compute-1 nova_compute[182713]: 2026-01-22 00:01:19.372 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.907 182717 DEBUG nova.network.neutron [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.943 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.944 182717 DEBUG nova.compute.manager [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance network_info: |[{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.944 182717 DEBUG oslo_concurrency.lockutils [req-318dd4c7-154d-4516-a3e6-f80e1040390d req-73b00727-4f70-4d45-9983-0cb131077981 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.945 182717 DEBUG nova.network.neutron [req-318dd4c7-154d-4516-a3e6-f80e1040390d req-73b00727-4f70-4d45-9983-0cb131077981 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.948 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Start _get_guest_xml network_info=[{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.954 182717 WARNING nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.974 182717 DEBUG nova.virt.libvirt.host [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.975 182717 DEBUG nova.virt.libvirt.host [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.983 182717 DEBUG nova.virt.libvirt.host [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.984 182717 DEBUG nova.virt.libvirt.host [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.987 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.987 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.988 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.989 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.989 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.989 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.990 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.990 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.991 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.991 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.992 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.992 182717 DEBUG nova.virt.hardware [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:01:20 compute-1 nova_compute[182713]: 2026-01-22 00:01:20.999 182717 DEBUG nova.virt.libvirt.vif [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-803720403',display_name='tempest-ServerActionsTestJSON-server-803720403',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-803720403',id=84,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-zph0mh3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:01:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.000 182717 DEBUG nova.network.os_vif_util [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.001 182717 DEBUG nova.network.os_vif_util [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.003 182717 DEBUG nova.objects.instance [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.023 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <uuid>8009ab5e-1bf8-4d17-8ff2-b62b25eeff20</uuid>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <name>instance-00000054</name>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerActionsTestJSON-server-803720403</nova:name>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:01:20</nova:creationTime>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:port uuid="5965ccd1-7d75-4079-ade6-e1859a860162">
Jan 22 00:01:21 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <system>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="serial">8009ab5e-1bf8-4d17-8ff2-b62b25eeff20</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="uuid">8009ab5e-1bf8-4d17-8ff2-b62b25eeff20</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </system>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <os>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </os>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <features>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </features>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:c9:be:ae"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <target dev="tap5965ccd1-7d"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/console.log" append="off"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <video>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </video>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:01:21 compute-1 nova_compute[182713]: </domain>
Jan 22 00:01:21 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.025 182717 DEBUG nova.compute.manager [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Preparing to wait for external event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.026 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.026 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.026 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.027 182717 DEBUG nova.virt.libvirt.vif [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-803720403',display_name='tempest-ServerActionsTestJSON-server-803720403',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-803720403',id=84,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-zph0mh3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:01:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.028 182717 DEBUG nova.network.os_vif_util [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.028 182717 DEBUG nova.network.os_vif_util [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.029 182717 DEBUG os_vif [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.029 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.030 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.030 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.041 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.042 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5965ccd1-7d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.042 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5965ccd1-7d, col_values=(('external_ids', {'iface-id': '5965ccd1-7d75-4079-ade6-e1859a860162', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:be:ae', 'vm-uuid': '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.044 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-1 NetworkManager[54952]: <info>  [1769040081.0450] manager: (tap5965ccd1-7d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.047 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.056 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.057 182717 INFO os_vif [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d')
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.171 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.172 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.172 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No VIF found with MAC fa:16:3e:c9:be:ae, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.173 182717 INFO nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Using config drive
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.548 182717 DEBUG nova.network.neutron [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updating instance_info_cache with network_info: [{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.599 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Releasing lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.600 182717 DEBUG nova.compute.manager [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance network_info: |[{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.600 182717 DEBUG oslo_concurrency.lockutils [req-661e2e4f-b091-4a41-9296-d842ed2b63c3 req-9ab1f846-b31a-4321-a36d-4a7304cdc04a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.601 182717 DEBUG nova.network.neutron [req-661e2e4f-b091-4a41-9296-d842ed2b63c3 req-9ab1f846-b31a-4321-a36d-4a7304cdc04a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Refreshing network info cache for port 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.606 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Start _get_guest_xml network_info=[{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.613 182717 WARNING nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.619 182717 DEBUG nova.virt.libvirt.host [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.620 182717 DEBUG nova.virt.libvirt.host [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.626 182717 DEBUG nova.virt.libvirt.host [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.627 182717 DEBUG nova.virt.libvirt.host [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.628 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.629 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.630 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.630 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.630 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.631 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.631 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.632 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.632 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.633 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.633 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.633 182717 DEBUG nova.virt.hardware [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.639 182717 DEBUG nova.virt.libvirt.vif [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1381246704',display_name='tempest-ServersNegativeTestJSON-server-1381246704',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1381246704',id=83,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a7e425a4d1854533a17d5f0dcd9d87b9',ramdisk_id='',reservation_id='r-37fe06tc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1689661',owner_user_name='tempest-ServersNegativeTestJSON-1689661-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:01:08Z,user_data=None,user_id='531ec5a088a94b78af6e2c3feda17c0c',uuid=2cb6e3d6-f22a-49ea-aab8-900dd88605e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.639 182717 DEBUG nova.network.os_vif_util [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converting VIF {"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.641 182717 DEBUG nova.network.os_vif_util [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.642 182717 DEBUG nova.objects.instance [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.662 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <uuid>2cb6e3d6-f22a-49ea-aab8-900dd88605e9</uuid>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <name>instance-00000053</name>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:name>tempest-ServersNegativeTestJSON-server-1381246704</nova:name>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:01:21</nova:creationTime>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:user uuid="531ec5a088a94b78af6e2c3feda17c0c">tempest-ServersNegativeTestJSON-1689661-project-member</nova:user>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:project uuid="a7e425a4d1854533a17d5f0dcd9d87b9">tempest-ServersNegativeTestJSON-1689661</nova:project>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         <nova:port uuid="8412a083-ca97-4457-bb0e-9c7bcd8bfb2f">
Jan 22 00:01:21 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <system>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="serial">2cb6e3d6-f22a-49ea-aab8-900dd88605e9</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="uuid">2cb6e3d6-f22a-49ea-aab8-900dd88605e9</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </system>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <os>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </os>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <features>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </features>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.config"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:e0:ee:91"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <target dev="tap8412a083-ca"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/console.log" append="off"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <video>
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </video>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:01:21 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:01:21 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:01:21 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:01:21 compute-1 nova_compute[182713]: </domain>
Jan 22 00:01:21 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.664 182717 DEBUG nova.compute.manager [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Preparing to wait for external event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.664 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.664 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.665 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.666 182717 DEBUG nova.virt.libvirt.vif [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1381246704',display_name='tempest-ServersNegativeTestJSON-server-1381246704',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1381246704',id=83,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a7e425a4d1854533a17d5f0dcd9d87b9',ramdisk_id='',reservation_id='r-37fe06tc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1689661',owner_user_name='tempest-ServersNegativeTestJSON-1689661-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:01:08Z,user_data=None,user_id='531ec5a088a94b78af6e2c3feda17c0c',uuid=2cb6e3d6-f22a-49ea-aab8-900dd88605e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.666 182717 DEBUG nova.network.os_vif_util [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converting VIF {"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.667 182717 DEBUG nova.network.os_vif_util [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.668 182717 DEBUG os_vif [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.668 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.669 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.669 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.673 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.673 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8412a083-ca, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.674 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8412a083-ca, col_values=(('external_ids', {'iface-id': '8412a083-ca97-4457-bb0e-9c7bcd8bfb2f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:ee:91', 'vm-uuid': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:21 compute-1 NetworkManager[54952]: <info>  [1769040081.7341] manager: (tap8412a083-ca): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.736 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.739 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.742 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.743 182717 INFO os_vif [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca')
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.767 182717 INFO nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Creating config drive at /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.775 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp59prjitd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.793 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.835 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.835 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.835 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] No VIF found with MAC fa:16:3e:e0:ee:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.836 182717 INFO nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Using config drive
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.899 182717 DEBUG oslo_concurrency.processutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp59prjitd" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:21 compute-1 NetworkManager[54952]: <info>  [1769040081.9680] manager: (tap5965ccd1-7d): new Tun device (/org/freedesktop/NetworkManager/Devices/167)
Jan 22 00:01:21 compute-1 kernel: tap5965ccd1-7d: entered promiscuous mode
Jan 22 00:01:21 compute-1 ovn_controller[94841]: 2026-01-22T00:01:21Z|00337|binding|INFO|Claiming lport 5965ccd1-7d75-4079-ade6-e1859a860162 for this chassis.
Jan 22 00:01:21 compute-1 ovn_controller[94841]: 2026-01-22T00:01:21Z|00338|binding|INFO|5965ccd1-7d75-4079-ade6-e1859a860162: Claiming fa:16:3e:c9:be:ae 10.100.0.4
Jan 22 00:01:21 compute-1 nova_compute[182713]: 2026-01-22 00:01:21.973 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:21.999 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:be:ae 10.100.0.4'], port_security=['fa:16:3e:c9:be:ae 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=5965ccd1-7d75-4079-ade6-e1859a860162) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.001 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 5965ccd1-7d75-4079-ade6-e1859a860162 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.004 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:01:22 compute-1 systemd-machined[153970]: New machine qemu-39-instance-00000054.
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.021 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e47b7d-d25c-4aea-ab3f-3328104162fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.022 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.025 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.025 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[72909b67-55a1-431b-902b-d12c8ee59a9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.026 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7a67aa3f-c39c-4c39-8f59-6fa6c756442b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.044 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.044 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[855b9104-2b4f-449d-85ab-2bdac52fbf64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 systemd[1]: Started Virtual Machine qemu-39-instance-00000054.
Jan 22 00:01:22 compute-1 ovn_controller[94841]: 2026-01-22T00:01:22Z|00339|binding|INFO|Setting lport 5965ccd1-7d75-4079-ade6-e1859a860162 ovn-installed in OVS
Jan 22 00:01:22 compute-1 ovn_controller[94841]: 2026-01-22T00:01:22Z|00340|binding|INFO|Setting lport 5965ccd1-7d75-4079-ade6-e1859a860162 up in Southbound
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.060 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-1 systemd-udevd[223924]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.073 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[035551e6-d263-43eb-bc64-5c500bf59ae0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 NetworkManager[54952]: <info>  [1769040082.0751] device (tap5965ccd1-7d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:01:22 compute-1 NetworkManager[54952]: <info>  [1769040082.0759] device (tap5965ccd1-7d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.107 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[422e3403-4117-4415-b3e7-71572ad1930b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 NetworkManager[54952]: <info>  [1769040082.1162] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/168)
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.115 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[067d1234-9bd7-4c81-953c-701462437ca3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.151 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[53142d18-9bf3-4f5f-a6ab-426a26d19ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.157 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c0733eb8-ec42-40db-9572-7ffebb9b21a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 NetworkManager[54952]: <info>  [1769040082.1936] device (tap19c3e0c8-50): carrier: link connected
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.202 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e098d7-9b21-4822-8ca7-c583a08a3a9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.227 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ffd711-675c-4a93-9e4c-0190b32e3301]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465484, 'reachable_time': 28193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223957, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.253 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6fca2484-226a-4582-9d6f-16ecd816a61b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 465484, 'tstamp': 465484}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223958, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.282 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3235db19-9531-432c-82cd-38e156a9eb2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465484, 'reachable_time': 28193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223959, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.330 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1a630395-61a5-4c4f-a8d2-ee25590d488d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.420 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4a6c1c68-a945-415f-8036-ff40e28e232d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.423 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.424 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.425 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.427 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-1 NetworkManager[54952]: <info>  [1769040082.4282] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Jan 22 00:01:22 compute-1 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.434 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.436 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.437 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-1 ovn_controller[94841]: 2026-01-22T00:01:22Z|00341|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.466 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.473 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.474 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.475 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[49126eae-eae1-4d06-b3b2-b2a14af6e825]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.476 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.478 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.597 182717 INFO nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Creating config drive at /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.config
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.607 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjn8r849z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.642 182717 DEBUG nova.compute.manager [req-75960e24-604a-445b-bd8b-43e6ae66bc33 req-d1e7e36a-8dcc-4a99-a224-2b40860ec27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.643 182717 DEBUG oslo_concurrency.lockutils [req-75960e24-604a-445b-bd8b-43e6ae66bc33 req-d1e7e36a-8dcc-4a99-a224-2b40860ec27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.644 182717 DEBUG oslo_concurrency.lockutils [req-75960e24-604a-445b-bd8b-43e6ae66bc33 req-d1e7e36a-8dcc-4a99-a224-2b40860ec27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.644 182717 DEBUG oslo_concurrency.lockutils [req-75960e24-604a-445b-bd8b-43e6ae66bc33 req-d1e7e36a-8dcc-4a99-a224-2b40860ec27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.645 182717 DEBUG nova.compute.manager [req-75960e24-604a-445b-bd8b-43e6ae66bc33 req-d1e7e36a-8dcc-4a99-a224-2b40860ec27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Processing event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.750 182717 DEBUG oslo_concurrency.processutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjn8r849z" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:22 compute-1 kernel: tap8412a083-ca: entered promiscuous mode
Jan 22 00:01:22 compute-1 systemd-udevd[223941]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:01:22 compute-1 NetworkManager[54952]: <info>  [1769040082.8408] manager: (tap8412a083-ca): new Tun device (/org/freedesktop/NetworkManager/Devices/170)
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.843 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-1 ovn_controller[94841]: 2026-01-22T00:01:22Z|00342|binding|INFO|Claiming lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for this chassis.
Jan 22 00:01:22 compute-1 ovn_controller[94841]: 2026-01-22T00:01:22Z|00343|binding|INFO|8412a083-ca97-4457-bb0e-9c7bcd8bfb2f: Claiming fa:16:3e:e0:ee:91 10.100.0.3
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.851 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-1 NetworkManager[54952]: <info>  [1769040082.8553] device (tap8412a083-ca): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.855 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-1 NetworkManager[54952]: <info>  [1769040082.8561] device (tap8412a083-ca): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:01:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:22.880 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:ee:91 10.100.0.3'], port_security=['fa:16:3e:e0:ee:91 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5fb84efc-d0d8-44ae-84e4-97e70d8c202e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10175545-8ba8-4bcf-9e15-f460a54818aa, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:01:22 compute-1 systemd-machined[153970]: New machine qemu-40-instance-00000053.
Jan 22 00:01:22 compute-1 ovn_controller[94841]: 2026-01-22T00:01:22Z|00344|binding|INFO|Setting lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f ovn-installed in OVS
Jan 22 00:01:22 compute-1 ovn_controller[94841]: 2026-01-22T00:01:22Z|00345|binding|INFO|Setting lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f up in Southbound
Jan 22 00:01:22 compute-1 nova_compute[182713]: 2026-01-22 00:01:22.933 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-1 systemd[1]: Started Virtual Machine qemu-40-instance-00000053.
Jan 22 00:01:22 compute-1 podman[224005]: 2026-01-22 00:01:22.945549011 +0000 UTC m=+0.079374938 container create 76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:01:22 compute-1 systemd[1]: Started libpod-conmon-76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04.scope.
Jan 22 00:01:23 compute-1 podman[224005]: 2026-01-22 00:01:22.906505707 +0000 UTC m=+0.040331644 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:01:23 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:01:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9332773bee8f0b3b24e752a1f7406ff36656bd92b129dbd3021f9568041d25b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:01:23 compute-1 podman[224005]: 2026-01-22 00:01:23.032362949 +0000 UTC m=+0.166188916 container init 76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:01:23 compute-1 podman[224005]: 2026-01-22 00:01:23.043807292 +0000 UTC m=+0.177633229 container start 76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.055 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040083.0547245, 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.055 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] VM Started (Lifecycle Event)
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.058 182717 DEBUG nova.compute.manager [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.062 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.065 182717 INFO nova.virt.libvirt.driver [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance spawned successfully.
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.065 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:01:23 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224032]: [NOTICE]   (224041) : New worker (224043) forked
Jan 22 00:01:23 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224032]: [NOTICE]   (224041) : Loading success.
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.111 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.117 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.117 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.118 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.119 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.119 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.120 182717 DEBUG nova.virt.libvirt.driver [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.125 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.128 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f in datapath 397ba44b-e27b-4a2a-a10b-7de0daa31656 unbound from our chassis
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.129 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 397ba44b-e27b-4a2a-a10b-7de0daa31656
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.141 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc6ba5f-e3d0-4da2-83b2-bf9ed41509a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.142 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap397ba44b-e1 in ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.146 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap397ba44b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.146 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[55dbd466-8b53-4f6a-a26c-4ba96a2c3d3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.147 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f927f9-946c-4adc-b9c8-988031437804]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.159 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[15f8083f-c1a5-405c-8830-1be5fc1837a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.176 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[53aec0b6-a4a8-4fcf-a8e8-5fb6e87a5d3b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.198 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.198 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040083.0550423, 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.199 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] VM Paused (Lifecycle Event)
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.208 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0390b4e4-0184-4652-928b-6a67755fb708]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 NetworkManager[54952]: <info>  [1769040083.2143] manager: (tap397ba44b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/171)
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.215 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[45187aac-4cac-41d0-897c-586d63d2f453]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.239 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.243 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040083.0612688, 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.243 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] VM Resumed (Lifecycle Event)
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.256 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[cee23581-2181-4232-b283-176ea1103147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.258 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[53c74f73-6d5e-4aff-a2b9-c0106d3bb376]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 NetworkManager[54952]: <info>  [1769040083.2864] device (tap397ba44b-e0): carrier: link connected
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.291 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.295 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.297 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[9df7275b-4036-4ec8-a2fd-2fc1a8aa31d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.316 182717 INFO nova.compute.manager [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Took 14.58 seconds to spawn the instance on the hypervisor.
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.317 182717 DEBUG nova.compute.manager [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.324 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7c892c43-8aa7-4218-afb9-582723d628a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap397ba44b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:12:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465593, 'reachable_time': 18689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224062, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.329 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.343 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6937ccb0-9b87-4b05-9529-4cc8f27bc74a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:12aa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 465593, 'tstamp': 465593}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224063, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.368 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4b068e2f-8a31-421e-abaa-9d1179303a22]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap397ba44b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:12:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465593, 'reachable_time': 18689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224064, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.405 182717 DEBUG nova.compute.manager [req-3e4e8323-c02c-4e2b-a0e0-59b68d82f5e7 req-a66be3cb-454e-41b9-b0c0-12abc2ab3169 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.406 182717 DEBUG oslo_concurrency.lockutils [req-3e4e8323-c02c-4e2b-a0e0-59b68d82f5e7 req-a66be3cb-454e-41b9-b0c0-12abc2ab3169 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.407 182717 DEBUG oslo_concurrency.lockutils [req-3e4e8323-c02c-4e2b-a0e0-59b68d82f5e7 req-a66be3cb-454e-41b9-b0c0-12abc2ab3169 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.407 182717 DEBUG oslo_concurrency.lockutils [req-3e4e8323-c02c-4e2b-a0e0-59b68d82f5e7 req-a66be3cb-454e-41b9-b0c0-12abc2ab3169 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.408 182717 DEBUG nova.compute.manager [req-3e4e8323-c02c-4e2b-a0e0-59b68d82f5e7 req-a66be3cb-454e-41b9-b0c0-12abc2ab3169 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Processing event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.407 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8d0d475a-895e-4491-aa7f-5c855a366b87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.427 182717 INFO nova.compute.manager [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Took 15.86 seconds to build instance.
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.455 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.456 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.461 182717 DEBUG oslo_concurrency.lockutils [None req-4346924a-e268-4c61-a82c-545f1e7be253 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.502 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bdef1f99-c12c-4466-b36e-1ef67e85054f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.504 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap397ba44b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.504 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.505 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap397ba44b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:23 compute-1 NetworkManager[54952]: <info>  [1769040083.5079] manager: (tap397ba44b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Jan 22 00:01:23 compute-1 kernel: tap397ba44b-e0: entered promiscuous mode
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.511 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap397ba44b-e0, col_values=(('external_ids', {'iface-id': 'f7f4d7e4-9841-41f2-85bd-658a3b613e0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.514 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:23 compute-1 ovn_controller[94841]: 2026-01-22T00:01:23Z|00346|binding|INFO|Releasing lport f7f4d7e4-9841-41f2-85bd-658a3b613e0d from this chassis (sb_readonly=0)
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.521 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.522 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[adc321ec-6f0b-4fb5-b0d4-b88800699a9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.523 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-397ba44b-e27b-4a2a-a10b-7de0daa31656
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 397ba44b-e27b-4a2a-a10b-7de0daa31656
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:01:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:23.525 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'env', 'PROCESS_TAG=haproxy-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/397ba44b-e27b-4a2a-a10b-7de0daa31656.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.518 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.538 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.541 182717 DEBUG nova.network.neutron [req-318dd4c7-154d-4516-a3e6-f80e1040390d req-73b00727-4f70-4d45-9983-0cb131077981 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updated VIF entry in instance network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.542 182717 DEBUG nova.network.neutron [req-318dd4c7-154d-4516-a3e6-f80e1040390d req-73b00727-4f70-4d45-9983-0cb131077981 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.598 182717 DEBUG oslo_concurrency.lockutils [req-318dd4c7-154d-4516-a3e6-f80e1040390d req-73b00727-4f70-4d45-9983-0cb131077981 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.767 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040083.7669425, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.768 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Started (Lifecycle Event)
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.770 182717 DEBUG nova.compute.manager [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.774 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.777 182717 INFO nova.virt.libvirt.driver [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance spawned successfully.
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.777 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.807 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.808 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.809 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.809 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.810 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.810 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.810 182717 DEBUG nova.virt.libvirt.driver [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.815 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.860 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.861 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040083.7673428, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.861 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Paused (Lifecycle Event)
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.911 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.915 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040083.772956, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.915 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Resumed (Lifecycle Event)
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.977 182717 INFO nova.compute.manager [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Took 15.78 seconds to spawn the instance on the hypervisor.
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.977 182717 DEBUG nova.compute.manager [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:01:23 compute-1 podman[224102]: 2026-01-22 00:01:23.979190649 +0000 UTC m=+0.060142575 container create 533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.980 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:01:23 compute-1 nova_compute[182713]: 2026-01-22 00:01:23.990 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:01:24 compute-1 systemd[1]: Started libpod-conmon-533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582.scope.
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.022 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:01:24 compute-1 podman[224102]: 2026-01-22 00:01:23.952936359 +0000 UTC m=+0.033888305 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:01:24 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:01:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c7599a98e1976189a53b952a2523db361af4489a4e76162578e5b2991b81629/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:01:24 compute-1 podman[224102]: 2026-01-22 00:01:24.073422985 +0000 UTC m=+0.154374931 container init 533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:01:24 compute-1 podman[224102]: 2026-01-22 00:01:24.085903161 +0000 UTC m=+0.166855087 container start 533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.098 182717 INFO nova.compute.manager [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Took 16.72 seconds to build instance.
Jan 22 00:01:24 compute-1 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[224117]: [NOTICE]   (224121) : New worker (224123) forked
Jan 22 00:01:24 compute-1 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[224117]: [NOTICE]   (224121) : Loading success.
Jan 22 00:01:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:24.146 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:01:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:24.148 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.158 182717 DEBUG oslo_concurrency.lockutils [None req-f2462028-c54d-4396-bffc-4fa2f1da367f 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.719 182717 DEBUG nova.network.neutron [req-661e2e4f-b091-4a41-9296-d842ed2b63c3 req-9ab1f846-b31a-4321-a36d-4a7304cdc04a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updated VIF entry in instance network info cache for port 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.720 182717 DEBUG nova.network.neutron [req-661e2e4f-b091-4a41-9296-d842ed2b63c3 req-9ab1f846-b31a-4321-a36d-4a7304cdc04a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updating instance_info_cache with network_info: [{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.742 182717 DEBUG oslo_concurrency.lockutils [req-661e2e4f-b091-4a41-9296-d842ed2b63c3 req-9ab1f846-b31a-4321-a36d-4a7304cdc04a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.782 182717 DEBUG nova.compute.manager [req-e8a5bfb2-1c42-4d22-beb5-1b4887f5132b req-e63817cb-8170-41bb-8469-4459928f725d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.782 182717 DEBUG oslo_concurrency.lockutils [req-e8a5bfb2-1c42-4d22-beb5-1b4887f5132b req-e63817cb-8170-41bb-8469-4459928f725d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.783 182717 DEBUG oslo_concurrency.lockutils [req-e8a5bfb2-1c42-4d22-beb5-1b4887f5132b req-e63817cb-8170-41bb-8469-4459928f725d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.783 182717 DEBUG oslo_concurrency.lockutils [req-e8a5bfb2-1c42-4d22-beb5-1b4887f5132b req-e63817cb-8170-41bb-8469-4459928f725d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.784 182717 DEBUG nova.compute.manager [req-e8a5bfb2-1c42-4d22-beb5-1b4887f5132b req-e63817cb-8170-41bb-8469-4459928f725d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:01:24 compute-1 nova_compute[182713]: 2026-01-22 00:01:24.784 182717 WARNING nova.compute.manager [req-e8a5bfb2-1c42-4d22-beb5-1b4887f5132b req-e63817cb-8170-41bb-8469-4459928f725d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state active and task_state None.
Jan 22 00:01:25 compute-1 nova_compute[182713]: 2026-01-22 00:01:25.580 182717 DEBUG nova.compute.manager [req-de7f9262-3f86-49fc-863a-b1f335866902 req-bfe28a04-35ee-4275-a881-d951d5014719 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:25 compute-1 nova_compute[182713]: 2026-01-22 00:01:25.582 182717 DEBUG oslo_concurrency.lockutils [req-de7f9262-3f86-49fc-863a-b1f335866902 req-bfe28a04-35ee-4275-a881-d951d5014719 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:25 compute-1 nova_compute[182713]: 2026-01-22 00:01:25.582 182717 DEBUG oslo_concurrency.lockutils [req-de7f9262-3f86-49fc-863a-b1f335866902 req-bfe28a04-35ee-4275-a881-d951d5014719 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:25 compute-1 nova_compute[182713]: 2026-01-22 00:01:25.582 182717 DEBUG oslo_concurrency.lockutils [req-de7f9262-3f86-49fc-863a-b1f335866902 req-bfe28a04-35ee-4275-a881-d951d5014719 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:25 compute-1 nova_compute[182713]: 2026-01-22 00:01:25.582 182717 DEBUG nova.compute.manager [req-de7f9262-3f86-49fc-863a-b1f335866902 req-bfe28a04-35ee-4275-a881-d951d5014719 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] No waiting events found dispatching network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:01:25 compute-1 nova_compute[182713]: 2026-01-22 00:01:25.583 182717 WARNING nova.compute.manager [req-de7f9262-3f86-49fc-863a-b1f335866902 req-bfe28a04-35ee-4275-a881-d951d5014719 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received unexpected event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for instance with vm_state active and task_state None.
Jan 22 00:01:26 compute-1 nova_compute[182713]: 2026-01-22 00:01:26.099 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:26 compute-1 NetworkManager[54952]: <info>  [1769040086.1030] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Jan 22 00:01:26 compute-1 NetworkManager[54952]: <info>  [1769040086.1046] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Jan 22 00:01:26 compute-1 nova_compute[182713]: 2026-01-22 00:01:26.209 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:26 compute-1 ovn_controller[94841]: 2026-01-22T00:01:26Z|00347|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 22 00:01:26 compute-1 ovn_controller[94841]: 2026-01-22T00:01:26Z|00348|binding|INFO|Releasing lport f7f4d7e4-9841-41f2-85bd-658a3b613e0d from this chassis (sb_readonly=0)
Jan 22 00:01:26 compute-1 nova_compute[182713]: 2026-01-22 00:01:26.225 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:26 compute-1 nova_compute[182713]: 2026-01-22 00:01:26.733 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:26 compute-1 nova_compute[182713]: 2026-01-22 00:01:26.748 182717 DEBUG nova.compute.manager [req-40c24b99-f77d-4fab-982f-9babe81dcd5b req-2e830c3c-40d2-40cf-b21c-7d68d201ccf8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:26 compute-1 nova_compute[182713]: 2026-01-22 00:01:26.749 182717 DEBUG nova.compute.manager [req-40c24b99-f77d-4fab-982f-9babe81dcd5b req-2e830c3c-40d2-40cf-b21c-7d68d201ccf8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing instance network info cache due to event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:01:26 compute-1 nova_compute[182713]: 2026-01-22 00:01:26.749 182717 DEBUG oslo_concurrency.lockutils [req-40c24b99-f77d-4fab-982f-9babe81dcd5b req-2e830c3c-40d2-40cf-b21c-7d68d201ccf8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:26 compute-1 nova_compute[182713]: 2026-01-22 00:01:26.750 182717 DEBUG oslo_concurrency.lockutils [req-40c24b99-f77d-4fab-982f-9babe81dcd5b req-2e830c3c-40d2-40cf-b21c-7d68d201ccf8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:26 compute-1 nova_compute[182713]: 2026-01-22 00:01:26.750 182717 DEBUG nova.network.neutron [req-40c24b99-f77d-4fab-982f-9babe81dcd5b req-2e830c3c-40d2-40cf-b21c-7d68d201ccf8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:01:26 compute-1 nova_compute[182713]: 2026-01-22 00:01:26.770 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:27 compute-1 podman[224133]: 2026-01-22 00:01:27.624274015 +0000 UTC m=+0.103600856 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 00:01:27 compute-1 ovn_controller[94841]: 2026-01-22T00:01:27Z|00349|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 22 00:01:27 compute-1 ovn_controller[94841]: 2026-01-22T00:01:27Z|00350|binding|INFO|Releasing lport f7f4d7e4-9841-41f2-85bd-658a3b613e0d from this chassis (sb_readonly=0)
Jan 22 00:01:27 compute-1 nova_compute[182713]: 2026-01-22 00:01:27.803 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:29 compute-1 nova_compute[182713]: 2026-01-22 00:01:29.310 182717 DEBUG nova.network.neutron [req-40c24b99-f77d-4fab-982f-9babe81dcd5b req-2e830c3c-40d2-40cf-b21c-7d68d201ccf8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updated VIF entry in instance network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:01:29 compute-1 nova_compute[182713]: 2026-01-22 00:01:29.312 182717 DEBUG nova.network.neutron [req-40c24b99-f77d-4fab-982f-9babe81dcd5b req-2e830c3c-40d2-40cf-b21c-7d68d201ccf8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:29 compute-1 nova_compute[182713]: 2026-01-22 00:01:29.339 182717 DEBUG oslo_concurrency.lockutils [req-40c24b99-f77d-4fab-982f-9babe81dcd5b req-2e830c3c-40d2-40cf-b21c-7d68d201ccf8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:29 compute-1 nova_compute[182713]: 2026-01-22 00:01:29.383 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:29 compute-1 nova_compute[182713]: 2026-01-22 00:01:29.384 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:01:30 compute-1 nova_compute[182713]: 2026-01-22 00:01:30.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:31 compute-1 podman[224153]: 2026-01-22 00:01:31.59034126 +0000 UTC m=+0.084760114 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible)
Jan 22 00:01:31 compute-1 nova_compute[182713]: 2026-01-22 00:01:31.736 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:31 compute-1 nova_compute[182713]: 2026-01-22 00:01:31.772 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:31 compute-1 nova_compute[182713]: 2026-01-22 00:01:31.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:32 compute-1 nova_compute[182713]: 2026-01-22 00:01:32.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:32 compute-1 nova_compute[182713]: 2026-01-22 00:01:32.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:33 compute-1 nova_compute[182713]: 2026-01-22 00:01:33.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:34 compute-1 ovn_controller[94841]: 2026-01-22T00:01:34Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c9:be:ae 10.100.0.4
Jan 22 00:01:34 compute-1 ovn_controller[94841]: 2026-01-22T00:01:34Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:be:ae 10.100.0.4
Jan 22 00:01:34 compute-1 nova_compute[182713]: 2026-01-22 00:01:34.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:35 compute-1 nova_compute[182713]: 2026-01-22 00:01:35.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:35 compute-1 nova_compute[182713]: 2026-01-22 00:01:35.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:01:35 compute-1 nova_compute[182713]: 2026-01-22 00:01:35.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:01:36 compute-1 nova_compute[182713]: 2026-01-22 00:01:36.175 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:36 compute-1 nova_compute[182713]: 2026-01-22 00:01:36.176 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:36 compute-1 nova_compute[182713]: 2026-01-22 00:01:36.177 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:01:36 compute-1 nova_compute[182713]: 2026-01-22 00:01:36.177 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:36 compute-1 ovn_controller[94841]: 2026-01-22T00:01:36Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e0:ee:91 10.100.0.3
Jan 22 00:01:36 compute-1 ovn_controller[94841]: 2026-01-22T00:01:36Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:ee:91 10.100.0.3
Jan 22 00:01:36 compute-1 nova_compute[182713]: 2026-01-22 00:01:36.796 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:01:37 compute-1 nova_compute[182713]: 2026-01-22 00:01:37.871 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updating instance_info_cache with network_info: [{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:37 compute-1 nova_compute[182713]: 2026-01-22 00:01:37.906 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:37 compute-1 nova_compute[182713]: 2026-01-22 00:01:37.906 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:01:37 compute-1 nova_compute[182713]: 2026-01-22 00:01:37.907 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:37 compute-1 nova_compute[182713]: 2026-01-22 00:01:37.932 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:37 compute-1 nova_compute[182713]: 2026-01-22 00:01:37.932 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:37 compute-1 nova_compute[182713]: 2026-01-22 00:01:37.933 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:37 compute-1 nova_compute[182713]: 2026-01-22 00:01:37.933 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.031 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.132 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.133 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.226 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.231 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.309 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.310 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.385 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.660 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.664 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5374MB free_disk=73.24575424194336GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.664 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.665 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.767 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.768 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.768 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.768 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.831 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.852 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.904 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:01:38 compute-1 nova_compute[182713]: 2026-01-22 00:01:38.905 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:40 compute-1 nova_compute[182713]: 2026-01-22 00:01:40.313 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:40 compute-1 ovn_controller[94841]: 2026-01-22T00:01:40Z|00351|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 22 00:01:40 compute-1 ovn_controller[94841]: 2026-01-22T00:01:40Z|00352|binding|INFO|Releasing lport f7f4d7e4-9841-41f2-85bd-658a3b613e0d from this chassis (sb_readonly=0)
Jan 22 00:01:40 compute-1 nova_compute[182713]: 2026-01-22 00:01:40.883 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:40 compute-1 nova_compute[182713]: 2026-01-22 00:01:40.901 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:41 compute-1 podman[224217]: 2026-01-22 00:01:41.598369753 +0000 UTC m=+0.085855840 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:01:41 compute-1 podman[224216]: 2026-01-22 00:01:41.648821868 +0000 UTC m=+0.138423970 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:01:41 compute-1 nova_compute[182713]: 2026-01-22 00:01:41.801 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:46 compute-1 nova_compute[182713]: 2026-01-22 00:01:46.803 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:01:48 compute-1 nova_compute[182713]: 2026-01-22 00:01:48.295 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:48 compute-1 podman[224266]: 2026-01-22 00:01:48.606431613 +0000 UTC m=+0.085251269 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:01:48 compute-1 podman[224265]: 2026-01-22 00:01:48.611433237 +0000 UTC m=+0.090159880 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:01:49 compute-1 nova_compute[182713]: 2026-01-22 00:01:49.391 182717 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:49 compute-1 nova_compute[182713]: 2026-01-22 00:01:49.391 182717 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:49 compute-1 nova_compute[182713]: 2026-01-22 00:01:49.392 182717 DEBUG nova.network.neutron [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:01:51 compute-1 nova_compute[182713]: 2026-01-22 00:01:51.840 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:51 compute-1 nova_compute[182713]: 2026-01-22 00:01:51.843 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:01:53 compute-1 nova_compute[182713]: 2026-01-22 00:01:53.091 182717 DEBUG nova.network.neutron [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:53 compute-1 nova_compute[182713]: 2026-01-22 00:01:53.204 182717 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:54 compute-1 nova_compute[182713]: 2026-01-22 00:01:54.014 182717 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 00:01:54 compute-1 nova_compute[182713]: 2026-01-22 00:01:54.015 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Creating file /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/84a050406e0b4e5b94000d03eb47f31e.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 22 00:01:54 compute-1 nova_compute[182713]: 2026-01-22 00:01:54.016 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/84a050406e0b4e5b94000d03eb47f31e.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:54 compute-1 nova_compute[182713]: 2026-01-22 00:01:54.289 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:54 compute-1 nova_compute[182713]: 2026-01-22 00:01:54.576 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/84a050406e0b4e5b94000d03eb47f31e.tmp" returned: 1 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:54 compute-1 nova_compute[182713]: 2026-01-22 00:01:54.577 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/84a050406e0b4e5b94000d03eb47f31e.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 00:01:54 compute-1 nova_compute[182713]: 2026-01-22 00:01:54.578 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Creating directory /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 22 00:01:54 compute-1 nova_compute[182713]: 2026-01-22 00:01:54.578 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:54 compute-1 nova_compute[182713]: 2026-01-22 00:01:54.830 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" returned: 0 in 0.252s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:54 compute-1 nova_compute[182713]: 2026-01-22 00:01:54.836 182717 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:01:56 compute-1 nova_compute[182713]: 2026-01-22 00:01:56.840 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:57 compute-1 kernel: tap5965ccd1-7d (unregistering): left promiscuous mode
Jan 22 00:01:57 compute-1 NetworkManager[54952]: <info>  [1769040117.0093] device (tap5965ccd1-7d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:01:57 compute-1 ovn_controller[94841]: 2026-01-22T00:01:57Z|00353|binding|INFO|Releasing lport 5965ccd1-7d75-4079-ade6-e1859a860162 from this chassis (sb_readonly=0)
Jan 22 00:01:57 compute-1 ovn_controller[94841]: 2026-01-22T00:01:57Z|00354|binding|INFO|Setting lport 5965ccd1-7d75-4079-ade6-e1859a860162 down in Southbound
Jan 22 00:01:57 compute-1 ovn_controller[94841]: 2026-01-22T00:01:57Z|00355|binding|INFO|Removing iface tap5965ccd1-7d ovn-installed in OVS
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.016 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.072 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.079 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:be:ae 10.100.0.4'], port_security=['fa:16:3e:c9:be:ae 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=5965ccd1-7d75-4079-ade6-e1859a860162) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.080 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 5965ccd1-7d75-4079-ade6-e1859a860162 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.081 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.082 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6344f2c7-8c0d-4c76-955c-522dbcf95a15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.083 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore
Jan 22 00:01:57 compute-1 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000054.scope: Deactivated successfully.
Jan 22 00:01:57 compute-1 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000054.scope: Consumed 14.321s CPU time.
Jan 22 00:01:57 compute-1 systemd-machined[153970]: Machine qemu-39-instance-00000054 terminated.
Jan 22 00:01:57 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224032]: [NOTICE]   (224041) : haproxy version is 2.8.14-c23fe91
Jan 22 00:01:57 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224032]: [NOTICE]   (224041) : path to executable is /usr/sbin/haproxy
Jan 22 00:01:57 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224032]: [WARNING]  (224041) : Exiting Master process...
Jan 22 00:01:57 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224032]: [ALERT]    (224041) : Current worker (224043) exited with code 143 (Terminated)
Jan 22 00:01:57 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224032]: [WARNING]  (224041) : All workers exited. Exiting... (0)
Jan 22 00:01:57 compute-1 systemd[1]: libpod-76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04.scope: Deactivated successfully.
Jan 22 00:01:57 compute-1 podman[224334]: 2026-01-22 00:01:57.219486656 +0000 UTC m=+0.049675594 container died 76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:01:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-9332773bee8f0b3b24e752a1f7406ff36656bd92b129dbd3021f9568041d25b9-merged.mount: Deactivated successfully.
Jan 22 00:01:57 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04-userdata-shm.mount: Deactivated successfully.
Jan 22 00:01:57 compute-1 podman[224334]: 2026-01-22 00:01:57.260547592 +0000 UTC m=+0.090736530 container cleanup 76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:01:57 compute-1 systemd[1]: libpod-conmon-76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04.scope: Deactivated successfully.
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.295 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.299 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:57 compute-1 podman[224364]: 2026-01-22 00:01:57.350071433 +0000 UTC m=+0.057671230 container remove 76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.357 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2f0ec16e-4e2d-4fe4-9e5b-47018d23e661]: (4, ('Thu Jan 22 12:01:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04)\n76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04\nThu Jan 22 12:01:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04)\n76878bcf5715828236c418dca33c20e15dc3db38e28794a7e26839be5638dc04\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.359 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b4f8c16a-add6-4c27-b523-585896b02a6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.361 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.362 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:57 compute-1 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.377 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.380 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4310f168-a954-44cb-b904-fc30278ef34b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.401 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5572fab5-acec-4701-8902-e5c245344575]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.402 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fa12a640-90f2-43ca-ab47-1bdc10b06b20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.419 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8a429cd4-83aa-4e73-a143-be950f88eb90]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465475, 'reachable_time': 22423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224395, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:57 compute-1 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.423 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:01:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:01:57.423 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[a33c5843-8cff-4626-87e5-24147cb0c67c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.855 182717 INFO nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance shutdown successfully after 3 seconds.
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.863 182717 INFO nova.virt.libvirt.driver [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance destroyed successfully.
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.864 182717 DEBUG nova.virt.libvirt.vif [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-803720403',display_name='tempest-ServerActionsTestJSON-server-803720403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-803720403',id=84,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:01:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-zph0mh3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:01:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c9:be:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.864 182717 DEBUG nova.network.os_vif_util [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c9:be:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.865 182717 DEBUG nova.network.os_vif_util [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.866 182717 DEBUG os_vif [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.869 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.869 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5965ccd1-7d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.871 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.873 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.875 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.879 182717 INFO os_vif [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d')
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.884 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.953 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:57 compute-1 nova_compute[182713]: 2026-01-22 00:01:57.955 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.017 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.020 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Copying file /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_resize/disk to 192.168.122.100:/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.021 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_resize/disk 192.168.122.100:/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.310 182717 DEBUG nova.compute.manager [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.313 182717 DEBUG oslo_concurrency.lockutils [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.313 182717 DEBUG oslo_concurrency.lockutils [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.314 182717 DEBUG oslo_concurrency.lockutils [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.314 182717 DEBUG nova.compute.manager [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.315 182717 WARNING nova.compute.manager [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state active and task_state resize_migrating.
Jan 22 00:01:58 compute-1 podman[224404]: 2026-01-22 00:01:58.593678447 +0000 UTC m=+0.081052611 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.631 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "scp -r /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_resize/disk 192.168.122.100:/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk" returned: 0 in 0.610s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.632 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Copying file /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.632 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_resize/disk.config 192.168.122.100:/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.920 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "scp -C -r /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_resize/disk.config 192.168.122.100:/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config" returned: 0 in 0.288s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.922 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Copying file /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:01:58 compute-1 nova_compute[182713]: 2026-01-22 00:01:58.923 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_resize/disk.info 192.168.122.100:/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:01:59 compute-1 nova_compute[182713]: 2026-01-22 00:01:59.184 182717 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "scp -C -r /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_resize/disk.info 192.168.122.100:/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info" returned: 0 in 0.261s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:01:59 compute-1 nova_compute[182713]: 2026-01-22 00:01:59.660 182717 DEBUG neutronclient.v2_0.client [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 5965ccd1-7d75-4079-ade6-e1859a860162 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 00:02:00 compute-1 nova_compute[182713]: 2026-01-22 00:02:00.781 182717 DEBUG nova.compute.manager [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:00 compute-1 nova_compute[182713]: 2026-01-22 00:02:00.782 182717 DEBUG oslo_concurrency.lockutils [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:00 compute-1 nova_compute[182713]: 2026-01-22 00:02:00.783 182717 DEBUG oslo_concurrency.lockutils [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:00 compute-1 nova_compute[182713]: 2026-01-22 00:02:00.783 182717 DEBUG oslo_concurrency.lockutils [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:00 compute-1 nova_compute[182713]: 2026-01-22 00:02:00.784 182717 DEBUG nova.compute.manager [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:00 compute-1 nova_compute[182713]: 2026-01-22 00:02:00.784 182717 WARNING nova.compute.manager [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state active and task_state resize_migrated.
Jan 22 00:02:00 compute-1 nova_compute[182713]: 2026-01-22 00:02:00.786 182717 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:00 compute-1 nova_compute[182713]: 2026-01-22 00:02:00.787 182717 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:00 compute-1 nova_compute[182713]: 2026-01-22 00:02:00.788 182717 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:01 compute-1 nova_compute[182713]: 2026-01-22 00:02:01.843 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:02.117 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:02:02 compute-1 nova_compute[182713]: 2026-01-22 00:02:02.117 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:02.118 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:02:02 compute-1 podman[224429]: 2026-01-22 00:02:02.603987187 +0000 UTC m=+0.086249611 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, 
container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Jan 22 00:02:02 compute-1 nova_compute[182713]: 2026-01-22 00:02:02.872 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:03.009 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:03.009 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:03.010 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:03 compute-1 nova_compute[182713]: 2026-01-22 00:02:03.381 182717 DEBUG nova.compute.manager [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:03 compute-1 nova_compute[182713]: 2026-01-22 00:02:03.382 182717 DEBUG nova.compute.manager [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing instance network info cache due to event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:02:03 compute-1 nova_compute[182713]: 2026-01-22 00:02:03.383 182717 DEBUG oslo_concurrency.lockutils [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:03 compute-1 nova_compute[182713]: 2026-01-22 00:02:03.383 182717 DEBUG oslo_concurrency.lockutils [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:03 compute-1 nova_compute[182713]: 2026-01-22 00:02:03.384 182717 DEBUG nova.network.neutron [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:02:06 compute-1 nova_compute[182713]: 2026-01-22 00:02:06.846 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:07 compute-1 nova_compute[182713]: 2026-01-22 00:02:07.037 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:07 compute-1 nova_compute[182713]: 2026-01-22 00:02:07.183 182717 DEBUG nova.network.neutron [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updated VIF entry in instance network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:02:07 compute-1 nova_compute[182713]: 2026-01-22 00:02:07.184 182717 DEBUG nova.network.neutron [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:07 compute-1 nova_compute[182713]: 2026-01-22 00:02:07.373 182717 DEBUG oslo_concurrency.lockutils [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:07 compute-1 nova_compute[182713]: 2026-01-22 00:02:07.874 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:08 compute-1 sshd-session[224451]: error: kex_exchange_identification: read: Connection reset by peer
Jan 22 00:02:08 compute-1 sshd-session[224451]: Connection reset by 176.120.22.52 port 24513
Jan 22 00:02:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:10.119 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:11 compute-1 nova_compute[182713]: 2026-01-22 00:02:11.848 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:12 compute-1 nova_compute[182713]: 2026-01-22 00:02:12.348 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040117.3466012, 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:12 compute-1 nova_compute[182713]: 2026-01-22 00:02:12.348 182717 INFO nova.compute.manager [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] VM Stopped (Lifecycle Event)
Jan 22 00:02:12 compute-1 nova_compute[182713]: 2026-01-22 00:02:12.472 182717 DEBUG nova.compute.manager [None req-214b9a10-a7bc-4640-97dc-28e1d2292879 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:12 compute-1 nova_compute[182713]: 2026-01-22 00:02:12.477 182717 DEBUG nova.compute.manager [None req-214b9a10-a7bc-4640-97dc-28e1d2292879 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:02:12 compute-1 podman[224454]: 2026-01-22 00:02:12.607951212 +0000 UTC m=+0.090442211 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:02:12 compute-1 podman[224453]: 2026-01-22 00:02:12.635933085 +0000 UTC m=+0.123835970 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 00:02:12 compute-1 nova_compute[182713]: 2026-01-22 00:02:12.649 182717 INFO nova.compute.manager [None req-214b9a10-a7bc-4640-97dc-28e1d2292879 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Jan 22 00:02:12 compute-1 nova_compute[182713]: 2026-01-22 00:02:12.877 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:15 compute-1 nova_compute[182713]: 2026-01-22 00:02:15.634 182717 DEBUG nova.compute.manager [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:15 compute-1 nova_compute[182713]: 2026-01-22 00:02:15.635 182717 DEBUG oslo_concurrency.lockutils [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:15 compute-1 nova_compute[182713]: 2026-01-22 00:02:15.636 182717 DEBUG oslo_concurrency.lockutils [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:15 compute-1 nova_compute[182713]: 2026-01-22 00:02:15.636 182717 DEBUG oslo_concurrency.lockutils [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:15 compute-1 nova_compute[182713]: 2026-01-22 00:02:15.637 182717 DEBUG nova.compute.manager [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:15 compute-1 nova_compute[182713]: 2026-01-22 00:02:15.637 182717 WARNING nova.compute.manager [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state resized and task_state None.
Jan 22 00:02:16 compute-1 nova_compute[182713]: 2026-01-22 00:02:16.851 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:17 compute-1 nova_compute[182713]: 2026-01-22 00:02:17.878 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:18 compute-1 nova_compute[182713]: 2026-01-22 00:02:18.502 182717 DEBUG nova.compute.manager [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:18 compute-1 nova_compute[182713]: 2026-01-22 00:02:18.503 182717 DEBUG oslo_concurrency.lockutils [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:18 compute-1 nova_compute[182713]: 2026-01-22 00:02:18.503 182717 DEBUG oslo_concurrency.lockutils [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:18 compute-1 nova_compute[182713]: 2026-01-22 00:02:18.503 182717 DEBUG oslo_concurrency.lockutils [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:18 compute-1 nova_compute[182713]: 2026-01-22 00:02:18.503 182717 DEBUG nova.compute.manager [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:18 compute-1 nova_compute[182713]: 2026-01-22 00:02:18.504 182717 WARNING nova.compute.manager [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:02:18 compute-1 nova_compute[182713]: 2026-01-22 00:02:18.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:18 compute-1 nova_compute[182713]: 2026-01-22 00:02:18.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:02:18 compute-1 nova_compute[182713]: 2026-01-22 00:02:18.995 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:02:19 compute-1 podman[224504]: 2026-01-22 00:02:19.598120953 +0000 UTC m=+0.079172123 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:02:19 compute-1 podman[224505]: 2026-01-22 00:02:19.613773006 +0000 UTC m=+0.084001492 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:02:20 compute-1 nova_compute[182713]: 2026-01-22 00:02:20.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:21 compute-1 nova_compute[182713]: 2026-01-22 00:02:21.852 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.875 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20', 'name': 'tempest-ServerActionsTestJSON-server-803720403', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000054', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'cccb624dbe6d4401a89e9cd254f91828', 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'hostId': '89e229a2b27a89f6698442a3c675c14cf6376a025bb23797be903fbb', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:02:22 compute-1 nova_compute[182713]: 2026-01-22 00:02:22.880 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.881 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000053', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'hostId': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.884 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.913 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.write.requests volume: 324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.914 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78db9c5d-9a45-4a23-98c5-be0399971679', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 324, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-vda', 'timestamp': '2026-01-22T00:02:22.882505', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a09f7d60-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': '7172c6622aa02b33ad12c03c06f4cb0f4f9f8eb99ae85979a59a1f711eac7e1c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-sda', 'timestamp': '2026-01-22T00:02:22.882505', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a09f9ca0-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': 'b8e01977369d2c56433fcaf514b15941b95188f3a5dde3175aa895392e0c07e5'}]}, 'timestamp': '2026-01-22 00:02:22.915434', '_unique_id': '238b93ee10304bc2a04e910d4f5d18b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.918 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.920 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.921 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.938 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/cpu volume: 11190000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a53e3cf-5c7d-46bb-9bf0-c386ef78e878', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11190000000, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'timestamp': '2026-01-22T00:02:22.920799', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a0a337d4-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.645379312, 'message_signature': '529f87e8522323d9ddec3fa216cee9cdb39af00001c2c55c464c5cf227cb1f3b'}]}, 'timestamp': '2026-01-22 00:02:22.939010', '_unique_id': '0b9540c4773b4e509c35b5d7d6709fbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.940 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.941 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.942 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.953 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.954 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8e3b199-e07e-48ea-9570-4cbc25c5ad14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-vda', 'timestamp': '2026-01-22T00:02:22.941412', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a0a588cc-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.649560511, 'message_signature': '5705bbfb037560c375a44de1662dadca2f82e5baf5871540fdaf058efc8521b8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-sda', 'timestamp': '2026-01-22T00:02:22.941412', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a0a59862-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.649560511, 'message_signature': '7894751f887373a0c9ea0c5b268f96ca024fc00fe48a525619f3204c9fe768c0'}]}, 'timestamp': '2026-01-22 00:02:22.954470', '_unique_id': '35a317034b5e433ea018c8e495d0356a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.955 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.956 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.957 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.957 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.read.requests volume: 1107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.958 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6c60258-2e09-4aa5-b1c8-4aa8f7b40fa4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1107, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-vda', 'timestamp': '2026-01-22T00:02:22.956665', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a0a621e2-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': '9d659a5d52856a85e83c37149db7ba233d3351565bbab334d4f84a86b74ac059'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': 
None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-sda', 'timestamp': '2026-01-22T00:02:22.956665', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a0a62f52-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': 'e1b91eaa4cce59eff0c927172c9d158550c6e001c5efac9aece328b669b9e2ce'}]}, 'timestamp': '2026-01-22 00:02:22.958317', '_unique_id': '23653220c2b842fab516ae5a4bcf8b89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.960 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.961 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.961 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.961 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ab0c2a4-df56-497a-96b1-2104812f3d18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-vda', 'timestamp': '2026-01-22T00:02:22.960240', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a0a6acca-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.649560511, 'message_signature': '5f39d110e409c1b737cd28782ca572b69ca4a21fae97085a17b4e9cec7b042a4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': 
'2cb6e3d6-f22a-49ea-aab8-900dd88605e9-sda', 'timestamp': '2026-01-22T00:02:22.960240', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a0a6b9e0-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.649560511, 'message_signature': '0b42b4fed96bd9692b7dbae2b0c2a0eb5c49499c2237b149d0df1c419abc204e'}]}, 'timestamp': '2026-01-22 00:02:22.961895', '_unique_id': 'd49ecb758e914f4fb53ebd19fe38cfbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.963 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.964 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.964 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.read.latency volume: 258919087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.965 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.read.latency volume: 26343558 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'def3f700-cd81-478a-8dd6-cc496b427abd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 258919087, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-vda', 'timestamp': '2026-01-22T00:02:22.963679', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a0a73a64-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': 'e164789c2812343a13c6dee327fa849094bb4a6cc3c0cc35e1475389b18249a9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26343558, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': 
None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-sda', 'timestamp': '2026-01-22T00:02:22.963679', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a0a747ac-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': '83c797bbf1902478c43e2806bf9bf54fd1acd89b066e74370f86446c2ba3504a'}]}, 'timestamp': '2026-01-22 00:02:22.965496', '_unique_id': '4d15c7065df545f48965a67f9afe6565'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.966 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.967 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.968 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.971 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 / tap8412a083-ca inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.971 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcfd6cc3-6fa7-40cd-bd94-23b1fce1d73b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': 'instance-00000053-2cb6e3d6-f22a-49ea-aab8-900dd88605e9-tap8412a083-ca', 'timestamp': '2026-01-22T00:02:22.967475', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'tap8412a083-ca', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:ee:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8412a083-ca'}, 'message_id': 'a0a83d24-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.675688606, 'message_signature': 'caf9e68cf6cadcb84b2d7f70cce6ff80f8f4837182d038c1b2e32e41c69f504e'}]}, 'timestamp': '2026-01-22 00:02:22.971884', '_unique_id': '428c808372ec41c8bb16b65c7ce9ffd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.972 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.973 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.974 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.974 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13aff12c-5f84-4a8e-b673-8119c31b34c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': 'instance-00000053-2cb6e3d6-f22a-49ea-aab8-900dd88605e9-tap8412a083-ca', 'timestamp': '2026-01-22T00:02:22.973948', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'tap8412a083-ca', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:ee:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8412a083-ca'}, 'message_id': 'a0a8c42e-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.675688606, 'message_signature': 'bf96a90e84b0c78ad7eda6dd092bb5555bf8376a58adb60077ce4b93c579b1b1'}]}, 'timestamp': '2026-01-22 00:02:22.975283', '_unique_id': 'fb88193367c14177b844ea2060638493'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.976 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.977 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.977 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-803720403>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-1381246704>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-803720403>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-1381246704>]
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.977 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.978 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.978 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f7bdb50-2cb2-4a36-a872-3d74d2b1df24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': 'instance-00000053-2cb6e3d6-f22a-49ea-aab8-900dd88605e9-tap8412a083-ca', 'timestamp': '2026-01-22T00:02:22.977681', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'tap8412a083-ca', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:ee:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8412a083-ca'}, 'message_id': 'a0a95c5e-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.675688606, 'message_signature': 'd5706bfcedf40b7a24cc89c4917e9a980b440e4ae89a0b0660ea02749291c5f4'}]}, 'timestamp': '2026-01-22 00:02:22.979199', '_unique_id': '0d6c3d1ebfb04c01a6ebbd080fef68ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.980 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.982 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.982 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '073757c2-d481-47bb-bbef-339d3ced58ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': 'instance-00000053-2cb6e3d6-f22a-49ea-aab8-900dd88605e9-tap8412a083-ca', 'timestamp': '2026-01-22T00:02:22.981071', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'tap8412a083-ca', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:ee:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8412a083-ca'}, 'message_id': 'a0a9e23c-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.675688606, 'message_signature': 'cf932855583609f443c19ac96d7444a6356d206103d8d02344e18f381321167b'}]}, 'timestamp': '2026-01-22 00:02:22.982584', '_unique_id': '710244cff9d24830acfb7918acadd269'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.983 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.984 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.985 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.985 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4187a2b-f0ca-437a-84ac-ba3593f7cc23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': 'instance-00000053-2cb6e3d6-f22a-49ea-aab8-900dd88605e9-tap8412a083-ca', 'timestamp': '2026-01-22T00:02:22.984380', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'tap8412a083-ca', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:ee:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8412a083-ca'}, 'message_id': 'a0aa5488-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.675688606, 'message_signature': 'f9a8a8c056adaea1b32d3f8e1bb58cc8cf5bf606a67ec6a1c22a0260a12d7a23'}]}, 'timestamp': '2026-01-22 00:02:22.985526', '_unique_id': '88c664d43edd4de386d9a807fc5bf6fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.986 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.988 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.988 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84659eed-35d6-4970-b32d-0aa61a85ced4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': 'instance-00000053-2cb6e3d6-f22a-49ea-aab8-900dd88605e9-tap8412a083-ca', 'timestamp': '2026-01-22T00:02:22.987306', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'tap8412a083-ca', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:ee:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8412a083-ca'}, 'message_id': 'a0aad1b0-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.675688606, 'message_signature': '425606f50952f0f783bb3641ab5b6a97cc52313e3cc96631f5e43d382e225f6a'}]}, 'timestamp': '2026-01-22 00:02:22.988732', '_unique_id': '02e7abb3915948eba87c893a40f92bc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.989 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.990 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.991 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.991 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40bdeb5c-2edc-40dc-a577-52f439a8f161', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': 'instance-00000053-2cb6e3d6-f22a-49ea-aab8-900dd88605e9-tap8412a083-ca', 'timestamp': '2026-01-22T00:02:22.990581', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'tap8412a083-ca', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:ee:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8412a083-ca'}, 'message_id': 'a0ab4e4c-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.675688606, 'message_signature': 'ae93164246e639e5e65b399434908855513f15ad670fd7a2af418a9682d31742'}]}, 'timestamp': '2026-01-22 00:02:22.991949', '_unique_id': '66eaf48ec0544cfda26515b19e955a99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.992 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.993 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.994 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.994 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1d496bf-7208-4c7f-a167-83aab218fab9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': 'instance-00000053-2cb6e3d6-f22a-49ea-aab8-900dd88605e9-tap8412a083-ca', 'timestamp': '2026-01-22T00:02:22.993733', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'tap8412a083-ca', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:ee:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8412a083-ca'}, 'message_id': 'a0abc430-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.675688606, 'message_signature': '69c376e214573c8007214bfc5c78139fd14e2de4393cc5981686128e30a06a06'}]}, 'timestamp': '2026-01-22 00:02:22.994966', '_unique_id': 'bb3efa197c87408989d756c8215738f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.995 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.996 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.997 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.997 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.read.bytes volume: 30693888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.997 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1aaa7b25-f5e6-4292-97c0-39488678efc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30693888, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-vda', 'timestamp': '2026-01-22T00:02:22.996633', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a0ac392e-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': 'd79cdb4d8f28457ff9b0828ad80859586b474f0c4739d7cf6b1b282eb742b3a8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-sda', 'timestamp': '2026-01-22T00:02:22.996633', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a0ac46da-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': '1be88cb42bb1c61c21a7caa3f42576e99620db884af29dcd5fc66d04264a6a60'}]}, 'timestamp': '2026-01-22 00:02:22.998245', '_unique_id': 'ffed29f0511d4010af5b8fb7a9be684f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:22.999 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.000 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.000 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-803720403>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-1381246704>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-803720403>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-1381246704>]
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.000 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.000 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-803720403>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-1381246704>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-803720403>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-1381246704>]
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.001 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.001 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9ba1627-3dcf-4178-af59-8ed59458ac1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': 'instance-00000053-2cb6e3d6-f22a-49ea-aab8-900dd88605e9-tap8412a083-ca', 'timestamp': '2026-01-22T00:02:23.000827', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'tap8412a083-ca', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:ee:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8412a083-ca'}, 'message_id': 'a0acd76c-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.675688606, 'message_signature': '2365ebb64a5aaf32a6e6b42e912ca40183aa35b98da1c77d00bc36dd68a0a97b'}]}, 'timestamp': '2026-01-22 00:02:23.002004', '_unique_id': 'ff483515e5694c4d9e5c1872542d6779'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.002 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.003 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.004 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.004 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3ff1172-4d84-401f-9620-f5a45cb2c2a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': 'instance-00000053-2cb6e3d6-f22a-49ea-aab8-900dd88605e9-tap8412a083-ca', 'timestamp': '2026-01-22T00:02:23.003737', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'tap8412a083-ca', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:ee:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8412a083-ca'}, 'message_id': 'a0ad4a26-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.675688606, 'message_signature': '594ae34fa773530dd45019d2e528b2e93a51b693119fd9c1c77b9c6b6eb91c22'}]}, 'timestamp': '2026-01-22 00:02:23.004930', '_unique_id': 'a0b71d51d8574841b0e9bac0b2af0a53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.005 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.006 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.007 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.007 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.write.latency volume: 3150138221 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.007 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9583ed26-0ffc-4b48-bc69-c8ce15ba9f00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3150138221, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-vda', 'timestamp': '2026-01-22T00:02:23.006667', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a0adbeac-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': 'dca2a2c08a74e9c979feccba16b68cb3b42f5a4aa955b393b97fd65b1f8f096d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-sda', 'timestamp': '2026-01-22T00:02:23.006667', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a0adcbcc-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': 'f359934eb8568b725106110b2436aa0d267aa30880f67c1f24bab74c3fae45d0'}]}, 'timestamp': '2026-01-22 00:02:23.008194', '_unique_id': '4a4aa8f1492241e0932dc37414ec5a7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.008 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.009 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.009 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.010 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-803720403>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-1381246704>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-803720403>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-1381246704>]
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.010 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.011 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.011 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.011 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63a4b55b-f8df-43ea-9392-8fe6aaa57fb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-vda', 'timestamp': '2026-01-22T00:02:23.010357', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a0ae4cd2-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.649560511, 'message_signature': '732884a6ca857af54fe25fa435bdef043659722ac8b15c90c91daf5a2ced19da'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-sda', 'timestamp': '2026-01-22T00:02:23.010357', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a0ae5902-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.649560511, 'message_signature': 'c862b67d1200d9c5b20740a0c2014eab97e44d5aaa30b46f4e19bc178d18ddf8'}]}, 'timestamp': '2026-01-22 00:02:23.011811', '_unique_id': '18e34dc4b3bd4efaa259521f731b2d4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.012 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.013 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.014 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.014 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/memory.usage volume: 42.36328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afd6e40d-86b5-42c1-9c36-4e564954ab3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.36328125, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'timestamp': '2026-01-22T00:02:23.013503', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a0aec7c0-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.645379312, 'message_signature': '6842439e9db5c5ae7898326dbccc1439cca099337694e854900dc92e1f108075'}]}, 'timestamp': '2026-01-22 00:02:23.014670', '_unique_id': '5f99579a3efe47949cf53e8ab4ebf44a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.015 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.017 12 DEBUG ceilometer.compute.pollsters [-] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000054, id=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.017 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.write.bytes volume: 73019392 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.017 12 DEBUG ceilometer.compute.pollsters [-] 2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57ae5f9c-d7e8-4230-8b1e-bf1c45fe52aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73019392, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-vda', 'timestamp': '2026-01-22T00:02:23.016424', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a0af387c-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': '65aec71f6e6411ff108925a53f9fd69087cf9fb8e06ce0713307bdff6181833f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_name': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_name': None, 'resource_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9-sda', 'timestamp': '2026-01-22T00:02:23.016424', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1381246704', 'name': 'instance-00000053', 'instance_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'instance_type': 'm1.nano', 'host': '578648f6bf47c6deb4154cb0b4585dab27bdd8aa7e48881c80430019', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a0af43f8-f725-11f0-a0a4-fa163e934844', 'monotonic_time': 4715.592368437, 'message_signature': 'cefd0d13738ee079a3442b8d919227deacc2aa4dd10560fcdcb706dabea187ca'}]}, 'timestamp': '2026-01-22 00:02:23.017819', '_unique_id': 'a5a9d38cf4404b29abe6c93b8e542c9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:02:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:02:23.018 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:02:23 compute-1 nova_compute[182713]: 2026-01-22 00:02:23.226 182717 DEBUG nova.compute.manager [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:23 compute-1 nova_compute[182713]: 2026-01-22 00:02:23.227 182717 DEBUG oslo_concurrency.lockutils [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:23 compute-1 nova_compute[182713]: 2026-01-22 00:02:23.227 182717 DEBUG oslo_concurrency.lockutils [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:23 compute-1 nova_compute[182713]: 2026-01-22 00:02:23.228 182717 DEBUG oslo_concurrency.lockutils [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:23 compute-1 nova_compute[182713]: 2026-01-22 00:02:23.229 182717 DEBUG nova.compute.manager [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:23 compute-1 nova_compute[182713]: 2026-01-22 00:02:23.229 182717 WARNING nova.compute.manager [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:02:26 compute-1 nova_compute[182713]: 2026-01-22 00:02:26.854 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:27 compute-1 nova_compute[182713]: 2026-01-22 00:02:27.882 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:28 compute-1 nova_compute[182713]: 2026-01-22 00:02:28.279 182717 INFO nova.compute.manager [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Swapping old allocation on dict_keys(['39680711-70c9-4df1-ae59-25e54fac688d']) held by migration e2b58971-2cd8-417e-8990-aa80e80966b9 for instance
Jan 22 00:02:28 compute-1 nova_compute[182713]: 2026-01-22 00:02:28.323 182717 DEBUG nova.scheduler.client.report [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Overwriting current allocation {'allocations': {'5f09a77c-505f-4bd3-ac26-41f43ebdf535': {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}, 'generation': 45}}, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'consumer_generation': 1} on consumer 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Jan 22 00:02:28 compute-1 nova_compute[182713]: 2026-01-22 00:02:28.613 182717 INFO nova.network.neutron [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating port 5965ccd1-7d75-4079-ade6-e1859a860162 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 22 00:02:28 compute-1 nova_compute[182713]: 2026-01-22 00:02:28.951 182717 DEBUG nova.compute.manager [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:28 compute-1 nova_compute[182713]: 2026-01-22 00:02:28.952 182717 DEBUG oslo_concurrency.lockutils [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:28 compute-1 nova_compute[182713]: 2026-01-22 00:02:28.952 182717 DEBUG oslo_concurrency.lockutils [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:28 compute-1 nova_compute[182713]: 2026-01-22 00:02:28.953 182717 DEBUG oslo_concurrency.lockutils [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:28 compute-1 nova_compute[182713]: 2026-01-22 00:02:28.953 182717 DEBUG nova.compute.manager [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:28 compute-1 nova_compute[182713]: 2026-01-22 00:02:28.954 182717 WARNING nova.compute.manager [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:02:29 compute-1 podman[224554]: 2026-01-22 00:02:29.778396179 +0000 UTC m=+0.261996792 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 00:02:31 compute-1 nova_compute[182713]: 2026-01-22 00:02:31.104 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:31 compute-1 nova_compute[182713]: 2026-01-22 00:02:31.105 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:02:31 compute-1 nova_compute[182713]: 2026-01-22 00:02:31.857 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:32 compute-1 nova_compute[182713]: 2026-01-22 00:02:32.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:32 compute-1 nova_compute[182713]: 2026-01-22 00:02:32.859 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:32 compute-1 nova_compute[182713]: 2026-01-22 00:02:32.859 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:32 compute-1 nova_compute[182713]: 2026-01-22 00:02:32.885 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:33 compute-1 podman[224574]: 2026-01-22 00:02:33.596036056 +0000 UTC m=+0.083761884 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, release=1755695350, config_id=openstack_network_exporter)
Jan 22 00:02:33 compute-1 nova_compute[182713]: 2026-01-22 00:02:33.726 182717 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:33 compute-1 nova_compute[182713]: 2026-01-22 00:02:33.727 182717 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:33 compute-1 nova_compute[182713]: 2026-01-22 00:02:33.727 182717 DEBUG nova.network.neutron [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:02:33 compute-1 nova_compute[182713]: 2026-01-22 00:02:33.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:33 compute-1 nova_compute[182713]: 2026-01-22 00:02:33.910 182717 DEBUG nova.compute.manager [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:33 compute-1 nova_compute[182713]: 2026-01-22 00:02:33.910 182717 DEBUG nova.compute.manager [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing instance network info cache due to event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:02:33 compute-1 nova_compute[182713]: 2026-01-22 00:02:33.911 182717 DEBUG oslo_concurrency.lockutils [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:34 compute-1 nova_compute[182713]: 2026-01-22 00:02:34.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:34 compute-1 nova_compute[182713]: 2026-01-22 00:02:34.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:35 compute-1 nova_compute[182713]: 2026-01-22 00:02:35.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:35 compute-1 nova_compute[182713]: 2026-01-22 00:02:35.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.189 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.216 182717 DEBUG nova.network.neutron [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.624 182717 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.626 182717 DEBUG nova.virt.libvirt.driver [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.630 182717 DEBUG oslo_concurrency.lockutils [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.631 182717 DEBUG nova.network.neutron [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.642 182717 DEBUG nova.virt.libvirt.driver [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Start _get_guest_xml network_info=[{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.648 182717 WARNING nova.virt.libvirt.driver [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.658 182717 DEBUG nova.virt.libvirt.host [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.659 182717 DEBUG nova.virt.libvirt.host [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.667 182717 DEBUG nova.virt.libvirt.host [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.668 182717 DEBUG nova.virt.libvirt.host [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.670 182717 DEBUG nova.virt.libvirt.driver [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.670 182717 DEBUG nova.virt.hardware [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.671 182717 DEBUG nova.virt.hardware [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.672 182717 DEBUG nova.virt.hardware [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.672 182717 DEBUG nova.virt.hardware [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.672 182717 DEBUG nova.virt.hardware [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.673 182717 DEBUG nova.virt.hardware [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.673 182717 DEBUG nova.virt.hardware [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.674 182717 DEBUG nova.virt.hardware [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.674 182717 DEBUG nova.virt.hardware [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.675 182717 DEBUG nova.virt.hardware [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.675 182717 DEBUG nova.virt.hardware [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.676 182717 DEBUG nova.objects.instance [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.858 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:36 compute-1 nova_compute[182713]: 2026-01-22 00:02:36.998 182717 DEBUG oslo_concurrency.processutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.091 182717 DEBUG oslo_concurrency.processutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.094 182717 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.094 182717 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.096 182717 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.099 182717 DEBUG nova.virt.libvirt.vif [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-803720403',display_name='tempest-ServerActionsTestJSON-server-803720403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-803720403',id=84,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:02:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-zph0mh3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:02:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.100 182717 DEBUG nova.network.os_vif_util [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.102 182717 DEBUG nova.network.os_vif_util [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.108 182717 DEBUG nova.virt.libvirt.driver [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:02:37 compute-1 nova_compute[182713]:   <uuid>8009ab5e-1bf8-4d17-8ff2-b62b25eeff20</uuid>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   <name>instance-00000054</name>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerActionsTestJSON-server-803720403</nova:name>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:02:36</nova:creationTime>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:02:37 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:02:37 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:02:37 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:02:37 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:02:37 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:02:37 compute-1 nova_compute[182713]:         <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 22 00:02:37 compute-1 nova_compute[182713]:         <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:02:37 compute-1 nova_compute[182713]:         <nova:port uuid="5965ccd1-7d75-4079-ade6-e1859a860162">
Jan 22 00:02:37 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <system>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <entry name="serial">8009ab5e-1bf8-4d17-8ff2-b62b25eeff20</entry>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <entry name="uuid">8009ab5e-1bf8-4d17-8ff2-b62b25eeff20</entry>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     </system>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   <os>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   </os>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   <features>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   </features>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:c9:be:ae"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <target dev="tap5965ccd1-7d"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/console.log" append="off"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <video>
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     </video>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <input type="keyboard" bus="usb"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:02:37 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:02:37 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:02:37 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:02:37 compute-1 nova_compute[182713]: </domain>
Jan 22 00:02:37 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.110 182717 DEBUG nova.compute.manager [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Preparing to wait for external event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.111 182717 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.112 182717 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.112 182717 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.114 182717 DEBUG nova.virt.libvirt.vif [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-803720403',display_name='tempest-ServerActionsTestJSON-server-803720403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-803720403',id=84,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:02:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-zph0mh3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:02:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.114 182717 DEBUG nova.network.os_vif_util [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.116 182717 DEBUG nova.network.os_vif_util [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.116 182717 DEBUG os_vif [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.118 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.119 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.120 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.127 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.128 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5965ccd1-7d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.130 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5965ccd1-7d, col_values=(('external_ids', {'iface-id': '5965ccd1-7d75-4079-ade6-e1859a860162', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:be:ae', 'vm-uuid': '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.180 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-1 NetworkManager[54952]: <info>  [1769040157.1826] manager: (tap5965ccd1-7d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.184 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.188 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.189 182717 INFO os_vif [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d')
Jan 22 00:02:37 compute-1 kernel: tap5965ccd1-7d: entered promiscuous mode
Jan 22 00:02:37 compute-1 ovn_controller[94841]: 2026-01-22T00:02:37Z|00356|binding|INFO|Claiming lport 5965ccd1-7d75-4079-ade6-e1859a860162 for this chassis.
Jan 22 00:02:37 compute-1 ovn_controller[94841]: 2026-01-22T00:02:37Z|00357|binding|INFO|5965ccd1-7d75-4079-ade6-e1859a860162: Claiming fa:16:3e:c9:be:ae 10.100.0.4
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.297 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-1 NetworkManager[54952]: <info>  [1769040157.3012] manager: (tap5965ccd1-7d): new Tun device (/org/freedesktop/NetworkManager/Devices/176)
Jan 22 00:02:37 compute-1 ovn_controller[94841]: 2026-01-22T00:02:37Z|00358|binding|INFO|Setting lport 5965ccd1-7d75-4079-ade6-e1859a860162 ovn-installed in OVS
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.316 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.318 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-1 systemd-udevd[224613]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:02:37 compute-1 systemd-machined[153970]: New machine qemu-41-instance-00000054.
Jan 22 00:02:37 compute-1 NetworkManager[54952]: <info>  [1769040157.3474] device (tap5965ccd1-7d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:02:37 compute-1 NetworkManager[54952]: <info>  [1769040157.3493] device (tap5965ccd1-7d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:02:37 compute-1 systemd[1]: Started Virtual Machine qemu-41-instance-00000054.
Jan 22 00:02:37 compute-1 ovn_controller[94841]: 2026-01-22T00:02:37Z|00359|binding|INFO|Setting lport 5965ccd1-7d75-4079-ade6-e1859a860162 up in Southbound
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.389 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:be:ae 10.100.0.4'], port_security=['fa:16:3e:c9:be:ae 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '10', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=5965ccd1-7d75-4079-ade6-e1859a860162) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.390 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 5965ccd1-7d75-4079-ade6-e1859a860162 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.392 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.411 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[73ef97b5-aa02-4985-8015-ecda9a99ad87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.411 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.413 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.413 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4db00e43-a3fa-4c6c-a4ce-2a5a1212c07b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.414 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[62629915-a80b-40e1-af76-d3223ae6f95a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.427 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[63256545-1dbc-4c06-bdd8-99e03be5f3fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.445 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0af0fe7d-bb19-435e-b4a0-eb80e1a6866b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.480 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[de4bce99-0019-4f62-a29b-df91ef62c991]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.488 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[503a81c6-7090-4e45-aa10-115bae8a8638]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 NetworkManager[54952]: <info>  [1769040157.4907] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/177)
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.538 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[1224a3b5-99cc-4f9d-9beb-d6e73f512a09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.543 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[df8535ee-511e-4d06-a8e9-287697c5b2ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 NetworkManager[54952]: <info>  [1769040157.5760] device (tap19c3e0c8-50): carrier: link connected
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.585 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[8988513b-0b8c-45b3-8d7e-94ede768bdf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.611 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[75f0ed7f-4345-4400-a987-594bb80f030e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473022, 'reachable_time': 15481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224649, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.637 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0726c517-b4db-4403-ae19-aa1ef49f1255]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473022, 'tstamp': 473022}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224650, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.666 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a2af24e8-ce2f-40cf-b4a1-c378be474915]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473022, 'reachable_time': 15481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224651, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.712 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7f906b89-cb1f-42da-a41d-673183c12c02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.816 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[be5f97ab-16a8-46b3-bea1-3e3501ab61cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.818 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.819 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.820 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.823 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-1 NetworkManager[54952]: <info>  [1769040157.8245] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Jan 22 00:02:37 compute-1 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.827 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.829 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:37 compute-1 ovn_controller[94841]: 2026-01-22T00:02:37Z|00360|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.831 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-1 nova_compute[182713]: 2026-01-22 00:02:37.864 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.867 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.869 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fda14e14-38ab-43c5-8d42-aad1a33d76be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.871 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:02:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:37.872 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:02:38 compute-1 nova_compute[182713]: 2026-01-22 00:02:38.169 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040158.1681423, 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:38 compute-1 nova_compute[182713]: 2026-01-22 00:02:38.170 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] VM Started (Lifecycle Event)
Jan 22 00:02:38 compute-1 podman[224690]: 2026-01-22 00:02:38.288050801 +0000 UTC m=+0.040293844 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:02:38 compute-1 nova_compute[182713]: 2026-01-22 00:02:38.553 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:38 compute-1 nova_compute[182713]: 2026-01-22 00:02:38.560 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040158.1698928, 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:38 compute-1 nova_compute[182713]: 2026-01-22 00:02:38.560 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] VM Paused (Lifecycle Event)
Jan 22 00:02:38 compute-1 podman[224690]: 2026-01-22 00:02:38.580586963 +0000 UTC m=+0.332829976 container create 2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:02:38 compute-1 systemd[1]: Started libpod-conmon-2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417.scope.
Jan 22 00:02:38 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:02:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b00ebb192e47f4423fc4415f2db58c82aeeed62969b6d66891e19170721653d5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:02:38 compute-1 podman[224690]: 2026-01-22 00:02:38.768840298 +0000 UTC m=+0.521083341 container init 2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:02:38 compute-1 podman[224690]: 2026-01-22 00:02:38.7802454 +0000 UTC m=+0.532488403 container start 2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:02:38 compute-1 nova_compute[182713]: 2026-01-22 00:02:38.814 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:38 compute-1 nova_compute[182713]: 2026-01-22 00:02:38.822 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:02:38 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224706]: [NOTICE]   (224710) : New worker (224712) forked
Jan 22 00:02:38 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224706]: [NOTICE]   (224710) : Loading success.
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.210 182717 DEBUG nova.compute.manager [req-4af6ca45-6295-44fe-a037-1fe6d0d10854 req-c7ae5120-3e11-43cf-875a-a4ab4e4af83a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.211 182717 DEBUG oslo_concurrency.lockutils [req-4af6ca45-6295-44fe-a037-1fe6d0d10854 req-c7ae5120-3e11-43cf-875a-a4ab4e4af83a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.212 182717 DEBUG oslo_concurrency.lockutils [req-4af6ca45-6295-44fe-a037-1fe6d0d10854 req-c7ae5120-3e11-43cf-875a-a4ab4e4af83a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.212 182717 DEBUG oslo_concurrency.lockutils [req-4af6ca45-6295-44fe-a037-1fe6d0d10854 req-c7ae5120-3e11-43cf-875a-a4ab4e4af83a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.213 182717 DEBUG nova.compute.manager [req-4af6ca45-6295-44fe-a037-1fe6d0d10854 req-c7ae5120-3e11-43cf-875a-a4ab4e4af83a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Processing event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.215 182717 DEBUG nova.compute.manager [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.225 182717 INFO nova.virt.libvirt.driver [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance running successfully.
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.225 182717 DEBUG nova.virt.libvirt.driver [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.249 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.250 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040159.2189736, 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.250 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] VM Resumed (Lifecycle Event)
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.712 182717 DEBUG nova.network.neutron [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updated VIF entry in instance network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.713 182717 DEBUG nova.network.neutron [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.726 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.730 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.763 182717 INFO nova.compute.manager [None req-d1786a2c-2481-42a8-a1d4-9321495cdc0e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Pausing
Jan 22 00:02:39 compute-1 nova_compute[182713]: 2026-01-22 00:02:39.765 182717 DEBUG nova.objects.instance [None req-d1786a2c-2481-42a8-a1d4-9321495cdc0e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'flavor' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:40 compute-1 nova_compute[182713]: 2026-01-22 00:02:40.199 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 22 00:02:40 compute-1 nova_compute[182713]: 2026-01-22 00:02:40.200 182717 DEBUG oslo_concurrency.lockutils [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:40 compute-1 nova_compute[182713]: 2026-01-22 00:02:40.201 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:40 compute-1 nova_compute[182713]: 2026-01-22 00:02:40.201 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:02:40 compute-1 nova_compute[182713]: 2026-01-22 00:02:40.653 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:40.653 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:02:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:40.656 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:02:41 compute-1 nova_compute[182713]: 2026-01-22 00:02:41.777 182717 INFO nova.compute.manager [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance to original state: 'active'
Jan 22 00:02:41 compute-1 nova_compute[182713]: 2026-01-22 00:02:41.860 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:41 compute-1 nova_compute[182713]: 2026-01-22 00:02:41.951 182717 DEBUG nova.compute.manager [req-4c154a84-3e80-47d0-b703-9c02de796467 req-5166357a-22e5-47a5-ab3f-293f9ca9b6b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:41 compute-1 nova_compute[182713]: 2026-01-22 00:02:41.952 182717 DEBUG oslo_concurrency.lockutils [req-4c154a84-3e80-47d0-b703-9c02de796467 req-5166357a-22e5-47a5-ab3f-293f9ca9b6b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:41 compute-1 nova_compute[182713]: 2026-01-22 00:02:41.952 182717 DEBUG oslo_concurrency.lockutils [req-4c154a84-3e80-47d0-b703-9c02de796467 req-5166357a-22e5-47a5-ab3f-293f9ca9b6b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:41 compute-1 nova_compute[182713]: 2026-01-22 00:02:41.953 182717 DEBUG oslo_concurrency.lockutils [req-4c154a84-3e80-47d0-b703-9c02de796467 req-5166357a-22e5-47a5-ab3f-293f9ca9b6b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:41 compute-1 nova_compute[182713]: 2026-01-22 00:02:41.953 182717 DEBUG nova.compute.manager [req-4c154a84-3e80-47d0-b703-9c02de796467 req-5166357a-22e5-47a5-ab3f-293f9ca9b6b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:41 compute-1 nova_compute[182713]: 2026-01-22 00:02:41.953 182717 WARNING nova.compute.manager [req-4c154a84-3e80-47d0-b703-9c02de796467 req-5166357a-22e5-47a5-ab3f-293f9ca9b6b9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:02:42 compute-1 nova_compute[182713]: 2026-01-22 00:02:42.056 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040162.0566497, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:42 compute-1 nova_compute[182713]: 2026-01-22 00:02:42.057 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Paused (Lifecycle Event)
Jan 22 00:02:42 compute-1 nova_compute[182713]: 2026-01-22 00:02:42.060 182717 DEBUG nova.compute.manager [None req-d1786a2c-2481-42a8-a1d4-9321495cdc0e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:42 compute-1 nova_compute[182713]: 2026-01-22 00:02:42.180 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:42 compute-1 nova_compute[182713]: 2026-01-22 00:02:42.282 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:42 compute-1 nova_compute[182713]: 2026-01-22 00:02:42.286 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:02:42 compute-1 ovn_controller[94841]: 2026-01-22T00:02:42Z|00361|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 22 00:02:42 compute-1 ovn_controller[94841]: 2026-01-22T00:02:42Z|00362|binding|INFO|Releasing lport f7f4d7e4-9841-41f2-85bd-658a3b613e0d from this chassis (sb_readonly=0)
Jan 22 00:02:42 compute-1 nova_compute[182713]: 2026-01-22 00:02:42.519 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:43 compute-1 podman[224723]: 2026-01-22 00:02:43.613322134 +0000 UTC m=+0.081155653 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:02:43 compute-1 podman[224722]: 2026-01-22 00:02:43.636794839 +0000 UTC m=+0.111138660 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:02:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:44.659 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:46 compute-1 nova_compute[182713]: 2026-01-22 00:02:46.863 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.183 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.508 182717 INFO nova.compute.manager [None req-2c7ad546-b23d-4c74-a073-549ba10043d2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Unpausing
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.509 182717 DEBUG nova.objects.instance [None req-2c7ad546-b23d-4c74-a073-549ba10043d2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'flavor' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.550 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040167.5506396, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.551 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Resumed (Lifecycle Event)
Jan 22 00:02:47 compute-1 virtqemud[182235]: argument unsupported: QEMU guest agent is not configured
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.558 182717 DEBUG nova.virt.libvirt.guest [None req-2c7ad546-b23d-4c74-a073-549ba10043d2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.559 182717 DEBUG nova.compute.manager [None req-2c7ad546-b23d-4c74-a073-549ba10043d2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.591 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.598 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.636 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] During sync_power_state the instance has a pending task (unpausing). Skip.
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.716 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.740 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.741 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.742 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.794 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.796 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.797 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.797 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.892 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.967 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:47 compute-1 nova_compute[182713]: 2026-01-22 00:02:47.969 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.045 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.056 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.129 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.133 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.209 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.437 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.439 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5355MB free_disk=73.24593353271484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.439 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.439 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.635 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.636 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.636 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.636 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.749 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.772 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.775 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.776 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.777 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:48 compute-1 nova_compute[182713]: 2026-01-22 00:02:48.777 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.088 182717 DEBUG oslo_concurrency.lockutils [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.089 182717 DEBUG oslo_concurrency.lockutils [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.089 182717 DEBUG oslo_concurrency.lockutils [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.089 182717 DEBUG oslo_concurrency.lockutils [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.090 182717 DEBUG oslo_concurrency.lockutils [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.106 182717 INFO nova.compute.manager [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Terminating instance
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.164 182717 DEBUG nova.compute.manager [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:02:49 compute-1 kernel: tap5965ccd1-7d (unregistering): left promiscuous mode
Jan 22 00:02:49 compute-1 NetworkManager[54952]: <info>  [1769040169.1920] device (tap5965ccd1-7d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:02:49 compute-1 ovn_controller[94841]: 2026-01-22T00:02:49Z|00363|binding|INFO|Releasing lport 5965ccd1-7d75-4079-ade6-e1859a860162 from this chassis (sb_readonly=0)
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.199 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:49 compute-1 ovn_controller[94841]: 2026-01-22T00:02:49Z|00364|binding|INFO|Setting lport 5965ccd1-7d75-4079-ade6-e1859a860162 down in Southbound
Jan 22 00:02:49 compute-1 ovn_controller[94841]: 2026-01-22T00:02:49Z|00365|binding|INFO|Removing iface tap5965ccd1-7d ovn-installed in OVS
Jan 22 00:02:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:49.212 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:be:ae 10.100.0.4'], port_security=['fa:16:3e:c9:be:ae 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '12', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=5965ccd1-7d75-4079-ade6-e1859a860162) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.213 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:49.216 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 5965ccd1-7d75-4079-ade6-e1859a860162 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis
Jan 22 00:02:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:49.220 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:02:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:49.222 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[725f2389-4c96-4ae1-93b6-a1f306615e5c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:49.224 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore
Jan 22 00:02:49 compute-1 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000054.scope: Deactivated successfully.
Jan 22 00:02:49 compute-1 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000054.scope: Consumed 10.987s CPU time.
Jan 22 00:02:49 compute-1 systemd-machined[153970]: Machine qemu-41-instance-00000054 terminated.
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.430 182717 INFO nova.virt.libvirt.driver [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance destroyed successfully.
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.432 182717 DEBUG nova.objects.instance [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.451 182717 DEBUG nova.virt.libvirt.vif [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-803720403',display_name='tempest-ServerActionsTestJSON-server-803720403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-803720403',id=84,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:02:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-zph0mh3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:02:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.451 182717 DEBUG nova.network.os_vif_util [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.453 182717 DEBUG nova.network.os_vif_util [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.455 182717 DEBUG os_vif [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.458 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.459 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5965ccd1-7d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.462 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.463 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.470 182717 INFO os_vif [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d')
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.471 182717 INFO nova.virt.libvirt.driver [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Deleting instance files /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_del
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.476 182717 INFO nova.virt.libvirt.driver [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Deletion of /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_del complete
Jan 22 00:02:49 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224706]: [NOTICE]   (224710) : haproxy version is 2.8.14-c23fe91
Jan 22 00:02:49 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224706]: [NOTICE]   (224710) : path to executable is /usr/sbin/haproxy
Jan 22 00:02:49 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224706]: [WARNING]  (224710) : Exiting Master process...
Jan 22 00:02:49 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224706]: [ALERT]    (224710) : Current worker (224712) exited with code 143 (Terminated)
Jan 22 00:02:49 compute-1 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224706]: [WARNING]  (224710) : All workers exited. Exiting... (0)
Jan 22 00:02:49 compute-1 systemd[1]: libpod-2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417.scope: Deactivated successfully.
Jan 22 00:02:49 compute-1 podman[224808]: 2026-01-22 00:02:49.553194083 +0000 UTC m=+0.211172733 container died 2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.594 182717 INFO nova.compute.manager [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.594 182717 DEBUG oslo.service.loopingcall [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.595 182717 DEBUG nova.compute.manager [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:02:49 compute-1 nova_compute[182713]: 2026-01-22 00:02:49.595 182717 DEBUG nova.network.neutron [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:02:50 compute-1 systemd[1]: var-lib-containers-storage-overlay-b00ebb192e47f4423fc4415f2db58c82aeeed62969b6d66891e19170721653d5-merged.mount: Deactivated successfully.
Jan 22 00:02:50 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417-userdata-shm.mount: Deactivated successfully.
Jan 22 00:02:50 compute-1 podman[224852]: 2026-01-22 00:02:50.090947597 +0000 UTC m=+0.311209859 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:02:50 compute-1 podman[224851]: 2026-01-22 00:02:50.107157167 +0000 UTC m=+0.337125727 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 00:02:50 compute-1 podman[224808]: 2026-01-22 00:02:50.203792148 +0000 UTC m=+0.861770778 container cleanup 2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:02:50 compute-1 systemd[1]: libpod-conmon-2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417.scope: Deactivated successfully.
Jan 22 00:02:50 compute-1 nova_compute[182713]: 2026-01-22 00:02:50.439 182717 DEBUG nova.compute.manager [req-33760388-b11c-48e6-bed6-435c852d0387 req-c0f4507e-09d5-46d3-94e6-ff20e4b287cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:50 compute-1 nova_compute[182713]: 2026-01-22 00:02:50.439 182717 DEBUG oslo_concurrency.lockutils [req-33760388-b11c-48e6-bed6-435c852d0387 req-c0f4507e-09d5-46d3-94e6-ff20e4b287cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:50 compute-1 nova_compute[182713]: 2026-01-22 00:02:50.440 182717 DEBUG oslo_concurrency.lockutils [req-33760388-b11c-48e6-bed6-435c852d0387 req-c0f4507e-09d5-46d3-94e6-ff20e4b287cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:50 compute-1 nova_compute[182713]: 2026-01-22 00:02:50.440 182717 DEBUG oslo_concurrency.lockutils [req-33760388-b11c-48e6-bed6-435c852d0387 req-c0f4507e-09d5-46d3-94e6-ff20e4b287cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:50 compute-1 nova_compute[182713]: 2026-01-22 00:02:50.441 182717 DEBUG nova.compute.manager [req-33760388-b11c-48e6-bed6-435c852d0387 req-c0f4507e-09d5-46d3-94e6-ff20e4b287cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:50 compute-1 nova_compute[182713]: 2026-01-22 00:02:50.441 182717 DEBUG nova.compute.manager [req-33760388-b11c-48e6-bed6-435c852d0387 req-c0f4507e-09d5-46d3-94e6-ff20e4b287cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:02:50 compute-1 podman[224899]: 2026-01-22 00:02:50.56930201 +0000 UTC m=+0.333072323 container remove 2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:02:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:50.577 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d7467eda-75b9-4544-9b1f-de0c6dd029ff]: (4, ('Thu Jan 22 12:02:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417)\n2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417\nThu Jan 22 12:02:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417)\n2a5c60588b10c95aceec6103de311aad25924495835801eae27d247e81bdb417\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:50.580 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3ad53167-b834-4ee8-8a85-adea96a13ba4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:50.581 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:50 compute-1 nova_compute[182713]: 2026-01-22 00:02:50.584 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:50 compute-1 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 22 00:02:50 compute-1 nova_compute[182713]: 2026-01-22 00:02:50.587 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:50.590 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4f3d583c-8299-4b78-9a93-1faa0a43de7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:50 compute-1 nova_compute[182713]: 2026-01-22 00:02:50.601 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:50.617 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ced27e94-7534-4e0d-bd9e-f1610bf74434]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:50.618 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7b21ea55-d6e8-48ec-8da4-30d12d3c0209]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:50.638 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2257f688-9ea2-4042-97bb-a4141db82425]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473012, 'reachable_time': 40395, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224915, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:50.643 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:02:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:02:50.643 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[05f693d7-c233-49ca-bfeb-5a3dcd65bc7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:50 compute-1 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 22 00:02:51 compute-1 nova_compute[182713]: 2026-01-22 00:02:51.187 182717 DEBUG nova.network.neutron [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:51 compute-1 nova_compute[182713]: 2026-01-22 00:02:51.206 182717 INFO nova.compute.manager [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Took 1.61 seconds to deallocate network for instance.
Jan 22 00:02:51 compute-1 nova_compute[182713]: 2026-01-22 00:02:51.300 182717 DEBUG oslo_concurrency.lockutils [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:51 compute-1 nova_compute[182713]: 2026-01-22 00:02:51.301 182717 DEBUG oslo_concurrency.lockutils [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:51 compute-1 nova_compute[182713]: 2026-01-22 00:02:51.328 182717 DEBUG nova.compute.manager [req-0d1f2287-edea-4936-a5be-5d8e9a939bd7 req-11a12383-86c9-461f-8b19-75535ce923ed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-deleted-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:51 compute-1 nova_compute[182713]: 2026-01-22 00:02:51.401 182717 DEBUG nova.compute.provider_tree [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:02:51 compute-1 nova_compute[182713]: 2026-01-22 00:02:51.421 182717 DEBUG nova.scheduler.client.report [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:02:51 compute-1 nova_compute[182713]: 2026-01-22 00:02:51.445 182717 DEBUG oslo_concurrency.lockutils [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:51 compute-1 nova_compute[182713]: 2026-01-22 00:02:51.488 182717 INFO nova.scheduler.client.report [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Deleted allocations for instance 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20
Jan 22 00:02:51 compute-1 nova_compute[182713]: 2026-01-22 00:02:51.598 182717 DEBUG oslo_concurrency.lockutils [None req-ca40f472-a002-4980-aeaa-5de9b12da47d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:51 compute-1 nova_compute[182713]: 2026-01-22 00:02:51.865 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:52 compute-1 nova_compute[182713]: 2026-01-22 00:02:52.675 182717 DEBUG nova.compute.manager [req-47339f33-f72d-4651-9208-509a4bab1c8a req-456d1a60-6e56-41c9-9713-2b8e5e0f8ece 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:52 compute-1 nova_compute[182713]: 2026-01-22 00:02:52.675 182717 DEBUG oslo_concurrency.lockutils [req-47339f33-f72d-4651-9208-509a4bab1c8a req-456d1a60-6e56-41c9-9713-2b8e5e0f8ece 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:52 compute-1 nova_compute[182713]: 2026-01-22 00:02:52.676 182717 DEBUG oslo_concurrency.lockutils [req-47339f33-f72d-4651-9208-509a4bab1c8a req-456d1a60-6e56-41c9-9713-2b8e5e0f8ece 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:52 compute-1 nova_compute[182713]: 2026-01-22 00:02:52.676 182717 DEBUG oslo_concurrency.lockutils [req-47339f33-f72d-4651-9208-509a4bab1c8a req-456d1a60-6e56-41c9-9713-2b8e5e0f8ece 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:52 compute-1 nova_compute[182713]: 2026-01-22 00:02:52.676 182717 DEBUG nova.compute.manager [req-47339f33-f72d-4651-9208-509a4bab1c8a req-456d1a60-6e56-41c9-9713-2b8e5e0f8ece 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:52 compute-1 nova_compute[182713]: 2026-01-22 00:02:52.677 182717 WARNING nova.compute.manager [req-47339f33-f72d-4651-9208-509a4bab1c8a req-456d1a60-6e56-41c9-9713-2b8e5e0f8ece 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state deleted and task_state None.
Jan 22 00:02:54 compute-1 nova_compute[182713]: 2026-01-22 00:02:54.462 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:56 compute-1 nova_compute[182713]: 2026-01-22 00:02:56.867 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:59 compute-1 nova_compute[182713]: 2026-01-22 00:02:59.464 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:00 compute-1 podman[224916]: 2026-01-22 00:03:00.59492253 +0000 UTC m=+0.080195457 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:03:01 compute-1 nova_compute[182713]: 2026-01-22 00:03:01.870 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:03.010 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:03.010 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:03.011 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:04 compute-1 nova_compute[182713]: 2026-01-22 00:03:04.429 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040169.4273152, 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:03:04 compute-1 nova_compute[182713]: 2026-01-22 00:03:04.430 182717 INFO nova.compute.manager [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] VM Stopped (Lifecycle Event)
Jan 22 00:03:04 compute-1 nova_compute[182713]: 2026-01-22 00:03:04.467 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:04 compute-1 nova_compute[182713]: 2026-01-22 00:03:04.513 182717 DEBUG nova.compute.manager [None req-772bcc29-9459-405c-b5b7-1c0df38c7651 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:03:04 compute-1 podman[224936]: 2026-01-22 00:03:04.608994203 +0000 UTC m=+0.099660158 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal)
Jan 22 00:03:06 compute-1 nova_compute[182713]: 2026-01-22 00:03:06.850 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:06 compute-1 nova_compute[182713]: 2026-01-22 00:03:06.872 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:09 compute-1 nova_compute[182713]: 2026-01-22 00:03:09.469 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:11 compute-1 nova_compute[182713]: 2026-01-22 00:03:11.874 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:14 compute-1 nova_compute[182713]: 2026-01-22 00:03:14.471 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:14 compute-1 podman[224959]: 2026-01-22 00:03:14.581985411 +0000 UTC m=+0.067396892 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:03:14 compute-1 podman[224958]: 2026-01-22 00:03:14.601281956 +0000 UTC m=+0.090090422 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:03:15 compute-1 ovn_controller[94841]: 2026-01-22T00:03:15Z|00366|binding|INFO|Releasing lport f7f4d7e4-9841-41f2-85bd-658a3b613e0d from this chassis (sb_readonly=0)
Jan 22 00:03:15 compute-1 nova_compute[182713]: 2026-01-22 00:03:15.264 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:15 compute-1 ovn_controller[94841]: 2026-01-22T00:03:15Z|00367|binding|INFO|Releasing lport f7f4d7e4-9841-41f2-85bd-658a3b613e0d from this chassis (sb_readonly=0)
Jan 22 00:03:15 compute-1 nova_compute[182713]: 2026-01-22 00:03:15.444 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:16 compute-1 nova_compute[182713]: 2026-01-22 00:03:16.911 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:19.277 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:03:19 compute-1 nova_compute[182713]: 2026-01-22 00:03:19.278 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:19.280 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:03:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:19.282 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:03:19 compute-1 nova_compute[182713]: 2026-01-22 00:03:19.473 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:20 compute-1 podman[225007]: 2026-01-22 00:03:20.594763658 +0000 UTC m=+0.073469440 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:03:20 compute-1 podman[225006]: 2026-01-22 00:03:20.597568935 +0000 UTC m=+0.082655333 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 22 00:03:21 compute-1 nova_compute[182713]: 2026-01-22 00:03:21.914 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:24 compute-1 nova_compute[182713]: 2026-01-22 00:03:24.476 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:26 compute-1 nova_compute[182713]: 2026-01-22 00:03:26.916 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:29 compute-1 nova_compute[182713]: 2026-01-22 00:03:29.477 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:31 compute-1 podman[225050]: 2026-01-22 00:03:31.60455573 +0000 UTC m=+0.091089374 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute)
Jan 22 00:03:31 compute-1 nova_compute[182713]: 2026-01-22 00:03:31.919 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:33 compute-1 nova_compute[182713]: 2026-01-22 00:03:33.917 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:33 compute-1 nova_compute[182713]: 2026-01-22 00:03:33.918 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:33 compute-1 nova_compute[182713]: 2026-01-22 00:03:33.918 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:33 compute-1 nova_compute[182713]: 2026-01-22 00:03:33.919 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:33 compute-1 nova_compute[182713]: 2026-01-22 00:03:33.919 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:33 compute-1 nova_compute[182713]: 2026-01-22 00:03:33.919 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:03:34 compute-1 nova_compute[182713]: 2026-01-22 00:03:34.479 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:34 compute-1 nova_compute[182713]: 2026-01-22 00:03:34.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:35 compute-1 podman[225070]: 2026-01-22 00:03:35.631297723 +0000 UTC m=+0.117113646 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 00:03:36 compute-1 nova_compute[182713]: 2026-01-22 00:03:36.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:36 compute-1 nova_compute[182713]: 2026-01-22 00:03:36.922 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:37 compute-1 nova_compute[182713]: 2026-01-22 00:03:37.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:38 compute-1 nova_compute[182713]: 2026-01-22 00:03:38.271 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:38 compute-1 nova_compute[182713]: 2026-01-22 00:03:38.272 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:03:38 compute-1 nova_compute[182713]: 2026-01-22 00:03:38.272 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:03:39 compute-1 nova_compute[182713]: 2026-01-22 00:03:39.482 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:41 compute-1 nova_compute[182713]: 2026-01-22 00:03:41.097 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:03:41 compute-1 nova_compute[182713]: 2026-01-22 00:03:41.098 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:03:41 compute-1 nova_compute[182713]: 2026-01-22 00:03:41.098 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:03:41 compute-1 nova_compute[182713]: 2026-01-22 00:03:41.099 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:03:41 compute-1 nova_compute[182713]: 2026-01-22 00:03:41.925 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:44 compute-1 nova_compute[182713]: 2026-01-22 00:03:44.484 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:45 compute-1 podman[225092]: 2026-01-22 00:03:45.589303179 +0000 UTC m=+0.074399338 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:03:45 compute-1 podman[225091]: 2026-01-22 00:03:45.616744946 +0000 UTC m=+0.112554106 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:03:46 compute-1 nova_compute[182713]: 2026-01-22 00:03:46.927 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:47 compute-1 nova_compute[182713]: 2026-01-22 00:03:47.836 182717 DEBUG oslo_concurrency.lockutils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:47 compute-1 nova_compute[182713]: 2026-01-22 00:03:47.837 182717 DEBUG oslo_concurrency.lockutils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:47 compute-1 nova_compute[182713]: 2026-01-22 00:03:47.838 182717 INFO nova.compute.manager [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Shelving
Jan 22 00:03:49 compute-1 nova_compute[182713]: 2026-01-22 00:03:49.485 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:51 compute-1 podman[225141]: 2026-01-22 00:03:51.58210215 +0000 UTC m=+0.072480790 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:03:51 compute-1 podman[225142]: 2026-01-22 00:03:51.600198248 +0000 UTC m=+0.076803012 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:03:51 compute-1 nova_compute[182713]: 2026-01-22 00:03:51.930 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:51 compute-1 nova_compute[182713]: 2026-01-22 00:03:51.957 182717 DEBUG nova.virt.libvirt.driver [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:03:54 compute-1 kernel: tap8412a083-ca (unregistering): left promiscuous mode
Jan 22 00:03:54 compute-1 NetworkManager[54952]: <info>  [1769040234.1806] device (tap8412a083-ca): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:03:54 compute-1 ovn_controller[94841]: 2026-01-22T00:03:54Z|00368|binding|INFO|Releasing lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f from this chassis (sb_readonly=0)
Jan 22 00:03:54 compute-1 ovn_controller[94841]: 2026-01-22T00:03:54Z|00369|binding|INFO|Setting lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f down in Southbound
Jan 22 00:03:54 compute-1 nova_compute[182713]: 2026-01-22 00:03:54.216 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:54 compute-1 ovn_controller[94841]: 2026-01-22T00:03:54Z|00370|binding|INFO|Removing iface tap8412a083-ca ovn-installed in OVS
Jan 22 00:03:54 compute-1 nova_compute[182713]: 2026-01-22 00:03:54.218 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:54 compute-1 nova_compute[182713]: 2026-01-22 00:03:54.227 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:54 compute-1 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000053.scope: Deactivated successfully.
Jan 22 00:03:54 compute-1 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000053.scope: Consumed 19.405s CPU time.
Jan 22 00:03:54 compute-1 systemd-machined[153970]: Machine qemu-40-instance-00000053 terminated.
Jan 22 00:03:54 compute-1 nova_compute[182713]: 2026-01-22 00:03:54.461 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:54 compute-1 nova_compute[182713]: 2026-01-22 00:03:54.471 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:54 compute-1 nova_compute[182713]: 2026-01-22 00:03:54.486 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:54 compute-1 nova_compute[182713]: 2026-01-22 00:03:54.977 182717 INFO nova.virt.libvirt.driver [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance shutdown successfully after 3 seconds.
Jan 22 00:03:54 compute-1 nova_compute[182713]: 2026-01-22 00:03:54.985 182717 INFO nova.virt.libvirt.driver [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance destroyed successfully.
Jan 22 00:03:54 compute-1 nova_compute[182713]: 2026-01-22 00:03:54.986 182717 DEBUG nova.objects.instance [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.267 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:ee:91 10.100.0.3'], port_security=['fa:16:3e:e0:ee:91 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5fb84efc-d0d8-44ae-84e4-97e70d8c202e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10175545-8ba8-4bcf-9e15-f460a54818aa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.268 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f in datapath 397ba44b-e27b-4a2a-a10b-7de0daa31656 unbound from our chassis
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.270 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 397ba44b-e27b-4a2a-a10b-7de0daa31656, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.273 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ba452b-0366-4026-a966-474bbb603871]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.274 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 namespace which is not needed anymore
Jan 22 00:03:56 compute-1 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[224117]: [NOTICE]   (224121) : haproxy version is 2.8.14-c23fe91
Jan 22 00:03:56 compute-1 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[224117]: [NOTICE]   (224121) : path to executable is /usr/sbin/haproxy
Jan 22 00:03:56 compute-1 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[224117]: [WARNING]  (224121) : Exiting Master process...
Jan 22 00:03:56 compute-1 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[224117]: [ALERT]    (224121) : Current worker (224123) exited with code 143 (Terminated)
Jan 22 00:03:56 compute-1 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[224117]: [WARNING]  (224121) : All workers exited. Exiting... (0)
Jan 22 00:03:56 compute-1 systemd[1]: libpod-533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582.scope: Deactivated successfully.
Jan 22 00:03:56 compute-1 podman[225231]: 2026-01-22 00:03:56.465315579 +0000 UTC m=+0.062328735 container died 533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:03:56 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582-userdata-shm.mount: Deactivated successfully.
Jan 22 00:03:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-5c7599a98e1976189a53b952a2523db361af4489a4e76162578e5b2991b81629-merged.mount: Deactivated successfully.
Jan 22 00:03:56 compute-1 podman[225231]: 2026-01-22 00:03:56.510781893 +0000 UTC m=+0.107795079 container cleanup 533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:03:56 compute-1 systemd[1]: libpod-conmon-533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582.scope: Deactivated successfully.
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.584 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updating instance_info_cache with network_info: [{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:03:56 compute-1 podman[225263]: 2026-01-22 00:03:56.596085677 +0000 UTC m=+0.050600153 container remove 533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.603 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[014a2f72-e50b-4b0c-be90-77d5e37d8e5a]: (4, ('Thu Jan 22 12:03:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 (533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582)\n533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582\nThu Jan 22 12:03:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 (533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582)\n533c90b1f8c293c688d73e9a05f500652ddb8a05d7a63d927a0c94fe78432582\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.605 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c55e4061-b886-4957-b1ba-3c3c70e2dcda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.606 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap397ba44b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.622 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.623 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.623 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.652 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:56 compute-1 kernel: tap397ba44b-e0: left promiscuous mode
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.668 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.672 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[92181f7b-b552-4769-a350-87b29b6c5ef9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.676 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.677 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.678 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.678 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.690 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e4d41627-dccc-460c-9a78-fc0bd2431ca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.692 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2ddffc-84a2-4cfe-af79-1331aa22e8d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.721 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[db79bdce-a31c-46dc-adc5-a8c2581a3984]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465585, 'reachable_time': 43157, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225283, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:56 compute-1 systemd[1]: run-netns-ovnmeta\x2d397ba44b\x2de27b\x2d4a2a\x2da10b\x2d7de0daa31656.mount: Deactivated successfully.
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.726 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.727 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[84b97588-f326-4f03-927d-a2e305955946]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.867 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.932 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.962 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.964 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.989 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.990 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.991 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:03:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:03:56.993 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.997 182717 DEBUG nova.compute.manager [req-f901aa35-e49f-4ecf-b77b-6d559b088f0e req-b216b669-3489-4f36-82ac-1e0ead7618ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-unplugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.998 182717 DEBUG oslo_concurrency.lockutils [req-f901aa35-e49f-4ecf-b77b-6d559b088f0e req-b216b669-3489-4f36-82ac-1e0ead7618ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.998 182717 DEBUG oslo_concurrency.lockutils [req-f901aa35-e49f-4ecf-b77b-6d559b088f0e req-b216b669-3489-4f36-82ac-1e0ead7618ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.998 182717 DEBUG oslo_concurrency.lockutils [req-f901aa35-e49f-4ecf-b77b-6d559b088f0e req-b216b669-3489-4f36-82ac-1e0ead7618ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.999 182717 DEBUG nova.compute.manager [req-f901aa35-e49f-4ecf-b77b-6d559b088f0e req-b216b669-3489-4f36-82ac-1e0ead7618ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] No waiting events found dispatching network-vif-unplugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:03:56 compute-1 nova_compute[182713]: 2026-01-22 00:03:56.999 182717 WARNING nova.compute.manager [req-f901aa35-e49f-4ecf-b77b-6d559b088f0e req-b216b669-3489-4f36-82ac-1e0ead7618ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received unexpected event network-vif-unplugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for instance with vm_state active and task_state shelving.
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.044 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.226 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.229 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5740MB free_disk=73.27495574951172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.229 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.230 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.365 182717 INFO nova.virt.libvirt.driver [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Beginning cold snapshot process
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.536 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.537 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.538 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.615 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.653 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.696 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.696 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.467s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.701 182717 DEBUG nova.privsep.utils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 00:03:57 compute-1 nova_compute[182713]: 2026-01-22 00:03:57.702 182717 DEBUG oslo_concurrency.processutils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk /var/lib/nova/instances/snapshots/tmpoz75hsss/ed874cf24c2f4c5fa1cb9e1f28515ee1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:03:58 compute-1 nova_compute[182713]: 2026-01-22 00:03:58.187 182717 DEBUG oslo_concurrency.processutils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk /var/lib/nova/instances/snapshots/tmpoz75hsss/ed874cf24c2f4c5fa1cb9e1f28515ee1" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:03:58 compute-1 nova_compute[182713]: 2026-01-22 00:03:58.188 182717 INFO nova.virt.libvirt.driver [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Snapshot extracted, beginning image upload
Jan 22 00:03:59 compute-1 nova_compute[182713]: 2026-01-22 00:03:59.268 182717 DEBUG nova.compute.manager [req-9cb99d87-95bd-4d5f-8724-f7edfdc4e733 req-9a2a63b1-e240-4bdd-bb00-df155896162c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:03:59 compute-1 nova_compute[182713]: 2026-01-22 00:03:59.269 182717 DEBUG oslo_concurrency.lockutils [req-9cb99d87-95bd-4d5f-8724-f7edfdc4e733 req-9a2a63b1-e240-4bdd-bb00-df155896162c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:59 compute-1 nova_compute[182713]: 2026-01-22 00:03:59.269 182717 DEBUG oslo_concurrency.lockutils [req-9cb99d87-95bd-4d5f-8724-f7edfdc4e733 req-9a2a63b1-e240-4bdd-bb00-df155896162c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:59 compute-1 nova_compute[182713]: 2026-01-22 00:03:59.269 182717 DEBUG oslo_concurrency.lockutils [req-9cb99d87-95bd-4d5f-8724-f7edfdc4e733 req-9a2a63b1-e240-4bdd-bb00-df155896162c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:59 compute-1 nova_compute[182713]: 2026-01-22 00:03:59.269 182717 DEBUG nova.compute.manager [req-9cb99d87-95bd-4d5f-8724-f7edfdc4e733 req-9a2a63b1-e240-4bdd-bb00-df155896162c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] No waiting events found dispatching network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:03:59 compute-1 nova_compute[182713]: 2026-01-22 00:03:59.270 182717 WARNING nova.compute.manager [req-9cb99d87-95bd-4d5f-8724-f7edfdc4e733 req-9a2a63b1-e240-4bdd-bb00-df155896162c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received unexpected event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for instance with vm_state active and task_state shelving_image_uploading.
Jan 22 00:03:59 compute-1 nova_compute[182713]: 2026-01-22 00:03:59.487 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:01 compute-1 nova_compute[182713]: 2026-01-22 00:04:01.935 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:02 compute-1 nova_compute[182713]: 2026-01-22 00:04:02.370 182717 INFO nova.virt.libvirt.driver [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Snapshot image upload complete
Jan 22 00:04:02 compute-1 nova_compute[182713]: 2026-01-22 00:04:02.371 182717 DEBUG nova.compute.manager [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:02 compute-1 nova_compute[182713]: 2026-01-22 00:04:02.495 182717 INFO nova.compute.manager [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Shelve offloading
Jan 22 00:04:02 compute-1 nova_compute[182713]: 2026-01-22 00:04:02.517 182717 INFO nova.virt.libvirt.driver [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance destroyed successfully.
Jan 22 00:04:02 compute-1 nova_compute[182713]: 2026-01-22 00:04:02.517 182717 DEBUG nova.compute.manager [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:02 compute-1 nova_compute[182713]: 2026-01-22 00:04:02.520 182717 DEBUG oslo_concurrency.lockutils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:04:02 compute-1 nova_compute[182713]: 2026-01-22 00:04:02.520 182717 DEBUG oslo_concurrency.lockutils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquired lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:04:02 compute-1 nova_compute[182713]: 2026-01-22 00:04:02.521 182717 DEBUG nova.network.neutron [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:04:02 compute-1 podman[225301]: 2026-01-22 00:04:02.62298283 +0000 UTC m=+0.091400744 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:04:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:03.011 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:03.011 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:03.011 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:04 compute-1 nova_compute[182713]: 2026-01-22 00:04:04.490 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:04 compute-1 nova_compute[182713]: 2026-01-22 00:04:04.729 182717 DEBUG nova.network.neutron [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updating instance_info_cache with network_info: [{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:04:04 compute-1 nova_compute[182713]: 2026-01-22 00:04:04.763 182717 DEBUG oslo_concurrency.lockutils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Releasing lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:04:06 compute-1 podman[225321]: 2026-01-22 00:04:06.612425422 +0000 UTC m=+0.099088881 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Jan 22 00:04:06 compute-1 nova_compute[182713]: 2026-01-22 00:04:06.937 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.290 182717 INFO nova.virt.libvirt.driver [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance destroyed successfully.
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.292 182717 DEBUG nova.objects.instance [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'resources' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.320 182717 DEBUG nova.virt.libvirt.vif [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1381246704',display_name='tempest-ServersNegativeTestJSON-server-1381246704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1381246704',id=83,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:01:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='a7e425a4d1854533a17d5f0dcd9d87b9',ramdisk_id='',reservation_id='r-37fe06tc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1689661',owner_user_name='tempest-ServersNegativeTestJSON-1689661-project-member',shelved_at='2026-01-22T00:04:02.371491',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='bb239f4f-83bb-4009-8354-2ec967a0da2a'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:03:58Z,user_data=None,user_id='531ec5a088a94b78af6e2c3feda17c0c',uuid=2cb6e3d6-f22a-49ea-aab8-900dd88605e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.321 182717 DEBUG nova.network.os_vif_util [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converting VIF {"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.322 182717 DEBUG nova.network.os_vif_util [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.322 182717 DEBUG os_vif [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.324 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.324 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8412a083-ca, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.326 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.328 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.334 182717 INFO os_vif [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca')
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.334 182717 INFO nova.virt.libvirt.driver [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Deleting instance files /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9_del
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.340 182717 INFO nova.virt.libvirt.driver [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Deletion of /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9_del complete
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.545 182717 INFO nova.scheduler.client.report [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Deleted allocations for instance 2cb6e3d6-f22a-49ea-aab8-900dd88605e9
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.638 182717 DEBUG oslo_concurrency.lockutils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.639 182717 DEBUG oslo_concurrency.lockutils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.678 182717 DEBUG nova.compute.provider_tree [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.704 182717 DEBUG nova.scheduler.client.report [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.732 182717 DEBUG oslo_concurrency.lockutils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:08 compute-1 nova_compute[182713]: 2026-01-22 00:04:08.851 182717 DEBUG oslo_concurrency.lockutils [None req-b3f95957-4bae-4bc8-b76d-42d27061c036 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 21.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:09 compute-1 nova_compute[182713]: 2026-01-22 00:04:09.511 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040234.5095925, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:09 compute-1 nova_compute[182713]: 2026-01-22 00:04:09.511 182717 INFO nova.compute.manager [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Stopped (Lifecycle Event)
Jan 22 00:04:09 compute-1 nova_compute[182713]: 2026-01-22 00:04:09.546 182717 DEBUG nova.compute.manager [None req-20004138-0fd1-4f37-a65f-e3e2cc0f186e - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:11 compute-1 nova_compute[182713]: 2026-01-22 00:04:11.939 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:13 compute-1 nova_compute[182713]: 2026-01-22 00:04:13.328 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:16 compute-1 podman[225345]: 2026-01-22 00:04:16.559575882 +0000 UTC m=+0.045774995 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:04:16 compute-1 podman[225344]: 2026-01-22 00:04:16.596577584 +0000 UTC m=+0.081674192 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:04:16 compute-1 nova_compute[182713]: 2026-01-22 00:04:16.941 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:18 compute-1 nova_compute[182713]: 2026-01-22 00:04:18.332 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:21 compute-1 nova_compute[182713]: 2026-01-22 00:04:21.967 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:22 compute-1 podman[225394]: 2026-01-22 00:04:22.591941363 +0000 UTC m=+0.085621114 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 22 00:04:22 compute-1 podman[225395]: 2026-01-22 00:04:22.611094175 +0000 UTC m=+0.089838385 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.872 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:04:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-1 nova_compute[182713]: 2026-01-22 00:04:23.335 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:26 compute-1 nova_compute[182713]: 2026-01-22 00:04:26.969 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:28 compute-1 nova_compute[182713]: 2026-01-22 00:04:28.339 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:30 compute-1 nova_compute[182713]: 2026-01-22 00:04:30.643 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Acquiring lock "ae422e71-aa79-4a45-ab48-e634bb09283b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:30 compute-1 nova_compute[182713]: 2026-01-22 00:04:30.644 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:30 compute-1 nova_compute[182713]: 2026-01-22 00:04:30.685 182717 DEBUG nova.compute.manager [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:04:30 compute-1 nova_compute[182713]: 2026-01-22 00:04:30.859 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:30 compute-1 nova_compute[182713]: 2026-01-22 00:04:30.860 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:30 compute-1 nova_compute[182713]: 2026-01-22 00:04:30.868 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:04:30 compute-1 nova_compute[182713]: 2026-01-22 00:04:30.868 182717 INFO nova.compute.claims [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.058 182717 DEBUG nova.scheduler.client.report [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.121 182717 DEBUG nova.scheduler.client.report [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.121 182717 DEBUG nova.compute.provider_tree [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.156 182717 DEBUG nova.scheduler.client.report [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.210 182717 DEBUG nova.scheduler.client.report [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.297 182717 DEBUG nova.compute.provider_tree [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.320 182717 DEBUG nova.scheduler.client.report [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.352 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.353 182717 DEBUG nova.compute.manager [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.436 182717 DEBUG nova.compute.manager [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.437 182717 DEBUG nova.network.neutron [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.468 182717 INFO nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.501 182717 DEBUG nova.compute.manager [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.746 182717 DEBUG nova.compute.manager [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.747 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.749 182717 INFO nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Creating image(s)
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.750 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Acquiring lock "/var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.750 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "/var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.751 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "/var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.780 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.868 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.869 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.870 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.885 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.913 182717 DEBUG nova.policy [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '63648aa2c42a435b8649b88978fe889b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d71f189c68645b9893c7a1171fc594f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.960 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.961 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.980 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.996 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:31 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.997 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:31.999 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:32.075 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:32.076 182717 DEBUG nova.virt.disk.api [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Checking if we can resize image /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:32.076 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:32.132 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:32.133 182717 DEBUG nova.virt.disk.api [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Cannot resize image /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:32.134 182717 DEBUG nova.objects.instance [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lazy-loading 'migration_context' on Instance uuid ae422e71-aa79-4a45-ab48-e634bb09283b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:32.164 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:32.165 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Ensure instance console log exists: /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:32.165 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:32.166 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:32 compute-1 nova_compute[182713]: 2026-01-22 00:04:32.166 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:33 compute-1 nova_compute[182713]: 2026-01-22 00:04:33.342 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:33 compute-1 podman[225453]: 2026-01-22 00:04:33.574653968 +0000 UTC m=+0.070329343 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:04:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:34.378 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:04:34 compute-1 nova_compute[182713]: 2026-01-22 00:04:34.379 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:34.380 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:04:34 compute-1 nova_compute[182713]: 2026-01-22 00:04:34.736 182717 DEBUG nova.network.neutron [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Successfully created port: f0e7550a-44b2-4ebc-ae04-c9d364a285a1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:04:34 compute-1 nova_compute[182713]: 2026-01-22 00:04:34.930 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:34 compute-1 nova_compute[182713]: 2026-01-22 00:04:34.931 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:34 compute-1 nova_compute[182713]: 2026-01-22 00:04:34.932 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:34 compute-1 nova_compute[182713]: 2026-01-22 00:04:34.932 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:34 compute-1 nova_compute[182713]: 2026-01-22 00:04:34.933 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:04:35 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:35.383 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:35 compute-1 nova_compute[182713]: 2026-01-22 00:04:35.853 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:35 compute-1 nova_compute[182713]: 2026-01-22 00:04:35.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:37 compute-1 nova_compute[182713]: 2026-01-22 00:04:37.008 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:37 compute-1 nova_compute[182713]: 2026-01-22 00:04:37.334 182717 DEBUG nova.network.neutron [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Successfully updated port: f0e7550a-44b2-4ebc-ae04-c9d364a285a1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:04:37 compute-1 nova_compute[182713]: 2026-01-22 00:04:37.360 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Acquiring lock "refresh_cache-ae422e71-aa79-4a45-ab48-e634bb09283b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:04:37 compute-1 nova_compute[182713]: 2026-01-22 00:04:37.361 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Acquired lock "refresh_cache-ae422e71-aa79-4a45-ab48-e634bb09283b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:04:37 compute-1 nova_compute[182713]: 2026-01-22 00:04:37.361 182717 DEBUG nova.network.neutron [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:04:37 compute-1 podman[225474]: 2026-01-22 00:04:37.558157587 +0000 UTC m=+0.054121902 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:04:37 compute-1 nova_compute[182713]: 2026-01-22 00:04:37.746 182717 DEBUG nova.network.neutron [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:04:37 compute-1 nova_compute[182713]: 2026-01-22 00:04:37.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:38 compute-1 nova_compute[182713]: 2026-01-22 00:04:38.345 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:38 compute-1 nova_compute[182713]: 2026-01-22 00:04:38.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:38 compute-1 nova_compute[182713]: 2026-01-22 00:04:38.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:04:38 compute-1 nova_compute[182713]: 2026-01-22 00:04:38.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:04:38 compute-1 nova_compute[182713]: 2026-01-22 00:04:38.908 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 00:04:38 compute-1 nova_compute[182713]: 2026-01-22 00:04:38.908 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:04:38 compute-1 nova_compute[182713]: 2026-01-22 00:04:38.909 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.237 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.238 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.238 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.239 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.465 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.466 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5728MB free_disk=73.30367279052734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.466 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.467 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.610 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance ae422e71-aa79-4a45-ab48-e634bb09283b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.611 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.611 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.626 182717 DEBUG nova.compute.manager [req-6e29c3c5-99c7-41b8-9a91-80433db380f3 req-ebb745e9-cad1-481c-867c-3baa4fb328c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Received event network-changed-f0e7550a-44b2-4ebc-ae04-c9d364a285a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.626 182717 DEBUG nova.compute.manager [req-6e29c3c5-99c7-41b8-9a91-80433db380f3 req-ebb745e9-cad1-481c-867c-3baa4fb328c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Refreshing instance network info cache due to event network-changed-f0e7550a-44b2-4ebc-ae04-c9d364a285a1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.627 182717 DEBUG oslo_concurrency.lockutils [req-6e29c3c5-99c7-41b8-9a91-80433db380f3 req-ebb745e9-cad1-481c-867c-3baa4fb328c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ae422e71-aa79-4a45-ab48-e634bb09283b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.678 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.709 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.750 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:04:39 compute-1 nova_compute[182713]: 2026-01-22 00:04:39.750 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.913 182717 DEBUG nova.network.neutron [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Updating instance_info_cache with network_info: [{"id": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "address": "fa:16:3e:a5:49:91", "network": {"id": "cc0afe66-c42b-4072-9318-4d70984b8705", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-740362094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d71f189c68645b9893c7a1171fc594f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e7550a-44", "ovs_interfaceid": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.950 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Releasing lock "refresh_cache-ae422e71-aa79-4a45-ab48-e634bb09283b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.951 182717 DEBUG nova.compute.manager [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Instance network_info: |[{"id": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "address": "fa:16:3e:a5:49:91", "network": {"id": "cc0afe66-c42b-4072-9318-4d70984b8705", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-740362094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d71f189c68645b9893c7a1171fc594f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e7550a-44", "ovs_interfaceid": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.952 182717 DEBUG oslo_concurrency.lockutils [req-6e29c3c5-99c7-41b8-9a91-80433db380f3 req-ebb745e9-cad1-481c-867c-3baa4fb328c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ae422e71-aa79-4a45-ab48-e634bb09283b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.953 182717 DEBUG nova.network.neutron [req-6e29c3c5-99c7-41b8-9a91-80433db380f3 req-ebb745e9-cad1-481c-867c-3baa4fb328c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Refreshing network info cache for port f0e7550a-44b2-4ebc-ae04-c9d364a285a1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.958 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Start _get_guest_xml network_info=[{"id": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "address": "fa:16:3e:a5:49:91", "network": {"id": "cc0afe66-c42b-4072-9318-4d70984b8705", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-740362094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d71f189c68645b9893c7a1171fc594f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e7550a-44", "ovs_interfaceid": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.963 182717 WARNING nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.969 182717 DEBUG nova.virt.libvirt.host [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.970 182717 DEBUG nova.virt.libvirt.host [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.977 182717 DEBUG nova.virt.libvirt.host [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.979 182717 DEBUG nova.virt.libvirt.host [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.981 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.981 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.982 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.983 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.983 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.984 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.984 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.985 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.985 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.986 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.986 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.987 182717 DEBUG nova.virt.hardware [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.993 182717 DEBUG nova.virt.libvirt.vif [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=90,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDpNxj52ngzHLHuXy3WYSkFReNYkfmAvALETfwN//J3FPL6aMvUxZprajQE//vK0UiiBEeLGvT9b/h5PdevsIucx3yuVzoDS+oGEOmAuv+ECQ0ItERDBE2uQOufFj0yHg==',key_name='tempest-keypair-969259947',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d71f189c68645b9893c7a1171fc594f',ramdisk_id='',reservation_id='r-dqskam0b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',ima
ge_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-408381576',owner_user_name='tempest-ServersTestFqdnHostnames-408381576-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:04:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='63648aa2c42a435b8649b88978fe889b',uuid=ae422e71-aa79-4a45-ab48-e634bb09283b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "address": "fa:16:3e:a5:49:91", "network": {"id": "cc0afe66-c42b-4072-9318-4d70984b8705", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-740362094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d71f189c68645b9893c7a1171fc594f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e7550a-44", "ovs_interfaceid": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.994 182717 DEBUG nova.network.os_vif_util [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Converting VIF {"id": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "address": "fa:16:3e:a5:49:91", "network": {"id": "cc0afe66-c42b-4072-9318-4d70984b8705", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-740362094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d71f189c68645b9893c7a1171fc594f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e7550a-44", "ovs_interfaceid": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.996 182717 DEBUG nova.network.os_vif_util [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:49:91,bridge_name='br-int',has_traffic_filtering=True,id=f0e7550a-44b2-4ebc-ae04-c9d364a285a1,network=Network(cc0afe66-c42b-4072-9318-4d70984b8705),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e7550a-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:04:40 compute-1 nova_compute[182713]: 2026-01-22 00:04:40.998 182717 DEBUG nova.objects.instance [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lazy-loading 'pci_devices' on Instance uuid ae422e71-aa79-4a45-ab48-e634bb09283b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.016 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:04:41 compute-1 nova_compute[182713]:   <uuid>ae422e71-aa79-4a45-ab48-e634bb09283b</uuid>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   <name>instance-0000005a</name>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <nova:name>guest-instance-1.domain.com</nova:name>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:04:40</nova:creationTime>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:04:41 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:04:41 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:04:41 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:04:41 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:04:41 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:04:41 compute-1 nova_compute[182713]:         <nova:user uuid="63648aa2c42a435b8649b88978fe889b">tempest-ServersTestFqdnHostnames-408381576-project-member</nova:user>
Jan 22 00:04:41 compute-1 nova_compute[182713]:         <nova:project uuid="8d71f189c68645b9893c7a1171fc594f">tempest-ServersTestFqdnHostnames-408381576</nova:project>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:04:41 compute-1 nova_compute[182713]:         <nova:port uuid="f0e7550a-44b2-4ebc-ae04-c9d364a285a1">
Jan 22 00:04:41 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <system>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <entry name="serial">ae422e71-aa79-4a45-ab48-e634bb09283b</entry>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <entry name="uuid">ae422e71-aa79-4a45-ab48-e634bb09283b</entry>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     </system>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   <os>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   </os>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   <features>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   </features>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk.config"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:a5:49:91"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <target dev="tapf0e7550a-44"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/console.log" append="off"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <video>
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     </video>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:04:41 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:04:41 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:04:41 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:04:41 compute-1 nova_compute[182713]: </domain>
Jan 22 00:04:41 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.018 182717 DEBUG nova.compute.manager [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Preparing to wait for external event network-vif-plugged-f0e7550a-44b2-4ebc-ae04-c9d364a285a1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.018 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Acquiring lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.019 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.019 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.020 182717 DEBUG nova.virt.libvirt.vif [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=90,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDpNxj52ngzHLHuXy3WYSkFReNYkfmAvALETfwN//J3FPL6aMvUxZprajQE//vK0UiiBEeLGvT9b/h5PdevsIucx3yuVzoDS+oGEOmAuv+ECQ0ItERDBE2uQOufFj0yHg==',key_name='tempest-keypair-969259947',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d71f189c68645b9893c7a1171fc594f',ramdisk_id='',reservation_id='r-dqskam0b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='v
irtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-408381576',owner_user_name='tempest-ServersTestFqdnHostnames-408381576-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:04:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='63648aa2c42a435b8649b88978fe889b',uuid=ae422e71-aa79-4a45-ab48-e634bb09283b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "address": "fa:16:3e:a5:49:91", "network": {"id": "cc0afe66-c42b-4072-9318-4d70984b8705", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-740362094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d71f189c68645b9893c7a1171fc594f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e7550a-44", "ovs_interfaceid": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.020 182717 DEBUG nova.network.os_vif_util [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Converting VIF {"id": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "address": "fa:16:3e:a5:49:91", "network": {"id": "cc0afe66-c42b-4072-9318-4d70984b8705", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-740362094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d71f189c68645b9893c7a1171fc594f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e7550a-44", "ovs_interfaceid": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.021 182717 DEBUG nova.network.os_vif_util [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:49:91,bridge_name='br-int',has_traffic_filtering=True,id=f0e7550a-44b2-4ebc-ae04-c9d364a285a1,network=Network(cc0afe66-c42b-4072-9318-4d70984b8705),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e7550a-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.021 182717 DEBUG os_vif [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:49:91,bridge_name='br-int',has_traffic_filtering=True,id=f0e7550a-44b2-4ebc-ae04-c9d364a285a1,network=Network(cc0afe66-c42b-4072-9318-4d70984b8705),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e7550a-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.022 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.022 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.023 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.028 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.029 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0e7550a-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.030 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf0e7550a-44, col_values=(('external_ids', {'iface-id': 'f0e7550a-44b2-4ebc-ae04-c9d364a285a1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a5:49:91', 'vm-uuid': 'ae422e71-aa79-4a45-ab48-e634bb09283b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:41 compute-1 NetworkManager[54952]: <info>  [1769040281.0333] manager: (tapf0e7550a-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.032 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.036 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.039 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.041 182717 INFO os_vif [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:49:91,bridge_name='br-int',has_traffic_filtering=True,id=f0e7550a-44b2-4ebc-ae04-c9d364a285a1,network=Network(cc0afe66-c42b-4072-9318-4d70984b8705),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e7550a-44')
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.102 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.103 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.103 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] No VIF found with MAC fa:16:3e:a5:49:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.104 182717 INFO nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Using config drive
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.569 182717 INFO nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Creating config drive at /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk.config
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.581 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc8ste317 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.728 182717 DEBUG oslo_concurrency.processutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc8ste317" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:41 compute-1 kernel: tapf0e7550a-44: entered promiscuous mode
Jan 22 00:04:41 compute-1 NetworkManager[54952]: <info>  [1769040281.8136] manager: (tapf0e7550a-44): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Jan 22 00:04:41 compute-1 ovn_controller[94841]: 2026-01-22T00:04:41Z|00371|binding|INFO|Claiming lport f0e7550a-44b2-4ebc-ae04-c9d364a285a1 for this chassis.
Jan 22 00:04:41 compute-1 ovn_controller[94841]: 2026-01-22T00:04:41Z|00372|binding|INFO|f0e7550a-44b2-4ebc-ae04-c9d364a285a1: Claiming fa:16:3e:a5:49:91 10.100.0.9
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.819 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.847 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:49:91 10.100.0.9'], port_security=['fa:16:3e:a5:49:91 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ae422e71-aa79-4a45-ab48-e634bb09283b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc0afe66-c42b-4072-9318-4d70984b8705', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d71f189c68645b9893c7a1171fc594f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1dcbe2a3-448b-4c38-b0e2-636005b56d53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45aa4207-cb88-4906-b773-785691a1d850, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=f0e7550a-44b2-4ebc-ae04-c9d364a285a1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.851 104184 INFO neutron.agent.ovn.metadata.agent [-] Port f0e7550a-44b2-4ebc-ae04-c9d364a285a1 in datapath cc0afe66-c42b-4072-9318-4d70984b8705 bound to our chassis
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.853 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc0afe66-c42b-4072-9318-4d70984b8705
Jan 22 00:04:41 compute-1 systemd-udevd[225514]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.872 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0431b9cf-6848-4bf9-9d13-d34406d86c04]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.873 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcc0afe66-c1 in ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.875 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcc0afe66-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.875 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[16cfea51-7b87-4590-a496-bfb7aabc663e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.877 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[024414cc-da5f-42d5-8508-c51fe7e8416a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:41 compute-1 systemd-machined[153970]: New machine qemu-42-instance-0000005a.
Jan 22 00:04:41 compute-1 NetworkManager[54952]: <info>  [1769040281.8893] device (tapf0e7550a-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:04:41 compute-1 NetworkManager[54952]: <info>  [1769040281.8907] device (tapf0e7550a-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.892 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[820a2f53-bb22-467e-90fd-d07053fa222c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.906 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.910 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:41 compute-1 ovn_controller[94841]: 2026-01-22T00:04:41Z|00373|binding|INFO|Setting lport f0e7550a-44b2-4ebc-ae04-c9d364a285a1 ovn-installed in OVS
Jan 22 00:04:41 compute-1 ovn_controller[94841]: 2026-01-22T00:04:41Z|00374|binding|INFO|Setting lport f0e7550a-44b2-4ebc-ae04-c9d364a285a1 up in Southbound
Jan 22 00:04:41 compute-1 nova_compute[182713]: 2026-01-22 00:04:41.913 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:41 compute-1 systemd[1]: Started Virtual Machine qemu-42-instance-0000005a.
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.917 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[53b480ce-dd63-4024-bd98-7d62bbe654dd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.950 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e8e127-83b8-49e6-bf8b-66e50d74c73f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.959 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[df4a47eb-e85c-4732-ad27-4f1f321e1e49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:41 compute-1 NetworkManager[54952]: <info>  [1769040281.9606] manager: (tapcc0afe66-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/181)
Jan 22 00:04:41 compute-1 systemd-udevd[225518]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.995 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3584e2-ab2a-4e1f-bb05-7ac9e5f2e07c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:41.999 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[17c4cc95-b1ad-43bb-b40a-0e11478a004a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.009 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:42 compute-1 NetworkManager[54952]: <info>  [1769040282.0299] device (tapcc0afe66-c0): carrier: link connected
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.039 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[eb715ffa-a1c3-495f-9b28-a080f9785654]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.069 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f52dfe1e-63bb-4c17-b9aa-c6eee05dd653]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc0afe66-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:66:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 111], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485468, 'reachable_time': 38113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225547, 'error': None, 'target': 'ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.090 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[638a884d-55ef-4886-95b6-c5b7367b8f3d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:664c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485468, 'tstamp': 485468}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225548, 'error': None, 'target': 'ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.117 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[54e439e1-3339-42f4-b39c-b6de756eadb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc0afe66-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:66:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 111], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485468, 'reachable_time': 38113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225549, 'error': None, 'target': 'ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.157 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c6a256ca-771f-4300-804e-b9d2715ecff1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.233 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f4609c8c-5009-4955-8eb9-8831dafc6446]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.234 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc0afe66-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.234 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.235 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc0afe66-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.237 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:42 compute-1 kernel: tapcc0afe66-c0: entered promiscuous mode
Jan 22 00:04:42 compute-1 NetworkManager[54952]: <info>  [1769040282.2384] manager: (tapcc0afe66-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.239 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.246 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc0afe66-c0, col_values=(('external_ids', {'iface-id': '6a8f402c-017b-4947-ae2c-eff9eabad564'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:42 compute-1 ovn_controller[94841]: 2026-01-22T00:04:42Z|00375|binding|INFO|Releasing lport 6a8f402c-017b-4947-ae2c-eff9eabad564 from this chassis (sb_readonly=0)
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.247 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.261 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.263 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc0afe66-c42b-4072-9318-4d70984b8705.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc0afe66-c42b-4072-9318-4d70984b8705.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.264 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[df3e95ce-63d6-446d-a8c5-787d6a3bb86b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.265 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-cc0afe66-c42b-4072-9318-4d70984b8705
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/cc0afe66-c42b-4072-9318-4d70984b8705.pid.haproxy
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID cc0afe66-c42b-4072-9318-4d70984b8705
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:04:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:04:42.265 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705', 'env', 'PROCESS_TAG=haproxy-cc0afe66-c42b-4072-9318-4d70984b8705', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cc0afe66-c42b-4072-9318-4d70984b8705.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.476 182717 DEBUG nova.compute.manager [req-3de1f90e-40ac-41aa-a558-548b842c6e21 req-2b6ef292-63b2-4bd4-b146-c3fe6bb0ee70 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Received event network-vif-plugged-f0e7550a-44b2-4ebc-ae04-c9d364a285a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.477 182717 DEBUG oslo_concurrency.lockutils [req-3de1f90e-40ac-41aa-a558-548b842c6e21 req-2b6ef292-63b2-4bd4-b146-c3fe6bb0ee70 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.478 182717 DEBUG oslo_concurrency.lockutils [req-3de1f90e-40ac-41aa-a558-548b842c6e21 req-2b6ef292-63b2-4bd4-b146-c3fe6bb0ee70 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.478 182717 DEBUG oslo_concurrency.lockutils [req-3de1f90e-40ac-41aa-a558-548b842c6e21 req-2b6ef292-63b2-4bd4-b146-c3fe6bb0ee70 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.479 182717 DEBUG nova.compute.manager [req-3de1f90e-40ac-41aa-a558-548b842c6e21 req-2b6ef292-63b2-4bd4-b146-c3fe6bb0ee70 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Processing event network-vif-plugged-f0e7550a-44b2-4ebc-ae04-c9d364a285a1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.563 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040282.5627108, ae422e71-aa79-4a45-ab48-e634bb09283b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.563 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] VM Started (Lifecycle Event)
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.567 182717 DEBUG nova.compute.manager [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.571 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.575 182717 INFO nova.virt.libvirt.driver [-] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Instance spawned successfully.
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.575 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.602 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.603 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.604 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.604 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.605 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.605 182717 DEBUG nova.virt.libvirt.driver [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.610 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.613 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:04:42 compute-1 podman[225587]: 2026-01-22 00:04:42.647897974 +0000 UTC m=+0.052919495 container create 16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.661 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.661 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040282.562835, ae422e71-aa79-4a45-ab48-e634bb09283b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.661 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] VM Paused (Lifecycle Event)
Jan 22 00:04:42 compute-1 systemd[1]: Started libpod-conmon-16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a.scope.
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.694 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.700 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040282.5695205, ae422e71-aa79-4a45-ab48-e634bb09283b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.700 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] VM Resumed (Lifecycle Event)
Jan 22 00:04:42 compute-1 podman[225587]: 2026-01-22 00:04:42.61732321 +0000 UTC m=+0.022344781 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.725 182717 INFO nova.compute.manager [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Took 10.98 seconds to spawn the instance on the hypervisor.
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.726 182717 DEBUG nova.compute.manager [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.730 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:42 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.737 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:04:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fd945c2ae92f9f9f68e94d86f08474e130fc60e955767b1377dc4666823509f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:04:42 compute-1 podman[225587]: 2026-01-22 00:04:42.758213231 +0000 UTC m=+0.163234772 container init 16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 00:04:42 compute-1 podman[225587]: 2026-01-22 00:04:42.765390152 +0000 UTC m=+0.170411673 container start 16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.800 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:04:42 compute-1 neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705[225603]: [NOTICE]   (225607) : New worker (225609) forked
Jan 22 00:04:42 compute-1 neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705[225603]: [NOTICE]   (225607) : Loading success.
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.856 182717 INFO nova.compute.manager [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Took 12.06 seconds to build instance.
Jan 22 00:04:42 compute-1 nova_compute[182713]: 2026-01-22 00:04:42.891 182717 DEBUG oslo_concurrency.lockutils [None req-60da5aff-cd94-493b-a70e-4a426d3540a0 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:43 compute-1 nova_compute[182713]: 2026-01-22 00:04:43.160 182717 DEBUG nova.network.neutron [req-6e29c3c5-99c7-41b8-9a91-80433db380f3 req-ebb745e9-cad1-481c-867c-3baa4fb328c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Updated VIF entry in instance network info cache for port f0e7550a-44b2-4ebc-ae04-c9d364a285a1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:04:43 compute-1 nova_compute[182713]: 2026-01-22 00:04:43.161 182717 DEBUG nova.network.neutron [req-6e29c3c5-99c7-41b8-9a91-80433db380f3 req-ebb745e9-cad1-481c-867c-3baa4fb328c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Updating instance_info_cache with network_info: [{"id": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "address": "fa:16:3e:a5:49:91", "network": {"id": "cc0afe66-c42b-4072-9318-4d70984b8705", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-740362094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d71f189c68645b9893c7a1171fc594f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e7550a-44", "ovs_interfaceid": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:04:43 compute-1 nova_compute[182713]: 2026-01-22 00:04:43.190 182717 DEBUG oslo_concurrency.lockutils [req-6e29c3c5-99c7-41b8-9a91-80433db380f3 req-ebb745e9-cad1-481c-867c-3baa4fb328c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ae422e71-aa79-4a45-ab48-e634bb09283b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:04:44 compute-1 nova_compute[182713]: 2026-01-22 00:04:44.624 182717 DEBUG nova.compute.manager [req-7f859fff-6f04-41cb-abcc-d79408a18638 req-54cd84c7-6717-454a-a700-52fd91e81d72 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Received event network-vif-plugged-f0e7550a-44b2-4ebc-ae04-c9d364a285a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:44 compute-1 nova_compute[182713]: 2026-01-22 00:04:44.625 182717 DEBUG oslo_concurrency.lockutils [req-7f859fff-6f04-41cb-abcc-d79408a18638 req-54cd84c7-6717-454a-a700-52fd91e81d72 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:44 compute-1 nova_compute[182713]: 2026-01-22 00:04:44.625 182717 DEBUG oslo_concurrency.lockutils [req-7f859fff-6f04-41cb-abcc-d79408a18638 req-54cd84c7-6717-454a-a700-52fd91e81d72 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:44 compute-1 nova_compute[182713]: 2026-01-22 00:04:44.625 182717 DEBUG oslo_concurrency.lockutils [req-7f859fff-6f04-41cb-abcc-d79408a18638 req-54cd84c7-6717-454a-a700-52fd91e81d72 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:44 compute-1 nova_compute[182713]: 2026-01-22 00:04:44.626 182717 DEBUG nova.compute.manager [req-7f859fff-6f04-41cb-abcc-d79408a18638 req-54cd84c7-6717-454a-a700-52fd91e81d72 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] No waiting events found dispatching network-vif-plugged-f0e7550a-44b2-4ebc-ae04-c9d364a285a1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:04:44 compute-1 nova_compute[182713]: 2026-01-22 00:04:44.626 182717 WARNING nova.compute.manager [req-7f859fff-6f04-41cb-abcc-d79408a18638 req-54cd84c7-6717-454a-a700-52fd91e81d72 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Received unexpected event network-vif-plugged-f0e7550a-44b2-4ebc-ae04-c9d364a285a1 for instance with vm_state active and task_state None.
Jan 22 00:04:45 compute-1 nova_compute[182713]: 2026-01-22 00:04:45.438 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:45 compute-1 NetworkManager[54952]: <info>  [1769040285.4392] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Jan 22 00:04:45 compute-1 NetworkManager[54952]: <info>  [1769040285.4403] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Jan 22 00:04:45 compute-1 nova_compute[182713]: 2026-01-22 00:04:45.501 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:45 compute-1 ovn_controller[94841]: 2026-01-22T00:04:45Z|00376|binding|INFO|Releasing lport 6a8f402c-017b-4947-ae2c-eff9eabad564 from this chassis (sb_readonly=0)
Jan 22 00:04:45 compute-1 nova_compute[182713]: 2026-01-22 00:04:45.512 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:45 compute-1 nova_compute[182713]: 2026-01-22 00:04:45.765 182717 DEBUG nova.compute.manager [req-2f7adf72-604a-4912-8b68-0337b38f6ecd req-51c204a9-c8c5-4826-8447-400b18c6e568 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Received event network-changed-f0e7550a-44b2-4ebc-ae04-c9d364a285a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:45 compute-1 nova_compute[182713]: 2026-01-22 00:04:45.766 182717 DEBUG nova.compute.manager [req-2f7adf72-604a-4912-8b68-0337b38f6ecd req-51c204a9-c8c5-4826-8447-400b18c6e568 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Refreshing instance network info cache due to event network-changed-f0e7550a-44b2-4ebc-ae04-c9d364a285a1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:04:45 compute-1 nova_compute[182713]: 2026-01-22 00:04:45.766 182717 DEBUG oslo_concurrency.lockutils [req-2f7adf72-604a-4912-8b68-0337b38f6ecd req-51c204a9-c8c5-4826-8447-400b18c6e568 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ae422e71-aa79-4a45-ab48-e634bb09283b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:04:45 compute-1 nova_compute[182713]: 2026-01-22 00:04:45.766 182717 DEBUG oslo_concurrency.lockutils [req-2f7adf72-604a-4912-8b68-0337b38f6ecd req-51c204a9-c8c5-4826-8447-400b18c6e568 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ae422e71-aa79-4a45-ab48-e634bb09283b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:04:45 compute-1 nova_compute[182713]: 2026-01-22 00:04:45.766 182717 DEBUG nova.network.neutron [req-2f7adf72-604a-4912-8b68-0337b38f6ecd req-51c204a9-c8c5-4826-8447-400b18c6e568 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Refreshing network info cache for port f0e7550a-44b2-4ebc-ae04-c9d364a285a1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:04:46 compute-1 nova_compute[182713]: 2026-01-22 00:04:46.032 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:46 compute-1 nova_compute[182713]: 2026-01-22 00:04:46.249 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:46 compute-1 ovn_controller[94841]: 2026-01-22T00:04:46Z|00377|binding|INFO|Releasing lport 6a8f402c-017b-4947-ae2c-eff9eabad564 from this chassis (sb_readonly=0)
Jan 22 00:04:46 compute-1 nova_compute[182713]: 2026-01-22 00:04:46.402 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:47 compute-1 nova_compute[182713]: 2026-01-22 00:04:47.012 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:47 compute-1 podman[225620]: 2026-01-22 00:04:47.576716341 +0000 UTC m=+0.064795150 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:04:47 compute-1 podman[225619]: 2026-01-22 00:04:47.64111436 +0000 UTC m=+0.117199770 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:04:47 compute-1 nova_compute[182713]: 2026-01-22 00:04:47.745 182717 DEBUG nova.network.neutron [req-2f7adf72-604a-4912-8b68-0337b38f6ecd req-51c204a9-c8c5-4826-8447-400b18c6e568 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Updated VIF entry in instance network info cache for port f0e7550a-44b2-4ebc-ae04-c9d364a285a1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:04:47 compute-1 nova_compute[182713]: 2026-01-22 00:04:47.746 182717 DEBUG nova.network.neutron [req-2f7adf72-604a-4912-8b68-0337b38f6ecd req-51c204a9-c8c5-4826-8447-400b18c6e568 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Updating instance_info_cache with network_info: [{"id": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "address": "fa:16:3e:a5:49:91", "network": {"id": "cc0afe66-c42b-4072-9318-4d70984b8705", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-740362094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d71f189c68645b9893c7a1171fc594f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e7550a-44", "ovs_interfaceid": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:04:47 compute-1 nova_compute[182713]: 2026-01-22 00:04:47.807 182717 DEBUG oslo_concurrency.lockutils [req-2f7adf72-604a-4912-8b68-0337b38f6ecd req-51c204a9-c8c5-4826-8447-400b18c6e568 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ae422e71-aa79-4a45-ab48-e634bb09283b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:04:51 compute-1 nova_compute[182713]: 2026-01-22 00:04:51.037 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:52 compute-1 nova_compute[182713]: 2026-01-22 00:04:52.013 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:53 compute-1 podman[225681]: 2026-01-22 00:04:53.572722802 +0000 UTC m=+0.049937433 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:04:53 compute-1 podman[225680]: 2026-01-22 00:04:53.581576275 +0000 UTC m=+0.060403216 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:04:54 compute-1 ovn_controller[94841]: 2026-01-22T00:04:54Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a5:49:91 10.100.0.9
Jan 22 00:04:54 compute-1 ovn_controller[94841]: 2026-01-22T00:04:54Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a5:49:91 10.100.0.9
Jan 22 00:04:56 compute-1 nova_compute[182713]: 2026-01-22 00:04:56.043 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:57 compute-1 nova_compute[182713]: 2026-01-22 00:04:57.016 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:57 compute-1 nova_compute[182713]: 2026-01-22 00:04:57.606 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:01 compute-1 nova_compute[182713]: 2026-01-22 00:05:01.047 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:01 compute-1 nova_compute[182713]: 2026-01-22 00:05:01.777 182717 DEBUG oslo_concurrency.lockutils [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Acquiring lock "ae422e71-aa79-4a45-ab48-e634bb09283b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:01 compute-1 nova_compute[182713]: 2026-01-22 00:05:01.778 182717 DEBUG oslo_concurrency.lockutils [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:01 compute-1 nova_compute[182713]: 2026-01-22 00:05:01.779 182717 DEBUG oslo_concurrency.lockutils [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Acquiring lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:01 compute-1 nova_compute[182713]: 2026-01-22 00:05:01.779 182717 DEBUG oslo_concurrency.lockutils [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:01 compute-1 nova_compute[182713]: 2026-01-22 00:05:01.780 182717 DEBUG oslo_concurrency.lockutils [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:01 compute-1 nova_compute[182713]: 2026-01-22 00:05:01.796 182717 INFO nova.compute.manager [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Terminating instance
Jan 22 00:05:01 compute-1 nova_compute[182713]: 2026-01-22 00:05:01.816 182717 DEBUG nova.compute.manager [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:05:01 compute-1 kernel: tapf0e7550a-44 (unregistering): left promiscuous mode
Jan 22 00:05:01 compute-1 NetworkManager[54952]: <info>  [1769040301.8421] device (tapf0e7550a-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:05:01 compute-1 nova_compute[182713]: 2026-01-22 00:05:01.855 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:01 compute-1 ovn_controller[94841]: 2026-01-22T00:05:01Z|00378|binding|INFO|Releasing lport f0e7550a-44b2-4ebc-ae04-c9d364a285a1 from this chassis (sb_readonly=0)
Jan 22 00:05:01 compute-1 ovn_controller[94841]: 2026-01-22T00:05:01Z|00379|binding|INFO|Setting lport f0e7550a-44b2-4ebc-ae04-c9d364a285a1 down in Southbound
Jan 22 00:05:01 compute-1 ovn_controller[94841]: 2026-01-22T00:05:01Z|00380|binding|INFO|Removing iface tapf0e7550a-44 ovn-installed in OVS
Jan 22 00:05:01 compute-1 nova_compute[182713]: 2026-01-22 00:05:01.859 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:01 compute-1 nova_compute[182713]: 2026-01-22 00:05:01.890 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:01.890 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:49:91 10.100.0.9'], port_security=['fa:16:3e:a5:49:91 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ae422e71-aa79-4a45-ab48-e634bb09283b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc0afe66-c42b-4072-9318-4d70984b8705', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d71f189c68645b9893c7a1171fc594f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1dcbe2a3-448b-4c38-b0e2-636005b56d53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45aa4207-cb88-4906-b773-785691a1d850, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=f0e7550a-44b2-4ebc-ae04-c9d364a285a1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:05:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:01.893 104184 INFO neutron.agent.ovn.metadata.agent [-] Port f0e7550a-44b2-4ebc-ae04-c9d364a285a1 in datapath cc0afe66-c42b-4072-9318-4d70984b8705 unbound from our chassis
Jan 22 00:05:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:01.896 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc0afe66-c42b-4072-9318-4d70984b8705, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:05:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:01.899 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f5c7c3-2048-402d-8356-4bbbe6bdcd9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:01.900 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705 namespace which is not needed anymore
Jan 22 00:05:01 compute-1 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Jan 22 00:05:01 compute-1 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005a.scope: Consumed 12.846s CPU time.
Jan 22 00:05:01 compute-1 systemd-machined[153970]: Machine qemu-42-instance-0000005a terminated.
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.019 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:02 compute-1 neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705[225603]: [NOTICE]   (225607) : haproxy version is 2.8.14-c23fe91
Jan 22 00:05:02 compute-1 neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705[225603]: [NOTICE]   (225607) : path to executable is /usr/sbin/haproxy
Jan 22 00:05:02 compute-1 neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705[225603]: [WARNING]  (225607) : Exiting Master process...
Jan 22 00:05:02 compute-1 neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705[225603]: [ALERT]    (225607) : Current worker (225609) exited with code 143 (Terminated)
Jan 22 00:05:02 compute-1 neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705[225603]: [WARNING]  (225607) : All workers exited. Exiting... (0)
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.087 182717 INFO nova.virt.libvirt.driver [-] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Instance destroyed successfully.
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.087 182717 DEBUG nova.objects.instance [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lazy-loading 'resources' on Instance uuid ae422e71-aa79-4a45-ab48-e634bb09283b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:02 compute-1 systemd[1]: libpod-16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a.scope: Deactivated successfully.
Jan 22 00:05:02 compute-1 podman[225746]: 2026-01-22 00:05:02.096590454 +0000 UTC m=+0.083693585 container died 16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 00:05:02 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a-userdata-shm.mount: Deactivated successfully.
Jan 22 00:05:02 compute-1 systemd[1]: var-lib-containers-storage-overlay-7fd945c2ae92f9f9f68e94d86f08474e130fc60e955767b1377dc4666823509f-merged.mount: Deactivated successfully.
Jan 22 00:05:02 compute-1 podman[225746]: 2026-01-22 00:05:02.149693594 +0000 UTC m=+0.136796715 container cleanup 16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:05:02 compute-1 systemd[1]: libpod-conmon-16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a.scope: Deactivated successfully.
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.159 182717 DEBUG nova.virt.libvirt.vif [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=90,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDpNxj52ngzHLHuXy3WYSkFReNYkfmAvALETfwN//J3FPL6aMvUxZprajQE//vK0UiiBEeLGvT9b/h5PdevsIucx3yuVzoDS+oGEOmAuv+ECQ0ItERDBE2uQOufFj0yHg==',key_name='tempest-keypair-969259947',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:04:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8d71f189c68645b9893c7a1171fc594f',ramdisk_id='',reservation_id='r-dqskam0b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_
bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestFqdnHostnames-408381576',owner_user_name='tempest-ServersTestFqdnHostnames-408381576-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:04:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='63648aa2c42a435b8649b88978fe889b',uuid=ae422e71-aa79-4a45-ab48-e634bb09283b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "address": "fa:16:3e:a5:49:91", "network": {"id": "cc0afe66-c42b-4072-9318-4d70984b8705", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-740362094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d71f189c68645b9893c7a1171fc594f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e7550a-44", "ovs_interfaceid": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.159 182717 DEBUG nova.network.os_vif_util [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Converting VIF {"id": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "address": "fa:16:3e:a5:49:91", "network": {"id": "cc0afe66-c42b-4072-9318-4d70984b8705", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-740362094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d71f189c68645b9893c7a1171fc594f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e7550a-44", "ovs_interfaceid": "f0e7550a-44b2-4ebc-ae04-c9d364a285a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.161 182717 DEBUG nova.network.os_vif_util [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a5:49:91,bridge_name='br-int',has_traffic_filtering=True,id=f0e7550a-44b2-4ebc-ae04-c9d364a285a1,network=Network(cc0afe66-c42b-4072-9318-4d70984b8705),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e7550a-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.161 182717 DEBUG os_vif [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:49:91,bridge_name='br-int',has_traffic_filtering=True,id=f0e7550a-44b2-4ebc-ae04-c9d364a285a1,network=Network(cc0afe66-c42b-4072-9318-4d70984b8705),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e7550a-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.163 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.164 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0e7550a-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.165 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.167 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.173 182717 INFO os_vif [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:49:91,bridge_name='br-int',has_traffic_filtering=True,id=f0e7550a-44b2-4ebc-ae04-c9d364a285a1,network=Network(cc0afe66-c42b-4072-9318-4d70984b8705),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e7550a-44')
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.174 182717 INFO nova.virt.libvirt.driver [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Deleting instance files /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b_del
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.175 182717 INFO nova.virt.libvirt.driver [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Deletion of /var/lib/nova/instances/ae422e71-aa79-4a45-ab48-e634bb09283b_del complete
Jan 22 00:05:02 compute-1 podman[225791]: 2026-01-22 00:05:02.224819083 +0000 UTC m=+0.049005603 container remove 16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:05:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:02.230 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a53bf981-9c03-4acc-a8a1-f1e56535ddd0]: (4, ('Thu Jan 22 12:05:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705 (16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a)\n16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a\nThu Jan 22 12:05:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705 (16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a)\n16744090ccece7b4a70591dbf9761b2ef9f89f61194579f3be41545e56d4c17a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:02.232 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[93b3455e-8f2d-4864-9d37-2bd4605b57c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:02.234 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc0afe66-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.236 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:02 compute-1 kernel: tapcc0afe66-c0: left promiscuous mode
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.249 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:02.255 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7678083a-8c39-4cdc-97eb-bbe34e60fa48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:02.275 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff3bc2a-ed03-4e41-be9f-e84ed647c308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:02.277 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9d28436e-179d-4bce-97e4-d4745141fff5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.289 182717 INFO nova.compute.manager [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.290 182717 DEBUG oslo.service.loopingcall [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.290 182717 DEBUG nova.compute.manager [-] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:05:02 compute-1 nova_compute[182713]: 2026-01-22 00:05:02.291 182717 DEBUG nova.network.neutron [-] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:05:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:02.301 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5d954622-f989-40f9-bcbd-ef041a2e2e1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485459, 'reachable_time': 39119, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225806, 'error': None, 'target': 'ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:02.307 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cc0afe66-c42b-4072-9318-4d70984b8705 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:05:02 compute-1 systemd[1]: run-netns-ovnmeta\x2dcc0afe66\x2dc42b\x2d4072\x2d9318\x2d4d70984b8705.mount: Deactivated successfully.
Jan 22 00:05:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:02.307 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[6f3307a7-fa19-4a3c-8d21-ca24b05e0249]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:03.012 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:03.012 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:03.013 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:04 compute-1 podman[225807]: 2026-01-22 00:05:04.601959603 +0000 UTC m=+0.085851652 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:05:07 compute-1 nova_compute[182713]: 2026-01-22 00:05:07.021 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:07 compute-1 nova_compute[182713]: 2026-01-22 00:05:07.166 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:07 compute-1 nova_compute[182713]: 2026-01-22 00:05:07.801 182717 DEBUG nova.network.neutron [-] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:05:07 compute-1 nova_compute[182713]: 2026-01-22 00:05:07.839 182717 INFO nova.compute.manager [-] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Took 5.55 seconds to deallocate network for instance.
Jan 22 00:05:07 compute-1 nova_compute[182713]: 2026-01-22 00:05:07.959 182717 DEBUG nova.compute.manager [req-1c0e641a-1b34-4b62-a320-5d0c1d156228 req-7b8524f9-e2e9-4669-b964-7f3b5879c4bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Received event network-vif-deleted-f0e7550a-44b2-4ebc-ae04-c9d364a285a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:07 compute-1 nova_compute[182713]: 2026-01-22 00:05:07.963 182717 DEBUG oslo_concurrency.lockutils [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:07 compute-1 nova_compute[182713]: 2026-01-22 00:05:07.963 182717 DEBUG oslo_concurrency.lockutils [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:08 compute-1 nova_compute[182713]: 2026-01-22 00:05:08.061 182717 DEBUG nova.compute.provider_tree [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:05:08 compute-1 nova_compute[182713]: 2026-01-22 00:05:08.081 182717 DEBUG nova.scheduler.client.report [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:05:08 compute-1 nova_compute[182713]: 2026-01-22 00:05:08.104 182717 DEBUG oslo_concurrency.lockutils [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:08 compute-1 nova_compute[182713]: 2026-01-22 00:05:08.137 182717 INFO nova.scheduler.client.report [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Deleted allocations for instance ae422e71-aa79-4a45-ab48-e634bb09283b
Jan 22 00:05:08 compute-1 nova_compute[182713]: 2026-01-22 00:05:08.235 182717 DEBUG oslo_concurrency.lockutils [None req-6cade8b0-a14e-4580-97d8-a11b45f6e1fb 63648aa2c42a435b8649b88978fe889b 8d71f189c68645b9893c7a1171fc594f - - default default] Lock "ae422e71-aa79-4a45-ab48-e634bb09283b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:08 compute-1 nova_compute[182713]: 2026-01-22 00:05:08.381 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:08 compute-1 podman[225831]: 2026-01-22 00:05:08.602377215 +0000 UTC m=+0.086055478 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Jan 22 00:05:12 compute-1 nova_compute[182713]: 2026-01-22 00:05:12.022 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:12 compute-1 nova_compute[182713]: 2026-01-22 00:05:12.167 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:15 compute-1 nova_compute[182713]: 2026-01-22 00:05:15.180 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:15 compute-1 nova_compute[182713]: 2026-01-22 00:05:15.388 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.024 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.086 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040302.0849032, ae422e71-aa79-4a45-ab48-e634bb09283b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.086 182717 INFO nova.compute.manager [-] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] VM Stopped (Lifecycle Event)
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.164 182717 DEBUG nova.compute.manager [None req-6ecc4606-89f8-4fa5-aac1-e5a0435597cf - - - - - -] [instance: ae422e71-aa79-4a45-ab48-e634bb09283b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.169 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.565 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.565 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.583 182717 DEBUG nova.compute.manager [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.712 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.713 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.722 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.723 182717 INFO nova.compute.claims [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.862 182717 DEBUG nova.compute.provider_tree [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.884 182717 DEBUG nova.scheduler.client.report [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.916 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.917 182717 DEBUG nova.compute.manager [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.981 182717 DEBUG nova.compute.manager [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:05:17 compute-1 nova_compute[182713]: 2026-01-22 00:05:17.981 182717 DEBUG nova.network.neutron [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.010 182717 INFO nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.042 182717 DEBUG nova.compute.manager [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.187 182717 DEBUG nova.compute.manager [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.189 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.189 182717 INFO nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Creating image(s)
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.190 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.190 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.191 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.214 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.296 182717 DEBUG nova.policy [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.304 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.305 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.306 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.321 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.411 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.412 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.448 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.450 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.451 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.540 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.542 182717 DEBUG nova.virt.disk.api [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Checking if we can resize image /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.543 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:18 compute-1 podman[225864]: 2026-01-22 00:05:18.571048888 +0000 UTC m=+0.059441686 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.631 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.633 182717 DEBUG nova.virt.disk.api [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Cannot resize image /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.633 182717 DEBUG nova.objects.instance [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'migration_context' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:18 compute-1 podman[225863]: 2026-01-22 00:05:18.65628068 +0000 UTC m=+0.141815129 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.677 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.677 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Ensure instance console log exists: /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.678 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.678 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:18 compute-1 nova_compute[182713]: 2026-01-22 00:05:18.679 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:19 compute-1 nova_compute[182713]: 2026-01-22 00:05:19.420 182717 DEBUG nova.network.neutron [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Successfully created port: 609c277b-133c-4824-9fd7-17b756932543 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:05:21 compute-1 nova_compute[182713]: 2026-01-22 00:05:21.337 182717 DEBUG nova.network.neutron [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Successfully updated port: 609c277b-133c-4824-9fd7-17b756932543 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:05:21 compute-1 nova_compute[182713]: 2026-01-22 00:05:21.354 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:05:21 compute-1 nova_compute[182713]: 2026-01-22 00:05:21.354 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquired lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:05:21 compute-1 nova_compute[182713]: 2026-01-22 00:05:21.355 182717 DEBUG nova.network.neutron [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:05:21 compute-1 nova_compute[182713]: 2026-01-22 00:05:21.476 182717 DEBUG nova.compute.manager [req-0bd05473-6536-47ca-8029-253a1a510428 req-4888f627-0c9f-495e-9836-b64b7288b43c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-changed-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:21 compute-1 nova_compute[182713]: 2026-01-22 00:05:21.477 182717 DEBUG nova.compute.manager [req-0bd05473-6536-47ca-8029-253a1a510428 req-4888f627-0c9f-495e-9836-b64b7288b43c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Refreshing instance network info cache due to event network-changed-609c277b-133c-4824-9fd7-17b756932543. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:05:21 compute-1 nova_compute[182713]: 2026-01-22 00:05:21.477 182717 DEBUG oslo_concurrency.lockutils [req-0bd05473-6536-47ca-8029-253a1a510428 req-4888f627-0c9f-495e-9836-b64b7288b43c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:05:21 compute-1 nova_compute[182713]: 2026-01-22 00:05:21.547 182717 DEBUG nova.network.neutron [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.063 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.171 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.463 182717 DEBUG nova.network.neutron [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.501 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Releasing lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.502 182717 DEBUG nova.compute.manager [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance network_info: |[{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.503 182717 DEBUG oslo_concurrency.lockutils [req-0bd05473-6536-47ca-8029-253a1a510428 req-4888f627-0c9f-495e-9836-b64b7288b43c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.503 182717 DEBUG nova.network.neutron [req-0bd05473-6536-47ca-8029-253a1a510428 req-4888f627-0c9f-495e-9836-b64b7288b43c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Refreshing network info cache for port 609c277b-133c-4824-9fd7-17b756932543 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.508 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Start _get_guest_xml network_info=[{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.517 182717 WARNING nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.523 182717 DEBUG nova.virt.libvirt.host [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.524 182717 DEBUG nova.virt.libvirt.host [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.529 182717 DEBUG nova.virt.libvirt.host [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.530 182717 DEBUG nova.virt.libvirt.host [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.531 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.532 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.532 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.532 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.532 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.533 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.533 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.533 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.533 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.533 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.534 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.534 182717 DEBUG nova.virt.hardware [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.537 182717 DEBUG nova.virt.libvirt.vif [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:05:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1989312991',display_name='tempest-ServerActionsTestOtherB-server-1989312991',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1989312991',id=92,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-sfv20ppo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:05:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a7650c58-4663-47b0-8499-d470f8edddbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.537 182717 DEBUG nova.network.os_vif_util [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.538 182717 DEBUG nova.network.os_vif_util [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.538 182717 DEBUG nova.objects.instance [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'pci_devices' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.685 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:05:22 compute-1 nova_compute[182713]:   <uuid>a7650c58-4663-47b0-8499-d470f8edddbd</uuid>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   <name>instance-0000005c</name>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerActionsTestOtherB-server-1989312991</nova:name>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:05:22</nova:creationTime>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:05:22 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:05:22 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:05:22 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:05:22 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:05:22 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:05:22 compute-1 nova_compute[182713]:         <nova:user uuid="365f219cd09c471fa6275faa2fe5e2a1">tempest-ServerActionsTestOtherB-1685479237-project-member</nova:user>
Jan 22 00:05:22 compute-1 nova_compute[182713]:         <nova:project uuid="b26cf6f4abd54e30aac169a3cbca648c">tempest-ServerActionsTestOtherB-1685479237</nova:project>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:05:22 compute-1 nova_compute[182713]:         <nova:port uuid="609c277b-133c-4824-9fd7-17b756932543">
Jan 22 00:05:22 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <system>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <entry name="serial">a7650c58-4663-47b0-8499-d470f8edddbd</entry>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <entry name="uuid">a7650c58-4663-47b0-8499-d470f8edddbd</entry>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     </system>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   <os>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   </os>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   <features>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   </features>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.config"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:4e:1a:fc"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <target dev="tap609c277b-13"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/console.log" append="off"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <video>
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     </video>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:05:22 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:05:22 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:05:22 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:05:22 compute-1 nova_compute[182713]: </domain>
Jan 22 00:05:22 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.687 182717 DEBUG nova.compute.manager [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Preparing to wait for external event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.688 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.688 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.689 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.690 182717 DEBUG nova.virt.libvirt.vif [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:05:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1989312991',display_name='tempest-ServerActionsTestOtherB-server-1989312991',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1989312991',id=92,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-sfv20ppo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:05:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a7650c58-4663-47b0-8499-d470f8edddbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.690 182717 DEBUG nova.network.os_vif_util [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.692 182717 DEBUG nova.network.os_vif_util [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.692 182717 DEBUG os_vif [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.693 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.694 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.694 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.699 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.699 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap609c277b-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.700 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap609c277b-13, col_values=(('external_ids', {'iface-id': '609c277b-133c-4824-9fd7-17b756932543', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:1a:fc', 'vm-uuid': 'a7650c58-4663-47b0-8499-d470f8edddbd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.702 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:22 compute-1 NetworkManager[54952]: <info>  [1769040322.7044] manager: (tap609c277b-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/185)
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.705 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.711 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.712 182717 INFO os_vif [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13')
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.805 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.806 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.806 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No VIF found with MAC fa:16:3e:4e:1a:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:05:22 compute-1 nova_compute[182713]: 2026-01-22 00:05:22.807 182717 INFO nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Using config drive
Jan 22 00:05:23 compute-1 nova_compute[182713]: 2026-01-22 00:05:23.511 182717 INFO nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Creating config drive at /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.config
Jan 22 00:05:23 compute-1 nova_compute[182713]: 2026-01-22 00:05:23.517 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyi77e_fn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:23 compute-1 nova_compute[182713]: 2026-01-22 00:05:23.644 182717 DEBUG oslo_concurrency.processutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyi77e_fn" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:23 compute-1 kernel: tap609c277b-13: entered promiscuous mode
Jan 22 00:05:23 compute-1 ovn_controller[94841]: 2026-01-22T00:05:23Z|00381|binding|INFO|Claiming lport 609c277b-133c-4824-9fd7-17b756932543 for this chassis.
Jan 22 00:05:23 compute-1 ovn_controller[94841]: 2026-01-22T00:05:23Z|00382|binding|INFO|609c277b-133c-4824-9fd7-17b756932543: Claiming fa:16:3e:4e:1a:fc 10.100.0.7
Jan 22 00:05:23 compute-1 NetworkManager[54952]: <info>  [1769040323.7483] manager: (tap609c277b-13): new Tun device (/org/freedesktop/NetworkManager/Devices/186)
Jan 22 00:05:23 compute-1 nova_compute[182713]: 2026-01-22 00:05:23.747 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.768 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:1a:fc 10.100.0.7'], port_security=['fa:16:3e:4e:1a:fc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '80fb8d02-77b3-43f5-9cd3-4114236093b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=609c277b-133c-4824-9fd7-17b756932543) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.769 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 609c277b-133c-4824-9fd7-17b756932543 in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb bound to our chassis
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.771 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb
Jan 22 00:05:23 compute-1 systemd-udevd[225957]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.786 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[57f785a3-be6f-44ee-9d0f-73caa6131144]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.787 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1a4bd631-61 in ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.791 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1a4bd631-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.791 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a55771b1-9497-428c-a92c-662dc8f51c74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.792 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0ba831da-cadd-4032-9597-c14bad1e6c84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:23 compute-1 NetworkManager[54952]: <info>  [1769040323.8050] device (tap609c277b-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:05:23 compute-1 NetworkManager[54952]: <info>  [1769040323.8068] device (tap609c277b-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.816 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed07d6d-6aae-4cf6-b749-1d6a8ec8eac0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:23 compute-1 systemd-machined[153970]: New machine qemu-43-instance-0000005c.
Jan 22 00:05:23 compute-1 ovn_controller[94841]: 2026-01-22T00:05:23Z|00383|binding|INFO|Setting lport 609c277b-133c-4824-9fd7-17b756932543 ovn-installed in OVS
Jan 22 00:05:23 compute-1 ovn_controller[94841]: 2026-01-22T00:05:23Z|00384|binding|INFO|Setting lport 609c277b-133c-4824-9fd7-17b756932543 up in Southbound
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.831 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3347fb2a-d22e-4e87-a647-6421c777494b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:23 compute-1 nova_compute[182713]: 2026-01-22 00:05:23.832 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:23 compute-1 systemd[1]: Started Virtual Machine qemu-43-instance-0000005c.
Jan 22 00:05:23 compute-1 podman[225929]: 2026-01-22 00:05:23.84523198 +0000 UTC m=+0.117612193 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 22 00:05:23 compute-1 podman[225930]: 2026-01-22 00:05:23.849319536 +0000 UTC m=+0.105019054 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.875 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4e0ea1-5fae-4bbd-bd58-a8a06491cc4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.880 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[eff02787-c3b6-4e15-b9d6-c5f2727bac39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:23 compute-1 systemd-udevd[225967]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:05:23 compute-1 NetworkManager[54952]: <info>  [1769040323.8819] manager: (tap1a4bd631-60): new Veth device (/org/freedesktop/NetworkManager/Devices/187)
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.916 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[6955f225-14a0-444f-a32b-165cf245413f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.919 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a33a53a5-1dd1-4679-b987-c655e3e696a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:23 compute-1 NetworkManager[54952]: <info>  [1769040323.9493] device (tap1a4bd631-60): carrier: link connected
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.956 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[22a6a82b-76af-4177-a3bb-b5d818e5c3ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.976 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[967f5272-b0a7-4528-8e68-558a6985c877]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489660, 'reachable_time': 40262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226010, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:23.992 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1a6bdbcc-7d00-4106-b394-6cd8c39c4ef3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:7833'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489660, 'tstamp': 489660}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226011, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:24.012 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[669f4a9d-e003-46c7-89ae-3b2a023337f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489660, 'reachable_time': 40262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226012, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:24.051 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5d4cdd32-5494-4790-800c-fce2a2facdf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.108 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040324.107088, a7650c58-4663-47b0-8499-d470f8edddbd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.109 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] VM Started (Lifecycle Event)
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:24.120 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b164d2eb-919d-454a-874e-00ec145f4e5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:24.122 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:24.123 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:24.124 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a4bd631-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.126 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:24 compute-1 NetworkManager[54952]: <info>  [1769040324.1273] manager: (tap1a4bd631-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Jan 22 00:05:24 compute-1 kernel: tap1a4bd631-60: entered promiscuous mode
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:24.130 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a4bd631-60, col_values=(('external_ids', {'iface-id': 'c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.131 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:24 compute-1 ovn_controller[94841]: 2026-01-22T00:05:24Z|00385|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.132 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:24.132 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:24.133 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[341f854a-257c-41c8-b981-b428df8b9d79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:24.135 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-1a4bd631-64c5-4e00-9341-0e44fd0833fb
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 1a4bd631-64c5-4e00-9341-0e44fd0833fb
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:05:24 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:24.136 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'env', 'PROCESS_TAG=haproxy-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1a4bd631-64c5-4e00-9341-0e44fd0833fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.145 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.155 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.160 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040324.1074286, a7650c58-4663-47b0-8499-d470f8edddbd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.160 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] VM Paused (Lifecycle Event)
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.187 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.191 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.226 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.263 182717 DEBUG nova.compute.manager [req-31d5bb91-aa6e-4854-a3e4-1b949aa50807 req-4cca4374-6749-4843-a789-e7d00db1200b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.263 182717 DEBUG oslo_concurrency.lockutils [req-31d5bb91-aa6e-4854-a3e4-1b949aa50807 req-4cca4374-6749-4843-a789-e7d00db1200b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.264 182717 DEBUG oslo_concurrency.lockutils [req-31d5bb91-aa6e-4854-a3e4-1b949aa50807 req-4cca4374-6749-4843-a789-e7d00db1200b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.264 182717 DEBUG oslo_concurrency.lockutils [req-31d5bb91-aa6e-4854-a3e4-1b949aa50807 req-4cca4374-6749-4843-a789-e7d00db1200b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.264 182717 DEBUG nova.compute.manager [req-31d5bb91-aa6e-4854-a3e4-1b949aa50807 req-4cca4374-6749-4843-a789-e7d00db1200b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Processing event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.265 182717 DEBUG nova.compute.manager [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.269 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040324.2690108, a7650c58-4663-47b0-8499-d470f8edddbd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.269 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] VM Resumed (Lifecycle Event)
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.291 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.295 182717 INFO nova.virt.libvirt.driver [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance spawned successfully.
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.296 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.326 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.332 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.333 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.333 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.334 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.335 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.335 182717 DEBUG nova.virt.libvirt.driver [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.341 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.363 182717 DEBUG nova.network.neutron [req-0bd05473-6536-47ca-8029-253a1a510428 req-4888f627-0c9f-495e-9836-b64b7288b43c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updated VIF entry in instance network info cache for port 609c277b-133c-4824-9fd7-17b756932543. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.363 182717 DEBUG nova.network.neutron [req-0bd05473-6536-47ca-8029-253a1a510428 req-4888f627-0c9f-495e-9836-b64b7288b43c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.391 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.434 182717 DEBUG oslo_concurrency.lockutils [req-0bd05473-6536-47ca-8029-253a1a510428 req-4888f627-0c9f-495e-9836-b64b7288b43c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.462 182717 INFO nova.compute.manager [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Took 6.27 seconds to spawn the instance on the hypervisor.
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.462 182717 DEBUG nova.compute.manager [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.548 182717 INFO nova.compute.manager [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Took 6.91 seconds to build instance.
Jan 22 00:05:24 compute-1 nova_compute[182713]: 2026-01-22 00:05:24.576 182717 DEBUG oslo_concurrency.lockutils [None req-23c439ba-fd09-452e-b51e-2fc800e83e05 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:24 compute-1 podman[226051]: 2026-01-22 00:05:24.578046798 +0000 UTC m=+0.068145286 container create 3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 00:05:24 compute-1 systemd[1]: Started libpod-conmon-3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9.scope.
Jan 22 00:05:24 compute-1 podman[226051]: 2026-01-22 00:05:24.544521272 +0000 UTC m=+0.034619750 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:05:24 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:05:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e14a03f644312b385aef31efd2e28794ea27658cb2ae9a3a3e562fce889456c7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:05:24 compute-1 podman[226051]: 2026-01-22 00:05:24.667542601 +0000 UTC m=+0.157641109 container init 3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 00:05:24 compute-1 podman[226051]: 2026-01-22 00:05:24.675091994 +0000 UTC m=+0.165190442 container start 3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:05:24 compute-1 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226066]: [NOTICE]   (226070) : New worker (226072) forked
Jan 22 00:05:24 compute-1 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226066]: [NOTICE]   (226070) : Loading success.
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.260 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:26 compute-1 NetworkManager[54952]: <info>  [1769040326.2613] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/189)
Jan 22 00:05:26 compute-1 NetworkManager[54952]: <info>  [1769040326.2622] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.378 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:26 compute-1 ovn_controller[94841]: 2026-01-22T00:05:26Z|00386|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.388 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.577 182717 DEBUG nova.compute.manager [req-6eee8c85-9299-48e1-921c-b1d83839b10f req-880b6c30-692d-4f23-96ba-9de910ed94bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.579 182717 DEBUG oslo_concurrency.lockutils [req-6eee8c85-9299-48e1-921c-b1d83839b10f req-880b6c30-692d-4f23-96ba-9de910ed94bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.579 182717 DEBUG oslo_concurrency.lockutils [req-6eee8c85-9299-48e1-921c-b1d83839b10f req-880b6c30-692d-4f23-96ba-9de910ed94bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.580 182717 DEBUG oslo_concurrency.lockutils [req-6eee8c85-9299-48e1-921c-b1d83839b10f req-880b6c30-692d-4f23-96ba-9de910ed94bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.580 182717 DEBUG nova.compute.manager [req-6eee8c85-9299-48e1-921c-b1d83839b10f req-880b6c30-692d-4f23-96ba-9de910ed94bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] No waiting events found dispatching network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.581 182717 WARNING nova.compute.manager [req-6eee8c85-9299-48e1-921c-b1d83839b10f req-880b6c30-692d-4f23-96ba-9de910ed94bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received unexpected event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 for instance with vm_state active and task_state None.
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.862 182717 DEBUG nova.compute.manager [req-61f9934c-d217-4949-8ec0-31912efe825b req-0796d2c7-7e2a-444c-a2d7-cbd4f6abaf83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-changed-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.863 182717 DEBUG nova.compute.manager [req-61f9934c-d217-4949-8ec0-31912efe825b req-0796d2c7-7e2a-444c-a2d7-cbd4f6abaf83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Refreshing instance network info cache due to event network-changed-609c277b-133c-4824-9fd7-17b756932543. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.863 182717 DEBUG oslo_concurrency.lockutils [req-61f9934c-d217-4949-8ec0-31912efe825b req-0796d2c7-7e2a-444c-a2d7-cbd4f6abaf83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.863 182717 DEBUG oslo_concurrency.lockutils [req-61f9934c-d217-4949-8ec0-31912efe825b req-0796d2c7-7e2a-444c-a2d7-cbd4f6abaf83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:05:26 compute-1 nova_compute[182713]: 2026-01-22 00:05:26.864 182717 DEBUG nova.network.neutron [req-61f9934c-d217-4949-8ec0-31912efe825b req-0796d2c7-7e2a-444c-a2d7-cbd4f6abaf83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Refreshing network info cache for port 609c277b-133c-4824-9fd7-17b756932543 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:05:27 compute-1 nova_compute[182713]: 2026-01-22 00:05:27.060 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:27 compute-1 nova_compute[182713]: 2026-01-22 00:05:27.703 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:28 compute-1 nova_compute[182713]: 2026-01-22 00:05:28.092 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:28 compute-1 nova_compute[182713]: 2026-01-22 00:05:28.613 182717 DEBUG nova.network.neutron [req-61f9934c-d217-4949-8ec0-31912efe825b req-0796d2c7-7e2a-444c-a2d7-cbd4f6abaf83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updated VIF entry in instance network info cache for port 609c277b-133c-4824-9fd7-17b756932543. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:05:28 compute-1 nova_compute[182713]: 2026-01-22 00:05:28.614 182717 DEBUG nova.network.neutron [req-61f9934c-d217-4949-8ec0-31912efe825b req-0796d2c7-7e2a-444c-a2d7-cbd4f6abaf83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:05:28 compute-1 nova_compute[182713]: 2026-01-22 00:05:28.636 182717 DEBUG oslo_concurrency.lockutils [req-61f9934c-d217-4949-8ec0-31912efe825b req-0796d2c7-7e2a-444c-a2d7-cbd4f6abaf83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:05:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:29.220 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:05:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:29.221 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:05:29 compute-1 nova_compute[182713]: 2026-01-22 00:05:29.222 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:31 compute-1 nova_compute[182713]: 2026-01-22 00:05:31.478 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:32 compute-1 nova_compute[182713]: 2026-01-22 00:05:32.061 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:32 compute-1 nova_compute[182713]: 2026-01-22 00:05:32.707 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:35 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:05:35.225 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:35 compute-1 podman[226098]: 2026-01-22 00:05:35.601033705 +0000 UTC m=+0.081190818 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:05:35 compute-1 nova_compute[182713]: 2026-01-22 00:05:35.698 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:35 compute-1 nova_compute[182713]: 2026-01-22 00:05:35.700 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:35 compute-1 nova_compute[182713]: 2026-01-22 00:05:35.700 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:35 compute-1 nova_compute[182713]: 2026-01-22 00:05:35.700 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:05:35 compute-1 nova_compute[182713]: 2026-01-22 00:05:35.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:35 compute-1 nova_compute[182713]: 2026-01-22 00:05:35.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:36 compute-1 ovn_controller[94841]: 2026-01-22T00:05:36Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:1a:fc 10.100.0.7
Jan 22 00:05:36 compute-1 ovn_controller[94841]: 2026-01-22T00:05:36Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:1a:fc 10.100.0.7
Jan 22 00:05:36 compute-1 nova_compute[182713]: 2026-01-22 00:05:36.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:37 compute-1 nova_compute[182713]: 2026-01-22 00:05:37.066 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:37 compute-1 nova_compute[182713]: 2026-01-22 00:05:37.388 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:37 compute-1 nova_compute[182713]: 2026-01-22 00:05:37.710 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:38 compute-1 nova_compute[182713]: 2026-01-22 00:05:38.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:39 compute-1 podman[226118]: 2026-01-22 00:05:39.620606148 +0000 UTC m=+0.105342323 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:05:39 compute-1 nova_compute[182713]: 2026-01-22 00:05:39.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:39 compute-1 nova_compute[182713]: 2026-01-22 00:05:39.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:05:39 compute-1 nova_compute[182713]: 2026-01-22 00:05:39.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:05:40 compute-1 nova_compute[182713]: 2026-01-22 00:05:40.822 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:05:40 compute-1 nova_compute[182713]: 2026-01-22 00:05:40.823 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:05:40 compute-1 nova_compute[182713]: 2026-01-22 00:05:40.823 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:05:40 compute-1 nova_compute[182713]: 2026-01-22 00:05:40.824 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:42 compute-1 nova_compute[182713]: 2026-01-22 00:05:42.069 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-1 nova_compute[182713]: 2026-01-22 00:05:42.712 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.561 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.606 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.607 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.608 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.608 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.642 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.643 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.644 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.644 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.745 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.832 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.833 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:43 compute-1 nova_compute[182713]: 2026-01-22 00:05:43.927 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:44 compute-1 nova_compute[182713]: 2026-01-22 00:05:44.147 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:05:44 compute-1 nova_compute[182713]: 2026-01-22 00:05:44.149 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5532MB free_disk=73.27484130859375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:05:44 compute-1 nova_compute[182713]: 2026-01-22 00:05:44.149 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:44 compute-1 nova_compute[182713]: 2026-01-22 00:05:44.150 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:44 compute-1 nova_compute[182713]: 2026-01-22 00:05:44.260 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance a7650c58-4663-47b0-8499-d470f8edddbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:05:44 compute-1 nova_compute[182713]: 2026-01-22 00:05:44.261 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:05:44 compute-1 nova_compute[182713]: 2026-01-22 00:05:44.261 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:05:44 compute-1 nova_compute[182713]: 2026-01-22 00:05:44.331 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:05:44 compute-1 nova_compute[182713]: 2026-01-22 00:05:44.350 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:05:44 compute-1 nova_compute[182713]: 2026-01-22 00:05:44.390 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:05:44 compute-1 nova_compute[182713]: 2026-01-22 00:05:44.390 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:46 compute-1 nova_compute[182713]: 2026-01-22 00:05:46.607 182717 DEBUG nova.compute.manager [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:46 compute-1 nova_compute[182713]: 2026-01-22 00:05:46.699 182717 INFO nova.compute.manager [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] instance snapshotting
Jan 22 00:05:46 compute-1 nova_compute[182713]: 2026-01-22 00:05:46.700 182717 DEBUG nova.objects.instance [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'flavor' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.071 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.184 182717 INFO nova.virt.libvirt.driver [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Beginning live snapshot process
Jan 22 00:05:47 compute-1 virtqemud[182235]: invalid argument: disk vda does not have an active block job
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.383 182717 DEBUG oslo_concurrency.processutils [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.474 182717 DEBUG oslo_concurrency.processutils [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.475 182717 DEBUG oslo_concurrency.processutils [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.539 182717 DEBUG oslo_concurrency.processutils [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.557 182717 DEBUG oslo_concurrency.processutils [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.611 182717 DEBUG oslo_concurrency.processutils [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.613 182717 DEBUG oslo_concurrency.processutils [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpfyqz8zde/97630bf9f5f549e6bf98f49e7a791754.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.647 182717 DEBUG oslo_concurrency.processutils [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpfyqz8zde/97630bf9f5f549e6bf98f49e7a791754.delta 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.648 182717 INFO nova.virt.libvirt.driver [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.704 182717 DEBUG nova.virt.libvirt.guest [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 00:05:47 compute-1 nova_compute[182713]: 2026-01-22 00:05:47.715 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:48 compute-1 nova_compute[182713]: 2026-01-22 00:05:48.209 182717 DEBUG nova.virt.libvirt.guest [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 00:05:48 compute-1 nova_compute[182713]: 2026-01-22 00:05:48.213 182717 INFO nova.virt.libvirt.driver [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 22 00:05:48 compute-1 nova_compute[182713]: 2026-01-22 00:05:48.267 182717 DEBUG nova.privsep.utils [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 00:05:48 compute-1 nova_compute[182713]: 2026-01-22 00:05:48.268 182717 DEBUG oslo_concurrency.processutils [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpfyqz8zde/97630bf9f5f549e6bf98f49e7a791754.delta /var/lib/nova/instances/snapshots/tmpfyqz8zde/97630bf9f5f549e6bf98f49e7a791754 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:48 compute-1 nova_compute[182713]: 2026-01-22 00:05:48.719 182717 DEBUG oslo_concurrency.processutils [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpfyqz8zde/97630bf9f5f549e6bf98f49e7a791754.delta /var/lib/nova/instances/snapshots/tmpfyqz8zde/97630bf9f5f549e6bf98f49e7a791754" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:48 compute-1 nova_compute[182713]: 2026-01-22 00:05:48.726 182717 INFO nova.virt.libvirt.driver [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Snapshot extracted, beginning image upload
Jan 22 00:05:49 compute-1 podman[226179]: 2026-01-22 00:05:49.61059026 +0000 UTC m=+0.081036653 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:05:49 compute-1 podman[226178]: 2026-01-22 00:05:49.662667918 +0000 UTC m=+0.142859591 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:05:50 compute-1 nova_compute[182713]: 2026-01-22 00:05:50.450 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:52 compute-1 nova_compute[182713]: 2026-01-22 00:05:52.073 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:52 compute-1 nova_compute[182713]: 2026-01-22 00:05:52.717 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:54 compute-1 podman[226231]: 2026-01-22 00:05:54.572735478 +0000 UTC m=+0.059129037 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:05:54 compute-1 podman[226230]: 2026-01-22 00:05:54.595326356 +0000 UTC m=+0.078335480 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:05:55 compute-1 nova_compute[182713]: 2026-01-22 00:05:55.672 182717 INFO nova.virt.libvirt.driver [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Snapshot image upload complete
Jan 22 00:05:55 compute-1 nova_compute[182713]: 2026-01-22 00:05:55.673 182717 INFO nova.compute.manager [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Took 8.93 seconds to snapshot the instance on the hypervisor.
Jan 22 00:05:56 compute-1 nova_compute[182713]: 2026-01-22 00:05:56.544 182717 DEBUG nova.compute.manager [None req-5f28dc63-3e15-4ba7-a2b2-31a56c5d826a 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Found 1 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Jan 22 00:05:57 compute-1 nova_compute[182713]: 2026-01-22 00:05:57.076 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:57 compute-1 nova_compute[182713]: 2026-01-22 00:05:57.719 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:59 compute-1 nova_compute[182713]: 2026-01-22 00:05:59.731 182717 DEBUG nova.compute.manager [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:59 compute-1 nova_compute[182713]: 2026-01-22 00:05:59.829 182717 INFO nova.compute.manager [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] instance snapshotting
Jan 22 00:05:59 compute-1 nova_compute[182713]: 2026-01-22 00:05:59.831 182717 DEBUG nova.objects.instance [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'flavor' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:00 compute-1 nova_compute[182713]: 2026-01-22 00:06:00.773 182717 INFO nova.virt.libvirt.driver [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Beginning live snapshot process
Jan 22 00:06:01 compute-1 virtqemud[182235]: invalid argument: disk vda does not have an active block job
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.032 182717 DEBUG oslo_concurrency.processutils [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.088 182717 DEBUG oslo_concurrency.processutils [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.089 182717 DEBUG oslo_concurrency.processutils [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.153 182717 DEBUG oslo_concurrency.processutils [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.181 182717 DEBUG oslo_concurrency.processutils [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.256 182717 DEBUG oslo_concurrency.processutils [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.258 182717 DEBUG oslo_concurrency.processutils [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpa5bkjncf/a030201989634a0391b8f12a9e0a8949.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.302 182717 DEBUG oslo_concurrency.processutils [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpa5bkjncf/a030201989634a0391b8f12a9e0a8949.delta 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.304 182717 INFO nova.virt.libvirt.driver [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.353 182717 DEBUG nova.virt.libvirt.guest [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.858 182717 DEBUG nova.virt.libvirt.guest [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.864 182717 INFO nova.virt.libvirt.driver [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.918 182717 DEBUG nova.privsep.utils [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.920 182717 DEBUG oslo_concurrency.processutils [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpa5bkjncf/a030201989634a0391b8f12a9e0a8949.delta /var/lib/nova/instances/snapshots/tmpa5bkjncf/a030201989634a0391b8f12a9e0a8949 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:01 compute-1 nova_compute[182713]: 2026-01-22 00:06:01.982 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:02 compute-1 nova_compute[182713]: 2026-01-22 00:06:02.078 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:02 compute-1 nova_compute[182713]: 2026-01-22 00:06:02.422 182717 DEBUG oslo_concurrency.processutils [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpa5bkjncf/a030201989634a0391b8f12a9e0a8949.delta /var/lib/nova/instances/snapshots/tmpa5bkjncf/a030201989634a0391b8f12a9e0a8949" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:02 compute-1 nova_compute[182713]: 2026-01-22 00:06:02.429 182717 INFO nova.virt.libvirt.driver [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Snapshot extracted, beginning image upload
Jan 22 00:06:02 compute-1 nova_compute[182713]: 2026-01-22 00:06:02.721 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:03.013 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:03.014 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:03.015 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:05 compute-1 nova_compute[182713]: 2026-01-22 00:06:05.394 182717 INFO nova.virt.libvirt.driver [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Snapshot image upload complete
Jan 22 00:06:05 compute-1 nova_compute[182713]: 2026-01-22 00:06:05.395 182717 INFO nova.compute.manager [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Took 5.49 seconds to snapshot the instance on the hypervisor.
Jan 22 00:06:05 compute-1 nova_compute[182713]: 2026-01-22 00:06:05.846 182717 DEBUG nova.compute.manager [None req-b8ba8cab-0c23-4b01-968e-dd2a05889bb4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Found 2 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Jan 22 00:06:05 compute-1 ovn_controller[94841]: 2026-01-22T00:06:05Z|00387|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 22 00:06:05 compute-1 nova_compute[182713]: 2026-01-22 00:06:05.977 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:06 compute-1 podman[226307]: 2026-01-22 00:06:06.590761991 +0000 UTC m=+0.077481464 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 00:06:07 compute-1 nova_compute[182713]: 2026-01-22 00:06:07.081 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:07 compute-1 nova_compute[182713]: 2026-01-22 00:06:07.723 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:08 compute-1 nova_compute[182713]: 2026-01-22 00:06:08.874 182717 DEBUG nova.compute.manager [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:08 compute-1 nova_compute[182713]: 2026-01-22 00:06:08.954 182717 INFO nova.compute.manager [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] instance snapshotting
Jan 22 00:06:08 compute-1 nova_compute[182713]: 2026-01-22 00:06:08.955 182717 DEBUG nova.objects.instance [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'flavor' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:09 compute-1 nova_compute[182713]: 2026-01-22 00:06:09.372 182717 INFO nova.virt.libvirt.driver [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Beginning live snapshot process
Jan 22 00:06:09 compute-1 nova_compute[182713]: 2026-01-22 00:06:09.425 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:09.425 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:06:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:09.427 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:06:09 compute-1 virtqemud[182235]: invalid argument: disk vda does not have an active block job
Jan 22 00:06:09 compute-1 nova_compute[182713]: 2026-01-22 00:06:09.780 182717 DEBUG oslo_concurrency.processutils [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:09 compute-1 nova_compute[182713]: 2026-01-22 00:06:09.869 182717 DEBUG oslo_concurrency.processutils [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:09 compute-1 nova_compute[182713]: 2026-01-22 00:06:09.871 182717 DEBUG oslo_concurrency.processutils [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:09 compute-1 nova_compute[182713]: 2026-01-22 00:06:09.939 182717 DEBUG oslo_concurrency.processutils [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json -f qcow2" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:09 compute-1 nova_compute[182713]: 2026-01-22 00:06:09.956 182717 DEBUG oslo_concurrency.processutils [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:10 compute-1 nova_compute[182713]: 2026-01-22 00:06:10.022 182717 DEBUG oslo_concurrency.processutils [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:10 compute-1 nova_compute[182713]: 2026-01-22 00:06:10.023 182717 DEBUG oslo_concurrency.processutils [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpyjtl3tsi/8b5f2318417a4e6b8c2e5710ea3fdeba.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:10 compute-1 nova_compute[182713]: 2026-01-22 00:06:10.060 182717 DEBUG oslo_concurrency.processutils [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpyjtl3tsi/8b5f2318417a4e6b8c2e5710ea3fdeba.delta 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:10 compute-1 nova_compute[182713]: 2026-01-22 00:06:10.061 182717 INFO nova.virt.libvirt.driver [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 22 00:06:10 compute-1 nova_compute[182713]: 2026-01-22 00:06:10.134 182717 DEBUG nova.virt.libvirt.guest [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 00:06:10 compute-1 podman[226342]: 2026-01-22 00:06:10.171941827 +0000 UTC m=+0.072766648 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Jan 22 00:06:10 compute-1 nova_compute[182713]: 2026-01-22 00:06:10.638 182717 DEBUG nova.virt.libvirt.guest [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 00:06:10 compute-1 nova_compute[182713]: 2026-01-22 00:06:10.642 182717 INFO nova.virt.libvirt.driver [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 22 00:06:10 compute-1 nova_compute[182713]: 2026-01-22 00:06:10.698 182717 DEBUG nova.privsep.utils [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 00:06:10 compute-1 nova_compute[182713]: 2026-01-22 00:06:10.699 182717 DEBUG oslo_concurrency.processutils [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpyjtl3tsi/8b5f2318417a4e6b8c2e5710ea3fdeba.delta /var/lib/nova/instances/snapshots/tmpyjtl3tsi/8b5f2318417a4e6b8c2e5710ea3fdeba execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:11 compute-1 nova_compute[182713]: 2026-01-22 00:06:11.173 182717 DEBUG oslo_concurrency.processutils [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpyjtl3tsi/8b5f2318417a4e6b8c2e5710ea3fdeba.delta /var/lib/nova/instances/snapshots/tmpyjtl3tsi/8b5f2318417a4e6b8c2e5710ea3fdeba" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:11 compute-1 nova_compute[182713]: 2026-01-22 00:06:11.184 182717 INFO nova.virt.libvirt.driver [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Snapshot extracted, beginning image upload
Jan 22 00:06:12 compute-1 nova_compute[182713]: 2026-01-22 00:06:12.084 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:12 compute-1 nova_compute[182713]: 2026-01-22 00:06:12.726 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:14 compute-1 ovn_controller[94841]: 2026-01-22T00:06:14Z|00388|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 22 00:06:14 compute-1 nova_compute[182713]: 2026-01-22 00:06:14.437 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:14 compute-1 nova_compute[182713]: 2026-01-22 00:06:14.768 182717 INFO nova.virt.libvirt.driver [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Snapshot image upload complete
Jan 22 00:06:14 compute-1 nova_compute[182713]: 2026-01-22 00:06:14.770 182717 INFO nova.compute.manager [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Took 5.77 seconds to snapshot the instance on the hypervisor.
Jan 22 00:06:15 compute-1 nova_compute[182713]: 2026-01-22 00:06:15.233 182717 DEBUG nova.compute.manager [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Found 3 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Jan 22 00:06:15 compute-1 nova_compute[182713]: 2026-01-22 00:06:15.234 182717 DEBUG nova.compute.manager [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Rotating out 1 backups _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4458
Jan 22 00:06:15 compute-1 nova_compute[182713]: 2026-01-22 00:06:15.234 182717 DEBUG nova.compute.manager [None req-1b96437d-35da-4e0e-ab07-67e10c5e44cd 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Deleting image f40b9b79-f749-404b-912a-0c0202e47813 _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4463
Jan 22 00:06:17 compute-1 nova_compute[182713]: 2026-01-22 00:06:17.085 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:17 compute-1 nova_compute[182713]: 2026-01-22 00:06:17.728 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:18.430 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:19 compute-1 ovn_controller[94841]: 2026-01-22T00:06:19Z|00389|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 22 00:06:19 compute-1 nova_compute[182713]: 2026-01-22 00:06:19.672 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:20 compute-1 podman[226380]: 2026-01-22 00:06:20.593197105 +0000 UTC m=+0.077188254 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:06:20 compute-1 podman[226379]: 2026-01-22 00:06:20.620115717 +0000 UTC m=+0.111314849 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 00:06:22 compute-1 nova_compute[182713]: 2026-01-22 00:06:22.088 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:22 compute-1 nova_compute[182713]: 2026-01-22 00:06:22.730 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.875 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005c', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'hostId': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.877 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.891 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.893 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '511a39fb-f427-4cbf-8191-f6f75dedecd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-vda', 'timestamp': '2026-01-22T00:06:22.877508', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fa94d60-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.584883503, 'message_signature': '52b7cf016f3494e5424498d27327d8be11793de797adbbafbe4803f9cdd8a9a1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-sda', 'timestamp': '2026-01-22T00:06:22.877508', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fa96854-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.584883503, 'message_signature': '3e44ea93aa222cae7820fe84070fc579029730a7a7539f39656466b97013f8ba'}]}, 'timestamp': '2026-01-22 00:06:22.893803', '_unique_id': '8a06be14d93a491dbf571ff3121ca066'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.897 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.898 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.901 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a7650c58-4663-47b0-8499-d470f8edddbd / tap609c277b-13 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.902 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de1d705a-2021-4725-89df-6d8544e16a1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-0000005c-a7650c58-4663-47b0-8499-d470f8edddbd-tap609c277b-13', 'timestamp': '2026-01-22T00:06:22.899072', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'tap609c277b-13', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:1a:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap609c277b-13'}, 'message_id': '2faac690-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.606338035, 'message_signature': 'a51c8e7cbd99825a28156b8c6192d676c35376d016fa4963c803358618b25e5d'}]}, 'timestamp': '2026-01-22 00:06:22.902698', '_unique_id': '52d8463045a4475ca1dc9ba1274381d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.903 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.905 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.905 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.905 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1989312991>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1989312991>]
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.905 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.906 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '912362ac-6e78-42bf-a7e0-ff9b0bf0b50a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-0000005c-a7650c58-4663-47b0-8499-d470f8edddbd-tap609c277b-13', 'timestamp': '2026-01-22T00:06:22.906039', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'tap609c277b-13', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:1a:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap609c277b-13'}, 'message_id': '2fab5a6a-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.606338035, 'message_signature': 'ac0b0d108095eb7e26bf6660e463dfded623fdbcefd662d7edacd96664fafdad'}]}, 'timestamp': '2026-01-22 00:06:22.906409', '_unique_id': '2c81aead2ae540e986788dfe95ebcd64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.907 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.908 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.908 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d015c0c-cff9-40ec-aebf-77811b2817b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-0000005c-a7650c58-4663-47b0-8499-d470f8edddbd-tap609c277b-13', 'timestamp': '2026-01-22T00:06:22.908317', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'tap609c277b-13', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:1a:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap609c277b-13'}, 'message_id': '2fabb2da-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.606338035, 'message_signature': '76df91e6316ea3f0a6b423bc74e8203c59605d2eca8b7ca8fa39515aa98775d5'}]}, 'timestamp': '2026-01-22 00:06:22.908654', '_unique_id': '7afe9eaffa914316bf7c519e025de300'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.909 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.910 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.910 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.910 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1989312991>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1989312991>]
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.910 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf70e355-20bd-4db6-a23e-65c229d4dfa4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-0000005c-a7650c58-4663-47b0-8499-d470f8edddbd-tap609c277b-13', 'timestamp': '2026-01-22T00:06:22.910679', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'tap609c277b-13', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:1a:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap609c277b-13'}, 'message_id': '2fac102c-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.606338035, 'message_signature': 'e8bdac893b0a5f0fd54bf35dc29bd93290f716796d30772ccbc4d6f26fbc789d'}]}, 'timestamp': '2026-01-22 00:06:22.911043', '_unique_id': '1207fd4597c246d3b09bb61c02c0d59d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.911 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.912 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7406fbe0-349c-4df2-b6a9-028abbcbd3f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-0000005c-a7650c58-4663-47b0-8499-d470f8edddbd-tap609c277b-13', 'timestamp': '2026-01-22T00:06:22.912754', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'tap609c277b-13', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:1a:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap609c277b-13'}, 'message_id': '2fac618a-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.606338035, 'message_signature': 'c19b4bfc9694b91b955a6d28a4ce6166240d93e17938e7b7ea5a824f4b54d946'}]}, 'timestamp': '2026-01-22 00:06:22.913120', '_unique_id': '645bab3556914428b52a4096e1efd948'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.913 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.947 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.write.requests volume: 328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.947 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9abe8c41-9a7b-4a24-9abb-bbf4042e847e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 328, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-vda', 'timestamp': '2026-01-22T00:06:22.914649', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fb1b126-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': '04e8af7edab8bbfa1c1737e82ec663ad73570dfaa725d982ab79f97f49f5240f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-sda', 'timestamp': '2026-01-22T00:06:22.914649', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fb1beaa-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': '51289722000f5fe9f747f865368944dea2c12f7273efb52e7f359bb8f7c3b2b3'}]}, 'timestamp': '2026-01-22 00:06:22.948233', '_unique_id': '090f0f022ede49b0ada4a3c1946c7991'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.949 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.write.bytes volume: 72978432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1976fbc9-e801-4b89-be54-d74fc5a731d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72978432, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-vda', 'timestamp': '2026-01-22T00:06:22.949866', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fb2096e-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': 'fa03cdd0b7befa4418951164b5ef1f1de0dd4b49090a9cfe5f6403867b645534'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-sda', 'timestamp': '2026-01-22T00:06:22.949866', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fb2144a-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': 'be41a8c2cd458d786ef7238eda40b1d0680e1a7e5cba83d711f195adb2c30d38'}]}, 'timestamp': '2026-01-22 00:06:22.950411', '_unique_id': 'f20e325821924055be4d5c0f08392039'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.950 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.951 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.951 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.951 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2dee6ff8-2266-4eec-9e48-eb94c08cc5c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-vda', 'timestamp': '2026-01-22T00:06:22.951604', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fb24cc6-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.584883503, 'message_signature': 'a1f7546268d67e0a29e71b49463fc7d344472f5f5c1cf1f38fda0490d02cefc2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-sda', 'timestamp': '2026-01-22T00:06:22.951604', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fb25810-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.584883503, 'message_signature': '310ffb706dd146bdb48b7a6c1cab08eae911dcc710c0351203d52f81e60a87ec'}]}, 'timestamp': '2026-01-22 00:06:22.952144', '_unique_id': '51ed5621ea114ce0914a446a9e03729c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.952 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.953 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.953 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.953 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1989312991>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1989312991>]
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.953 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b66c651a-6b3c-4737-9d62-fddf2055aeb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-0000005c-a7650c58-4663-47b0-8499-d470f8edddbd-tap609c277b-13', 'timestamp': '2026-01-22T00:06:22.953671', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'tap609c277b-13', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:1a:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap609c277b-13'}, 'message_id': '2fb29dca-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.606338035, 'message_signature': '6cb19b99a296be81d80418da178e98f4478cc4ee47af6647843a7abbaaf1dd86'}]}, 'timestamp': '2026-01-22 00:06:22.953956', '_unique_id': '4af75638711143c89158162707ab527a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.954 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.955 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.955 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1989312991>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1989312991>]
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.955 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.955 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.read.bytes volume: 30304768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.955 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33c56dd4-5031-48af-b0e4-44a476893edf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30304768, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-vda', 'timestamp': '2026-01-22T00:06:22.955465', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fb2e3e8-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': '591e93ed7fb000676ae72996dadf7edb98bcbef7b737a02d5d084b6aa7cb7893'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-sda', 'timestamp': '2026-01-22T00:06:22.955465', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fb2ee60-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': 'fee3f55873be8027fb529931813eb29f77bc237cf4b5fa3744795f2994d8b0e9'}]}, 'timestamp': '2026-01-22 00:06:22.956016', '_unique_id': '39744c441cb740d1a23591676966b554'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.956 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ece55586-7aa3-436c-b75a-2822161f195a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-0000005c-a7650c58-4663-47b0-8499-d470f8edddbd-tap609c277b-13', 'timestamp': '2026-01-22T00:06:22.957179', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'tap609c277b-13', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:1a:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap609c277b-13'}, 'message_id': '2fb3268c-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.606338035, 'message_signature': '4fe708e562f0d30a618cabe959069463bed00bc67eaf4771f280be731cdc5dc6'}]}, 'timestamp': '2026-01-22 00:06:22.957471', '_unique_id': '5d762eafb10b444685eb04fcf99b584f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.957 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.958 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.958 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.write.latency volume: 4014551131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.958 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2686093b-b56b-4cc5-ba19-9cc9a7574300', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4014551131, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-vda', 'timestamp': '2026-01-22T00:06:22.958565', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fb35cba-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': '28c62d1577300c426bf01f59b9fa7d9ea5d49e402a4dafb8d40b9d959c354b46'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-sda', 'timestamp': '2026-01-22T00:06:22.958565', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fb36818-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': 'dccbf5aed0731b7175f214ac003e4124c8e70944364cc2cbb6d02f43f2aff5ff'}]}, 'timestamp': '2026-01-22 00:06:22.959119', '_unique_id': 'b3295b2d159246149e08533492129979'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.960 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.960 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.read.latency volume: 227534597 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.960 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.read.latency volume: 35155999 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9663c67a-01ad-4e42-9b95-1d2b716aa4ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 227534597, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-vda', 'timestamp': '2026-01-22T00:06:22.960266', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fb39f18-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': 'f45d5c923dbed94ddc33e170c7b3cba4c385fee19dcfd14dc0995cade6e96284'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35155999, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-sda', 'timestamp': '2026-01-22T00:06:22.960266', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fb3a97c-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': '762779aeb018527711e9b1b590a5d922f8655feda3d0c41001c0124b4cb8d535'}]}, 'timestamp': '2026-01-22 00:06:22.960803', '_unique_id': 'ee060cd685e5444191412a30a1f1699d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.961 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9f5a66b-6468-40f9-96f2-5dce768b1e82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-0000005c-a7650c58-4663-47b0-8499-d470f8edddbd-tap609c277b-13', 'timestamp': '2026-01-22T00:06:22.962215', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'tap609c277b-13', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:1a:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap609c277b-13'}, 'message_id': '2fb3eb8a-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.606338035, 'message_signature': 'b87227ceee4062e64b4b516297437eed8e8ffeeb0210779f2c132563d62c4d3d'}]}, 'timestamp': '2026-01-22 00:06:22.962517', '_unique_id': '048abc5c3cc24e9cb459a6bad5a11dc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.963 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.963 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.963 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a462a33-55ba-4f4c-8467-07c3eefcc769', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-vda', 'timestamp': '2026-01-22T00:06:22.963646', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fb42316-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.584883503, 'message_signature': '388d413d9cd4375765821b78ea54f2d9fc673cc9872f29ca091a7a1da15ad656'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-sda', 'timestamp': '2026-01-22T00:06:22.963646', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fb42e74-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.584883503, 'message_signature': '6c79863807f6f7dd6d093f5303d80b19f2fb5898bf26794751fb49b971564218'}]}, 'timestamp': '2026-01-22 00:06:22.964185', '_unique_id': '83d939e29f97463393fbd8cb75ecf713'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.964 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.965 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.965 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b943451-f53f-49a2-983b-4e24d93dd94b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-0000005c-a7650c58-4663-47b0-8499-d470f8edddbd-tap609c277b-13', 'timestamp': '2026-01-22T00:06:22.965274', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'tap609c277b-13', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:1a:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap609c277b-13'}, 'message_id': '2fb462f4-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.606338035, 'message_signature': '61c4559c5823e4efd0fc06a8f8a4214460488a8837be14fe8ed602a10abbc32c'}]}, 'timestamp': '2026-01-22 00:06:22.965569', '_unique_id': '349f8421a1e741598e406b1e1745594e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.966 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.992 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/memory.usage volume: 42.609375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7448a0c-9a42-4f23-a2bb-c66125d08d12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.609375, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'timestamp': '2026-01-22T00:06:22.966722', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '2fb88370-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.699270724, 'message_signature': '942f67011e3779c221e6711a46238a8303eeeee2c6bee606fff02f8507dea445'}]}, 'timestamp': '2026-01-22 00:06:22.992632', '_unique_id': 'dd1ac79eb3ae4eb0b83984c04af45fee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.993 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.994 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.994 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.994 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ccc9fdb-d120-4bc3-926c-f49c6885048d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-vda', 'timestamp': '2026-01-22T00:06:22.994301', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fb8d01e-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': 'd712a23c72d8bd2ebb4ce66d21906e708bb54098a6a7843db0e430cdb21f419e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd-sda', 'timestamp': '2026-01-22T00:06:22.994301', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fb8d7d0-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.621879975, 'message_signature': '9f21f6c039e3807c3d8db5bbcd14dab10f0db393170b3e97e76653a632cfddc4'}]}, 'timestamp': '2026-01-22 00:06:22.994728', '_unique_id': 'd0bb7880d8bd46b2bcbefc888ade636c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.995 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/cpu volume: 11550000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71cebda6-40bc-4646-8ea7-d3dff259a3e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11550000000, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'timestamp': '2026-01-22T00:06:22.995871', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'instance-0000005c', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2fb90dfe-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.699270724, 'message_signature': '92ed2f402a0ed880e3157a02df0d4b3aba3e4e65bffc51da94621d69ce9ae704'}]}, 'timestamp': '2026-01-22 00:06:22.996123', '_unique_id': 'cc0214196080482889813f1b9ee6ccc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.996 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 DEBUG ceilometer.compute.pollsters [-] a7650c58-4663-47b0-8499-d470f8edddbd/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba0bb28c-2068-4a46-82e1-b1854c02aa44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-0000005c-a7650c58-4663-47b0-8499-d470f8edddbd-tap609c277b-13', 'timestamp': '2026-01-22T00:06:22.997170', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1989312991', 'name': 'tap609c277b-13', 'instance_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:1a:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap609c277b-13'}, 'message_id': '2fb93f9a-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 4955.606338035, 'message_signature': 'e8c1b16eb528b6587a002f2a4f7c0d0ecbbecaae6e46fde459fe234479d11588'}]}, 'timestamp': '2026-01-22 00:06:22.997429', '_unique_id': '7beb9b707b544ccf8569e03fa129d68f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:06:22.997 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:06:25 compute-1 podman[226428]: 2026-01-22 00:06:25.558130089 +0000 UTC m=+0.052021598 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:06:25 compute-1 podman[226427]: 2026-01-22 00:06:25.558249043 +0000 UTC m=+0.051911424 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 00:06:26 compute-1 nova_compute[182713]: 2026-01-22 00:06:26.191 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:27 compute-1 nova_compute[182713]: 2026-01-22 00:06:27.089 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:27 compute-1 nova_compute[182713]: 2026-01-22 00:06:27.732 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:27 compute-1 nova_compute[182713]: 2026-01-22 00:06:27.932 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.411 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "21970bbd-36b4-495d-8819-49ef2276a912" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.412 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.435 182717 DEBUG nova.compute.manager [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.557 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.557 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.565 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.566 182717 INFO nova.compute.claims [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.729 182717 DEBUG nova.compute.provider_tree [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.748 182717 DEBUG nova.scheduler.client.report [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.774 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.776 182717 DEBUG nova.compute.manager [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.849 182717 DEBUG nova.compute.manager [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.849 182717 DEBUG nova.network.neutron [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.873 182717 INFO nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:06:29 compute-1 nova_compute[182713]: 2026-01-22 00:06:29.896 182717 DEBUG nova.compute.manager [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.055 182717 DEBUG nova.compute.manager [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.057 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.057 182717 INFO nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Creating image(s)
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.058 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "/var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.058 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.059 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.079 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.118 182717 DEBUG nova.policy [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.170 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.171 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.171 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.182 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.242 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.243 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.275 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.277 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.277 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.327 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.328 182717 DEBUG nova.virt.disk.api [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Checking if we can resize image /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.328 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.381 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.382 182717 DEBUG nova.virt.disk.api [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Cannot resize image /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.382 182717 DEBUG nova.objects.instance [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'migration_context' on Instance uuid 21970bbd-36b4-495d-8819-49ef2276a912 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.398 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.399 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Ensure instance console log exists: /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.399 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.400 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.400 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:30 compute-1 nova_compute[182713]: 2026-01-22 00:06:30.780 182717 DEBUG nova.network.neutron [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Successfully created port: 06125da7-7adf-4bbe-b033-0045ab83a9f2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:06:32 compute-1 nova_compute[182713]: 2026-01-22 00:06:32.091 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:32 compute-1 nova_compute[182713]: 2026-01-22 00:06:32.250 182717 DEBUG nova.network.neutron [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Successfully updated port: 06125da7-7adf-4bbe-b033-0045ab83a9f2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:06:32 compute-1 nova_compute[182713]: 2026-01-22 00:06:32.277 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:06:32 compute-1 nova_compute[182713]: 2026-01-22 00:06:32.278 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquired lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:06:32 compute-1 nova_compute[182713]: 2026-01-22 00:06:32.279 182717 DEBUG nova.network.neutron [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:06:32 compute-1 nova_compute[182713]: 2026-01-22 00:06:32.453 182717 DEBUG nova.compute.manager [req-8bcbd8be-22a0-4f0a-9983-d90b0fc04005 req-8b893998-22cb-44a7-ab9a-40014b10056e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Received event network-changed-06125da7-7adf-4bbe-b033-0045ab83a9f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:32 compute-1 nova_compute[182713]: 2026-01-22 00:06:32.454 182717 DEBUG nova.compute.manager [req-8bcbd8be-22a0-4f0a-9983-d90b0fc04005 req-8b893998-22cb-44a7-ab9a-40014b10056e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Refreshing instance network info cache due to event network-changed-06125da7-7adf-4bbe-b033-0045ab83a9f2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:06:32 compute-1 nova_compute[182713]: 2026-01-22 00:06:32.454 182717 DEBUG oslo_concurrency.lockutils [req-8bcbd8be-22a0-4f0a-9983-d90b0fc04005 req-8b893998-22cb-44a7-ab9a-40014b10056e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:06:32 compute-1 nova_compute[182713]: 2026-01-22 00:06:32.526 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:32 compute-1 nova_compute[182713]: 2026-01-22 00:06:32.542 182717 DEBUG nova.network.neutron [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:06:32 compute-1 nova_compute[182713]: 2026-01-22 00:06:32.734 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.330 182717 DEBUG nova.network.neutron [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Updating instance_info_cache with network_info: [{"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.364 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.620 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Releasing lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.621 182717 DEBUG nova.compute.manager [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Instance network_info: |[{"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.621 182717 DEBUG oslo_concurrency.lockutils [req-8bcbd8be-22a0-4f0a-9983-d90b0fc04005 req-8b893998-22cb-44a7-ab9a-40014b10056e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.621 182717 DEBUG nova.network.neutron [req-8bcbd8be-22a0-4f0a-9983-d90b0fc04005 req-8b893998-22cb-44a7-ab9a-40014b10056e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Refreshing network info cache for port 06125da7-7adf-4bbe-b033-0045ab83a9f2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.625 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Start _get_guest_xml network_info=[{"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.629 182717 WARNING nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.634 182717 DEBUG nova.virt.libvirt.host [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.634 182717 DEBUG nova.virt.libvirt.host [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.637 182717 DEBUG nova.virt.libvirt.host [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.638 182717 DEBUG nova.virt.libvirt.host [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.639 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.639 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.639 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.640 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.640 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.640 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.640 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.641 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.641 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.641 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.641 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.641 182717 DEBUG nova.virt.hardware [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.645 182717 DEBUG nova.virt.libvirt.vif [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1249936959',display_name='tempest-ServerActionsTestOtherB-server-1249936959',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1249936959',id=96,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-xm80zy3k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:29Z,user_data=None,user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=21970bbd-36b4-495d-8819-49ef2276a912,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.646 182717 DEBUG nova.network.os_vif_util [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.647 182717 DEBUG nova.network.os_vif_util [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:99:4e,bridge_name='br-int',has_traffic_filtering=True,id=06125da7-7adf-4bbe-b033-0045ab83a9f2,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06125da7-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.648 182717 DEBUG nova.objects.instance [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'pci_devices' on Instance uuid 21970bbd-36b4-495d-8819-49ef2276a912 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.665 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:06:33 compute-1 nova_compute[182713]:   <uuid>21970bbd-36b4-495d-8819-49ef2276a912</uuid>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   <name>instance-00000060</name>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerActionsTestOtherB-server-1249936959</nova:name>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:06:33</nova:creationTime>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:06:33 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:06:33 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:06:33 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:06:33 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:06:33 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:06:33 compute-1 nova_compute[182713]:         <nova:user uuid="365f219cd09c471fa6275faa2fe5e2a1">tempest-ServerActionsTestOtherB-1685479237-project-member</nova:user>
Jan 22 00:06:33 compute-1 nova_compute[182713]:         <nova:project uuid="b26cf6f4abd54e30aac169a3cbca648c">tempest-ServerActionsTestOtherB-1685479237</nova:project>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:06:33 compute-1 nova_compute[182713]:         <nova:port uuid="06125da7-7adf-4bbe-b033-0045ab83a9f2">
Jan 22 00:06:33 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <system>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <entry name="serial">21970bbd-36b4-495d-8819-49ef2276a912</entry>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <entry name="uuid">21970bbd-36b4-495d-8819-49ef2276a912</entry>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     </system>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   <os>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   </os>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   <features>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   </features>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk.config"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:c4:99:4e"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <target dev="tap06125da7-7a"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/console.log" append="off"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <video>
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     </video>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:06:33 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:06:33 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:06:33 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:06:33 compute-1 nova_compute[182713]: </domain>
Jan 22 00:06:33 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.666 182717 DEBUG nova.compute.manager [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Preparing to wait for external event network-vif-plugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.666 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "21970bbd-36b4-495d-8819-49ef2276a912-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.666 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.667 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.667 182717 DEBUG nova.virt.libvirt.vif [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1249936959',display_name='tempest-ServerActionsTestOtherB-server-1249936959',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1249936959',id=96,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-xm80zy3k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:29Z,user_data=None,user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=21970bbd-36b4-495d-8819-49ef2276a912,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.668 182717 DEBUG nova.network.os_vif_util [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.668 182717 DEBUG nova.network.os_vif_util [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:99:4e,bridge_name='br-int',has_traffic_filtering=True,id=06125da7-7adf-4bbe-b033-0045ab83a9f2,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06125da7-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.668 182717 DEBUG os_vif [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:99:4e,bridge_name='br-int',has_traffic_filtering=True,id=06125da7-7adf-4bbe-b033-0045ab83a9f2,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06125da7-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.669 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.669 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.670 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.676 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.676 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06125da7-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.676 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06125da7-7a, col_values=(('external_ids', {'iface-id': '06125da7-7adf-4bbe-b033-0045ab83a9f2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:99:4e', 'vm-uuid': '21970bbd-36b4-495d-8819-49ef2276a912'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.678 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:33 compute-1 NetworkManager[54952]: <info>  [1769040393.6804] manager: (tap06125da7-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.681 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.684 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.685 182717 INFO os_vif [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:99:4e,bridge_name='br-int',has_traffic_filtering=True,id=06125da7-7adf-4bbe-b033-0045ab83a9f2,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06125da7-7a')
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.779 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.780 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.780 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No VIF found with MAC fa:16:3e:c4:99:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:06:33 compute-1 nova_compute[182713]: 2026-01-22 00:06:33.781 182717 INFO nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Using config drive
Jan 22 00:06:34 compute-1 nova_compute[182713]: 2026-01-22 00:06:34.274 182717 INFO nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Creating config drive at /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk.config
Jan 22 00:06:34 compute-1 nova_compute[182713]: 2026-01-22 00:06:34.279 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmk8v1kkk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:34 compute-1 nova_compute[182713]: 2026-01-22 00:06:34.410 182717 DEBUG oslo_concurrency.processutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmk8v1kkk" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:34 compute-1 kernel: tap06125da7-7a: entered promiscuous mode
Jan 22 00:06:34 compute-1 NetworkManager[54952]: <info>  [1769040394.4790] manager: (tap06125da7-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Jan 22 00:06:34 compute-1 ovn_controller[94841]: 2026-01-22T00:06:34Z|00390|binding|INFO|Claiming lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 for this chassis.
Jan 22 00:06:34 compute-1 ovn_controller[94841]: 2026-01-22T00:06:34Z|00391|binding|INFO|06125da7-7adf-4bbe-b033-0045ab83a9f2: Claiming fa:16:3e:c4:99:4e 10.100.0.13
Jan 22 00:06:34 compute-1 nova_compute[182713]: 2026-01-22 00:06:34.481 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.496 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:99:4e 10.100.0.13'], port_security=['fa:16:3e:c4:99:4e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '51b050e2-1158-4da5-a294-6c6d2a400a60', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=06125da7-7adf-4bbe-b033-0045ab83a9f2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.497 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 06125da7-7adf-4bbe-b033-0045ab83a9f2 in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb bound to our chassis
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.498 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb
Jan 22 00:06:34 compute-1 ovn_controller[94841]: 2026-01-22T00:06:34Z|00392|binding|INFO|Setting lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 ovn-installed in OVS
Jan 22 00:06:34 compute-1 ovn_controller[94841]: 2026-01-22T00:06:34Z|00393|binding|INFO|Setting lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 up in Southbound
Jan 22 00:06:34 compute-1 nova_compute[182713]: 2026-01-22 00:06:34.511 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.520 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7927001d-c6ed-4188-89e3-beead5f7ebb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:34 compute-1 systemd-udevd[226504]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:06:34 compute-1 systemd-machined[153970]: New machine qemu-44-instance-00000060.
Jan 22 00:06:34 compute-1 NetworkManager[54952]: <info>  [1769040394.5517] device (tap06125da7-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:06:34 compute-1 NetworkManager[54952]: <info>  [1769040394.5522] device (tap06125da7-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:06:34 compute-1 systemd[1]: Started Virtual Machine qemu-44-instance-00000060.
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.567 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5c8e65-9c90-48db-b89c-00aa71466a90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.572 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[951ce759-6c84-497c-aa2a-bc055dfc68f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.608 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[4c7a16b2-d015-4a10-becf-230967bdf427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.623 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[38bbdeb9-9231-4a9d-bae4-986b2ec9eb96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489660, 'reachable_time': 40262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226516, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.645 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1b362935-0d50-4ae5-95d3-276f02e49ded]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1a4bd631-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489673, 'tstamp': 489673}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226517, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1a4bd631-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489676, 'tstamp': 489676}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226517, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.647 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:34 compute-1 nova_compute[182713]: 2026-01-22 00:06:34.649 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:34 compute-1 nova_compute[182713]: 2026-01-22 00:06:34.651 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.652 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a4bd631-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.652 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.653 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a4bd631-60, col_values=(('external_ids', {'iface-id': 'c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:34 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:34.653 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:06:35 compute-1 nova_compute[182713]: 2026-01-22 00:06:35.067 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040395.066365, 21970bbd-36b4-495d-8819-49ef2276a912 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:35 compute-1 nova_compute[182713]: 2026-01-22 00:06:35.067 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] VM Started (Lifecycle Event)
Jan 22 00:06:35 compute-1 nova_compute[182713]: 2026-01-22 00:06:35.096 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:35 compute-1 nova_compute[182713]: 2026-01-22 00:06:35.101 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040395.0677018, 21970bbd-36b4-495d-8819-49ef2276a912 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:35 compute-1 nova_compute[182713]: 2026-01-22 00:06:35.102 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] VM Paused (Lifecycle Event)
Jan 22 00:06:35 compute-1 nova_compute[182713]: 2026-01-22 00:06:35.133 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:35 compute-1 nova_compute[182713]: 2026-01-22 00:06:35.138 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:06:35 compute-1 nova_compute[182713]: 2026-01-22 00:06:35.163 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:06:35 compute-1 nova_compute[182713]: 2026-01-22 00:06:35.434 182717 DEBUG nova.network.neutron [req-8bcbd8be-22a0-4f0a-9983-d90b0fc04005 req-8b893998-22cb-44a7-ab9a-40014b10056e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Updated VIF entry in instance network info cache for port 06125da7-7adf-4bbe-b033-0045ab83a9f2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:06:35 compute-1 nova_compute[182713]: 2026-01-22 00:06:35.435 182717 DEBUG nova.network.neutron [req-8bcbd8be-22a0-4f0a-9983-d90b0fc04005 req-8b893998-22cb-44a7-ab9a-40014b10056e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Updating instance_info_cache with network_info: [{"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:35 compute-1 nova_compute[182713]: 2026-01-22 00:06:35.459 182717 DEBUG oslo_concurrency.lockutils [req-8bcbd8be-22a0-4f0a-9983-d90b0fc04005 req-8b893998-22cb-44a7-ab9a-40014b10056e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:06:36 compute-1 nova_compute[182713]: 2026-01-22 00:06:36.639 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:36 compute-1 nova_compute[182713]: 2026-01-22 00:06:36.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:36 compute-1 nova_compute[182713]: 2026-01-22 00:06:36.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:36 compute-1 nova_compute[182713]: 2026-01-22 00:06:36.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:36 compute-1 nova_compute[182713]: 2026-01-22 00:06:36.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:36 compute-1 nova_compute[182713]: 2026-01-22 00:06:36.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:36 compute-1 nova_compute[182713]: 2026-01-22 00:06:36.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:06:37 compute-1 nova_compute[182713]: 2026-01-22 00:06:37.094 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:37 compute-1 podman[226526]: 2026-01-22 00:06:37.624159453 +0000 UTC m=+0.099788162 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:06:38 compute-1 nova_compute[182713]: 2026-01-22 00:06:38.680 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:39 compute-1 nova_compute[182713]: 2026-01-22 00:06:39.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:39 compute-1 nova_compute[182713]: 2026-01-22 00:06:39.933 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:39 compute-1 nova_compute[182713]: 2026-01-22 00:06:39.934 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:39 compute-1 nova_compute[182713]: 2026-01-22 00:06:39.935 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:39 compute-1 nova_compute[182713]: 2026-01-22 00:06:39.935 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.065 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.160 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.161 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.254 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.260 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.337 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.338 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.391 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.569 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.571 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5513MB free_disk=73.27413558959961GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.571 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.571 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:40 compute-1 podman[226559]: 2026-01-22 00:06:40.577627048 +0000 UTC m=+0.070753246 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.662 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance a7650c58-4663-47b0-8499-d470f8edddbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.663 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 21970bbd-36b4-495d-8819-49ef2276a912 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.663 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.664 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.765 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.799 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.843 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:06:40 compute-1 nova_compute[182713]: 2026-01-22 00:06:40.844 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:42 compute-1 nova_compute[182713]: 2026-01-22 00:06:42.096 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:42 compute-1 nova_compute[182713]: 2026-01-22 00:06:42.845 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:42 compute-1 nova_compute[182713]: 2026-01-22 00:06:42.846 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:06:42 compute-1 nova_compute[182713]: 2026-01-22 00:06:42.846 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:06:42 compute-1 nova_compute[182713]: 2026-01-22 00:06:42.868 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 00:06:43 compute-1 nova_compute[182713]: 2026-01-22 00:06:43.012 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:06:43 compute-1 nova_compute[182713]: 2026-01-22 00:06:43.012 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:06:43 compute-1 nova_compute[182713]: 2026-01-22 00:06:43.013 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:06:43 compute-1 nova_compute[182713]: 2026-01-22 00:06:43.013 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:43 compute-1 nova_compute[182713]: 2026-01-22 00:06:43.683 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.809 182717 DEBUG nova.compute.manager [req-377a9ddd-8f94-4006-9945-49dfd78ec77e req-3d2acfdb-3ae3-41f2-ab87-d6d007c29a9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Received event network-vif-plugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.810 182717 DEBUG oslo_concurrency.lockutils [req-377a9ddd-8f94-4006-9945-49dfd78ec77e req-3d2acfdb-3ae3-41f2-ab87-d6d007c29a9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "21970bbd-36b4-495d-8819-49ef2276a912-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.810 182717 DEBUG oslo_concurrency.lockutils [req-377a9ddd-8f94-4006-9945-49dfd78ec77e req-3d2acfdb-3ae3-41f2-ab87-d6d007c29a9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.810 182717 DEBUG oslo_concurrency.lockutils [req-377a9ddd-8f94-4006-9945-49dfd78ec77e req-3d2acfdb-3ae3-41f2-ab87-d6d007c29a9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.811 182717 DEBUG nova.compute.manager [req-377a9ddd-8f94-4006-9945-49dfd78ec77e req-3d2acfdb-3ae3-41f2-ab87-d6d007c29a9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Processing event network-vif-plugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.811 182717 DEBUG nova.compute.manager [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Instance event wait completed in 9 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.816 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040404.81611, 21970bbd-36b4-495d-8819-49ef2276a912 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.817 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] VM Resumed (Lifecycle Event)
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.819 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.823 182717 INFO nova.virt.libvirt.driver [-] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Instance spawned successfully.
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.823 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.847 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.852 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.855 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.855 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.856 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.856 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.856 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.857 182717 DEBUG nova.virt.libvirt.driver [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.892 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.965 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.970 182717 INFO nova.compute.manager [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Took 14.91 seconds to spawn the instance on the hypervisor.
Jan 22 00:06:44 compute-1 nova_compute[182713]: 2026-01-22 00:06:44.971 182717 DEBUG nova.compute.manager [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:45 compute-1 nova_compute[182713]: 2026-01-22 00:06:45.009 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:06:45 compute-1 nova_compute[182713]: 2026-01-22 00:06:45.009 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:06:45 compute-1 nova_compute[182713]: 2026-01-22 00:06:45.010 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:45 compute-1 nova_compute[182713]: 2026-01-22 00:06:45.098 182717 INFO nova.compute.manager [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Took 15.58 seconds to build instance.
Jan 22 00:06:45 compute-1 nova_compute[182713]: 2026-01-22 00:06:45.140 182717 DEBUG oslo_concurrency.lockutils [None req-e4b2fd5a-21a5-4120-9e01-7775deca166d 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:46 compute-1 nova_compute[182713]: 2026-01-22 00:06:46.316 182717 INFO nova.compute.manager [None req-e040e3f0-fd69-4c3e-87c8-ee7ada5d89b7 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Get console output
Jan 22 00:06:46 compute-1 nova_compute[182713]: 2026-01-22 00:06:46.883 182717 DEBUG nova.compute.manager [req-39899f2d-c8d2-4f0d-860c-f24357e16996 req-dfd85e35-bcaf-4e00-9c76-2d7d74aef828 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Received event network-vif-plugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:46 compute-1 nova_compute[182713]: 2026-01-22 00:06:46.884 182717 DEBUG oslo_concurrency.lockutils [req-39899f2d-c8d2-4f0d-860c-f24357e16996 req-dfd85e35-bcaf-4e00-9c76-2d7d74aef828 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "21970bbd-36b4-495d-8819-49ef2276a912-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:46 compute-1 nova_compute[182713]: 2026-01-22 00:06:46.884 182717 DEBUG oslo_concurrency.lockutils [req-39899f2d-c8d2-4f0d-860c-f24357e16996 req-dfd85e35-bcaf-4e00-9c76-2d7d74aef828 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:46 compute-1 nova_compute[182713]: 2026-01-22 00:06:46.885 182717 DEBUG oslo_concurrency.lockutils [req-39899f2d-c8d2-4f0d-860c-f24357e16996 req-dfd85e35-bcaf-4e00-9c76-2d7d74aef828 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:46 compute-1 nova_compute[182713]: 2026-01-22 00:06:46.885 182717 DEBUG nova.compute.manager [req-39899f2d-c8d2-4f0d-860c-f24357e16996 req-dfd85e35-bcaf-4e00-9c76-2d7d74aef828 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] No waiting events found dispatching network-vif-plugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:46 compute-1 nova_compute[182713]: 2026-01-22 00:06:46.885 182717 WARNING nova.compute.manager [req-39899f2d-c8d2-4f0d-860c-f24357e16996 req-dfd85e35-bcaf-4e00-9c76-2d7d74aef828 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Received unexpected event network-vif-plugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 for instance with vm_state active and task_state None.
Jan 22 00:06:47 compute-1 nova_compute[182713]: 2026-01-22 00:06:47.100 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:48 compute-1 nova_compute[182713]: 2026-01-22 00:06:48.686 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:48 compute-1 nova_compute[182713]: 2026-01-22 00:06:48.772 182717 DEBUG nova.compute.manager [None req-97248d84-efa1-44e4-b7c2-b7a52997fb64 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196
Jan 22 00:06:50 compute-1 ovn_controller[94841]: 2026-01-22T00:06:50Z|00394|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 22 00:06:51 compute-1 nova_compute[182713]: 2026-01-22 00:06:51.011 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:51 compute-1 nova_compute[182713]: 2026-01-22 00:06:51.223 182717 DEBUG oslo_concurrency.lockutils [None req-46f365e9-f52f-4f7b-aa5e-c25b7bcf36b4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:51 compute-1 nova_compute[182713]: 2026-01-22 00:06:51.224 182717 DEBUG oslo_concurrency.lockutils [None req-46f365e9-f52f-4f7b-aa5e-c25b7bcf36b4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:51 compute-1 nova_compute[182713]: 2026-01-22 00:06:51.224 182717 DEBUG nova.compute.manager [None req-46f365e9-f52f-4f7b-aa5e-c25b7bcf36b4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:51 compute-1 nova_compute[182713]: 2026-01-22 00:06:51.229 182717 DEBUG nova.compute.manager [None req-46f365e9-f52f-4f7b-aa5e-c25b7bcf36b4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 22 00:06:51 compute-1 nova_compute[182713]: 2026-01-22 00:06:51.233 182717 DEBUG nova.objects.instance [None req-46f365e9-f52f-4f7b-aa5e-c25b7bcf36b4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'flavor' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:51 compute-1 nova_compute[182713]: 2026-01-22 00:06:51.277 182717 DEBUG nova.objects.instance [None req-46f365e9-f52f-4f7b-aa5e-c25b7bcf36b4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'info_cache' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:51 compute-1 nova_compute[182713]: 2026-01-22 00:06:51.324 182717 DEBUG nova.virt.libvirt.driver [None req-46f365e9-f52f-4f7b-aa5e-c25b7bcf36b4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:06:51 compute-1 podman[226584]: 2026-01-22 00:06:51.615195106 +0000 UTC m=+0.090730522 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:06:51 compute-1 podman[226583]: 2026-01-22 00:06:51.690313845 +0000 UTC m=+0.170720332 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:06:52 compute-1 nova_compute[182713]: 2026-01-22 00:06:52.103 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:53 compute-1 kernel: tap609c277b-13 (unregistering): left promiscuous mode
Jan 22 00:06:53 compute-1 NetworkManager[54952]: <info>  [1769040413.5662] device (tap609c277b-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.572 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:53 compute-1 ovn_controller[94841]: 2026-01-22T00:06:53Z|00395|binding|INFO|Releasing lport 609c277b-133c-4824-9fd7-17b756932543 from this chassis (sb_readonly=0)
Jan 22 00:06:53 compute-1 ovn_controller[94841]: 2026-01-22T00:06:53Z|00396|binding|INFO|Setting lport 609c277b-133c-4824-9fd7-17b756932543 down in Southbound
Jan 22 00:06:53 compute-1 ovn_controller[94841]: 2026-01-22T00:06:53Z|00397|binding|INFO|Removing iface tap609c277b-13 ovn-installed in OVS
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.574 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.585 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:1a:fc 10.100.0.7'], port_security=['fa:16:3e:4e:1a:fc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '80fb8d02-77b3-43f5-9cd3-4114236093b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=609c277b-133c-4824-9fd7-17b756932543) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.586 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.588 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 609c277b-133c-4824-9fd7-17b756932543 in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb unbound from our chassis
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.591 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb
Jan 22 00:06:53 compute-1 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Jan 22 00:06:53 compute-1 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000005c.scope: Consumed 16.695s CPU time.
Jan 22 00:06:53 compute-1 systemd-machined[153970]: Machine qemu-43-instance-0000005c terminated.
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.626 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6c5cb9-4c1d-48e0-b1d4-6c6fd9655699]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.663 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[17cc4455-c784-41c6-9f7a-fc0887c47b6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.668 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[2ac366a1-fede-4d9d-9f6d-b9caca3557c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.688 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.703 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[68b3b7ea-262a-4af9-9020-e32ac7cfb725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.723 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[68418f99-2cbe-4155-8f7c-b4f5cb817174]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489660, 'reachable_time': 40262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226642, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.744 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a3183e3a-9550-4264-997a-f61f292405a8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1a4bd631-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489673, 'tstamp': 489673}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226643, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1a4bd631-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489676, 'tstamp': 489676}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226643, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.746 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.748 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.752 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.753 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a4bd631-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.753 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.754 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a4bd631-60, col_values=(('external_ids', {'iface-id': 'c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:06:53.755 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.802 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.809 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.962 182717 DEBUG nova.compute.manager [req-64e0cee4-6d52-4ab5-8e70-db03d5f50469 req-8c1acba1-3fc4-45e6-bf0e-8cc9e314ec78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-vif-unplugged-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.963 182717 DEBUG oslo_concurrency.lockutils [req-64e0cee4-6d52-4ab5-8e70-db03d5f50469 req-8c1acba1-3fc4-45e6-bf0e-8cc9e314ec78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.963 182717 DEBUG oslo_concurrency.lockutils [req-64e0cee4-6d52-4ab5-8e70-db03d5f50469 req-8c1acba1-3fc4-45e6-bf0e-8cc9e314ec78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.963 182717 DEBUG oslo_concurrency.lockutils [req-64e0cee4-6d52-4ab5-8e70-db03d5f50469 req-8c1acba1-3fc4-45e6-bf0e-8cc9e314ec78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.963 182717 DEBUG nova.compute.manager [req-64e0cee4-6d52-4ab5-8e70-db03d5f50469 req-8c1acba1-3fc4-45e6-bf0e-8cc9e314ec78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] No waiting events found dispatching network-vif-unplugged-609c277b-133c-4824-9fd7-17b756932543 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:53 compute-1 nova_compute[182713]: 2026-01-22 00:06:53.964 182717 WARNING nova.compute.manager [req-64e0cee4-6d52-4ab5-8e70-db03d5f50469 req-8c1acba1-3fc4-45e6-bf0e-8cc9e314ec78 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received unexpected event network-vif-unplugged-609c277b-133c-4824-9fd7-17b756932543 for instance with vm_state active and task_state powering-off.
Jan 22 00:06:54 compute-1 nova_compute[182713]: 2026-01-22 00:06:54.350 182717 INFO nova.virt.libvirt.driver [None req-46f365e9-f52f-4f7b-aa5e-c25b7bcf36b4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance shutdown successfully after 3 seconds.
Jan 22 00:06:54 compute-1 nova_compute[182713]: 2026-01-22 00:06:54.357 182717 INFO nova.virt.libvirt.driver [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance destroyed successfully.
Jan 22 00:06:54 compute-1 nova_compute[182713]: 2026-01-22 00:06:54.358 182717 DEBUG nova.objects.instance [None req-46f365e9-f52f-4f7b-aa5e-c25b7bcf36b4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'numa_topology' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:54 compute-1 nova_compute[182713]: 2026-01-22 00:06:54.373 182717 DEBUG nova.compute.manager [None req-46f365e9-f52f-4f7b-aa5e-c25b7bcf36b4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:54 compute-1 nova_compute[182713]: 2026-01-22 00:06:54.464 182717 DEBUG oslo_concurrency.lockutils [None req-46f365e9-f52f-4f7b-aa5e-c25b7bcf36b4 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:56 compute-1 nova_compute[182713]: 2026-01-22 00:06:56.069 182717 DEBUG nova.compute.manager [req-f0a4e79c-8825-4b22-b6ec-184247bda7b1 req-a10d6f2a-fa49-4f58-90fd-7c1c1bccaad7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:56 compute-1 nova_compute[182713]: 2026-01-22 00:06:56.071 182717 DEBUG oslo_concurrency.lockutils [req-f0a4e79c-8825-4b22-b6ec-184247bda7b1 req-a10d6f2a-fa49-4f58-90fd-7c1c1bccaad7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:56 compute-1 nova_compute[182713]: 2026-01-22 00:06:56.072 182717 DEBUG oslo_concurrency.lockutils [req-f0a4e79c-8825-4b22-b6ec-184247bda7b1 req-a10d6f2a-fa49-4f58-90fd-7c1c1bccaad7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:56 compute-1 nova_compute[182713]: 2026-01-22 00:06:56.073 182717 DEBUG oslo_concurrency.lockutils [req-f0a4e79c-8825-4b22-b6ec-184247bda7b1 req-a10d6f2a-fa49-4f58-90fd-7c1c1bccaad7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:56 compute-1 nova_compute[182713]: 2026-01-22 00:06:56.074 182717 DEBUG nova.compute.manager [req-f0a4e79c-8825-4b22-b6ec-184247bda7b1 req-a10d6f2a-fa49-4f58-90fd-7c1c1bccaad7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] No waiting events found dispatching network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:56 compute-1 nova_compute[182713]: 2026-01-22 00:06:56.075 182717 WARNING nova.compute.manager [req-f0a4e79c-8825-4b22-b6ec-184247bda7b1 req-a10d6f2a-fa49-4f58-90fd-7c1c1bccaad7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received unexpected event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 for instance with vm_state stopped and task_state None.
Jan 22 00:06:56 compute-1 podman[226677]: 2026-01-22 00:06:56.609428684 +0000 UTC m=+0.085846112 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:06:56 compute-1 podman[226678]: 2026-01-22 00:06:56.619161075 +0000 UTC m=+0.105396676 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:06:57 compute-1 nova_compute[182713]: 2026-01-22 00:06:57.104 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:57 compute-1 ovn_controller[94841]: 2026-01-22T00:06:57Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c4:99:4e 10.100.0.13
Jan 22 00:06:57 compute-1 ovn_controller[94841]: 2026-01-22T00:06:57Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c4:99:4e 10.100.0.13
Jan 22 00:06:58 compute-1 nova_compute[182713]: 2026-01-22 00:06:58.690 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:59 compute-1 nova_compute[182713]: 2026-01-22 00:06:59.004 182717 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:06:59 compute-1 nova_compute[182713]: 2026-01-22 00:06:59.005 182717 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquired lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:06:59 compute-1 nova_compute[182713]: 2026-01-22 00:06:59.005 182717 DEBUG nova.network.neutron [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:07:00 compute-1 nova_compute[182713]: 2026-01-22 00:07:00.521 182717 DEBUG nova.network.neutron [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:00 compute-1 nova_compute[182713]: 2026-01-22 00:07:00.547 182717 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Releasing lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:00 compute-1 nova_compute[182713]: 2026-01-22 00:07:00.757 182717 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 00:07:00 compute-1 nova_compute[182713]: 2026-01-22 00:07:00.758 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Creating file /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/0c58021a13c6422a9a9ffd697593a22f.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 22 00:07:00 compute-1 nova_compute[182713]: 2026-01-22 00:07:00.759 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/0c58021a13c6422a9a9ffd697593a22f.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.282 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/0c58021a13c6422a9a9ffd697593a22f.tmp" returned: 1 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.283 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/0c58021a13c6422a9a9ffd697593a22f.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.284 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Creating directory /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.284 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.525 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd" returned: 0 in 0.240s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.530 182717 INFO nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance already shutdown.
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.538 182717 INFO nova.virt.libvirt.driver [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance destroyed successfully.
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.541 182717 DEBUG nova.virt.libvirt.vif [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:05:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1989312991',display_name='tempest-ServerActionsTestOtherB-server-1989312991',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1989312991',id=92,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:05:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-sfv20ppo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:06:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a7650c58-4663-47b0-8499-d470f8edddbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1779791452-network", "vif_mac": "fa:16:3e:4e:1a:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.542 182717 DEBUG nova.network.os_vif_util [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1779791452-network", "vif_mac": "fa:16:3e:4e:1a:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.544 182717 DEBUG nova.network.os_vif_util [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.545 182717 DEBUG os_vif [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.550 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.551 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap609c277b-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.553 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.557 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.565 182717 INFO os_vif [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13')
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.572 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.671 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.674 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.749 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.753 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Copying file /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd_resize/disk to 192.168.122.102:/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:07:01 compute-1 nova_compute[182713]: 2026-01-22 00:07:01.753 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd_resize/disk 192.168.122.102:/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:02 compute-1 nova_compute[182713]: 2026-01-22 00:07:02.107 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:02 compute-1 nova_compute[182713]: 2026-01-22 00:07:02.450 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "scp -r /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd_resize/disk 192.168.122.102:/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk" returned: 0 in 0.696s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:02 compute-1 nova_compute[182713]: 2026-01-22 00:07:02.451 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Copying file /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:07:02 compute-1 nova_compute[182713]: 2026-01-22 00:07:02.451 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd_resize/disk.config 192.168.122.102:/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:02 compute-1 nova_compute[182713]: 2026-01-22 00:07:02.745 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "scp -C -r /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd_resize/disk.config 192.168.122.102:/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.config" returned: 0 in 0.293s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:02 compute-1 nova_compute[182713]: 2026-01-22 00:07:02.746 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Copying file /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:07:02 compute-1 nova_compute[182713]: 2026-01-22 00:07:02.747 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd_resize/disk.info 192.168.122.102:/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:03.014 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:03.015 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:03.015 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:03 compute-1 nova_compute[182713]: 2026-01-22 00:07:03.020 182717 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "scp -C -r /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd_resize/disk.info 192.168.122.102:/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.info" returned: 0 in 0.273s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:03 compute-1 nova_compute[182713]: 2026-01-22 00:07:03.556 182717 DEBUG neutronclient.v2_0.client [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 609c277b-133c-4824-9fd7-17b756932543 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 00:07:03 compute-1 nova_compute[182713]: 2026-01-22 00:07:03.718 182717 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:03 compute-1 nova_compute[182713]: 2026-01-22 00:07:03.719 182717 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:03 compute-1 nova_compute[182713]: 2026-01-22 00:07:03.719 182717 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:06 compute-1 nova_compute[182713]: 2026-01-22 00:07:06.556 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:07 compute-1 nova_compute[182713]: 2026-01-22 00:07:07.109 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:07 compute-1 nova_compute[182713]: 2026-01-22 00:07:07.639 182717 DEBUG nova.compute.manager [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-changed-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:07 compute-1 nova_compute[182713]: 2026-01-22 00:07:07.640 182717 DEBUG nova.compute.manager [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Refreshing instance network info cache due to event network-changed-609c277b-133c-4824-9fd7-17b756932543. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:07:07 compute-1 nova_compute[182713]: 2026-01-22 00:07:07.641 182717 DEBUG oslo_concurrency.lockutils [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:07 compute-1 nova_compute[182713]: 2026-01-22 00:07:07.642 182717 DEBUG oslo_concurrency.lockutils [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:07 compute-1 nova_compute[182713]: 2026-01-22 00:07:07.642 182717 DEBUG nova.network.neutron [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Refreshing network info cache for port 609c277b-133c-4824-9fd7-17b756932543 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:07:08 compute-1 podman[226731]: 2026-01-22 00:07:08.626559399 +0000 UTC m=+0.100790313 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 00:07:08 compute-1 nova_compute[182713]: 2026-01-22 00:07:08.856 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040413.8550773, a7650c58-4663-47b0-8499-d470f8edddbd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:08 compute-1 nova_compute[182713]: 2026-01-22 00:07:08.857 182717 INFO nova.compute.manager [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] VM Stopped (Lifecycle Event)
Jan 22 00:07:08 compute-1 nova_compute[182713]: 2026-01-22 00:07:08.922 182717 DEBUG nova.compute.manager [None req-b9f18e16-e453-4b19-8ad6-b482b8f875ba - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:08 compute-1 nova_compute[182713]: 2026-01-22 00:07:08.928 182717 DEBUG nova.compute.manager [None req-b9f18e16-e453-4b19-8ad6-b482b8f875ba - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: resize_migrated, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:07:08 compute-1 nova_compute[182713]: 2026-01-22 00:07:08.952 182717 INFO nova.compute.manager [None req-b9f18e16-e453-4b19-8ad6-b482b8f875ba - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com
Jan 22 00:07:10 compute-1 nova_compute[182713]: 2026-01-22 00:07:10.014 182717 DEBUG nova.network.neutron [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updated VIF entry in instance network info cache for port 609c277b-133c-4824-9fd7-17b756932543. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:07:10 compute-1 nova_compute[182713]: 2026-01-22 00:07:10.015 182717 DEBUG nova.network.neutron [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:10 compute-1 nova_compute[182713]: 2026-01-22 00:07:10.041 182717 DEBUG oslo_concurrency.lockutils [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:11.091 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:07:11 compute-1 nova_compute[182713]: 2026-01-22 00:07:11.092 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:11.093 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:07:11 compute-1 nova_compute[182713]: 2026-01-22 00:07:11.587 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:11 compute-1 podman[226751]: 2026-01-22 00:07:11.634215926 +0000 UTC m=+0.121627416 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, vcs-type=git, distribution-scope=public)
Jan 22 00:07:12 compute-1 nova_compute[182713]: 2026-01-22 00:07:12.112 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:12 compute-1 nova_compute[182713]: 2026-01-22 00:07:12.895 182717 DEBUG oslo_concurrency.lockutils [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:12 compute-1 nova_compute[182713]: 2026-01-22 00:07:12.896 182717 DEBUG oslo_concurrency.lockutils [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:12 compute-1 nova_compute[182713]: 2026-01-22 00:07:12.896 182717 DEBUG nova.compute.manager [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Going to confirm migration 15 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 22 00:07:12 compute-1 nova_compute[182713]: 2026-01-22 00:07:12.948 182717 DEBUG nova.objects.instance [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'info_cache' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:13 compute-1 nova_compute[182713]: 2026-01-22 00:07:13.782 182717 DEBUG neutronclient.v2_0.client [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 609c277b-133c-4824-9fd7-17b756932543 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 00:07:13 compute-1 nova_compute[182713]: 2026-01-22 00:07:13.783 182717 DEBUG oslo_concurrency.lockutils [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:13 compute-1 nova_compute[182713]: 2026-01-22 00:07:13.784 182717 DEBUG oslo_concurrency.lockutils [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquired lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:13 compute-1 nova_compute[182713]: 2026-01-22 00:07:13.784 182717 DEBUG nova.network.neutron [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:07:14 compute-1 nova_compute[182713]: 2026-01-22 00:07:14.204 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.889 182717 DEBUG nova.network.neutron [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.909 182717 DEBUG oslo_concurrency.lockutils [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Releasing lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.909 182717 DEBUG nova.objects.instance [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'migration_context' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.949 182717 DEBUG nova.virt.libvirt.vif [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:05:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1989312991',display_name='tempest-ServerActionsTestOtherB-server-1989312991',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1989312991',id=92,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-sfv20ppo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:07:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a7650c58-4663-47b0-8499-d470f8edddbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.950 182717 DEBUG nova.network.os_vif_util [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.951 182717 DEBUG nova.network.os_vif_util [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.951 182717 DEBUG os_vif [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.953 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.954 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap609c277b-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.954 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.958 182717 INFO os_vif [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13')
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.959 182717 DEBUG oslo_concurrency.lockutils [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:15 compute-1 nova_compute[182713]: 2026-01-22 00:07:15.959 182717 DEBUG oslo_concurrency.lockutils [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:16 compute-1 nova_compute[182713]: 2026-01-22 00:07:16.096 182717 DEBUG nova.compute.provider_tree [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:07:16 compute-1 nova_compute[182713]: 2026-01-22 00:07:16.117 182717 DEBUG nova.scheduler.client.report [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:07:16 compute-1 nova_compute[182713]: 2026-01-22 00:07:16.225 182717 DEBUG oslo_concurrency.lockutils [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:16 compute-1 nova_compute[182713]: 2026-01-22 00:07:16.225 182717 DEBUG nova.compute.manager [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Resized/migrated instance is powered off. Setting vm_state to 'stopped'. _confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4805
Jan 22 00:07:16 compute-1 nova_compute[182713]: 2026-01-22 00:07:16.392 182717 INFO nova.scheduler.client.report [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Deleted allocation for migration f9552d95-1fd3-4e1c-9c7f-b072e712c5b6
Jan 22 00:07:16 compute-1 nova_compute[182713]: 2026-01-22 00:07:16.521 182717 DEBUG oslo_concurrency.lockutils [None req-aa1f5c32-a771-4a70-9479-477d654244b6 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:16 compute-1 nova_compute[182713]: 2026-01-22 00:07:16.590 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:17 compute-1 nova_compute[182713]: 2026-01-22 00:07:17.116 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:18.096 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:20 compute-1 nova_compute[182713]: 2026-01-22 00:07:20.205 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:21 compute-1 nova_compute[182713]: 2026-01-22 00:07:21.593 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:22 compute-1 nova_compute[182713]: 2026-01-22 00:07:22.118 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:22 compute-1 podman[226774]: 2026-01-22 00:07:22.606713684 +0000 UTC m=+0.081406642 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:07:22 compute-1 podman[226773]: 2026-01-22 00:07:22.63491837 +0000 UTC m=+0.124581202 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 00:07:25 compute-1 nova_compute[182713]: 2026-01-22 00:07:25.802 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquiring lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:25 compute-1 nova_compute[182713]: 2026-01-22 00:07:25.803 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.033 182717 DEBUG nova.compute.manager [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.219 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.220 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.228 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.228 182717 INFO nova.compute.claims [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.405 182717 DEBUG nova.compute.provider_tree [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.433 182717 DEBUG nova.scheduler.client.report [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.462 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.492 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquiring lock "658adc61-20a9-49ab-b69e-0f19ae1384d2" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.493 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "658adc61-20a9-49ab-b69e-0f19ae1384d2" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.504 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "658adc61-20a9-49ab-b69e-0f19ae1384d2" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.505 182717 DEBUG nova.compute.manager [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.595 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.636 182717 DEBUG nova.compute.manager [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.636 182717 DEBUG nova.network.neutron [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.672 182717 INFO nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.735 182717 DEBUG nova.compute.manager [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.879 182717 DEBUG nova.compute.manager [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.880 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.881 182717 INFO nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Creating image(s)
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.881 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquiring lock "/var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.882 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "/var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.882 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "/var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.895 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.896 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.897 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.965 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.966 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.966 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:26 compute-1 nova_compute[182713]: 2026-01-22 00:07:26.979 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.038 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.039 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.072 182717 DEBUG nova.policy [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '03ac6c0f8ea448daaee61484ec6b3408', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '40c52c1a71294070a5448bbfd80c0e64', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.091 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.092 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.093 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.120 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.150 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.151 182717 DEBUG nova.virt.disk.api [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Checking if we can resize image /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.152 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.210 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.211 182717 DEBUG nova.virt.disk.api [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Cannot resize image /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.212 182717 DEBUG nova.objects.instance [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lazy-loading 'migration_context' on Instance uuid 3c1aab4c-913f-4be7-a27b-a763fb7cec97 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.229 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.230 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Ensure instance console log exists: /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.230 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.231 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.232 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:27 compute-1 podman[226842]: 2026-01-22 00:07:27.572470532 +0000 UTC m=+0.058407893 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:07:27 compute-1 podman[226843]: 2026-01-22 00:07:27.576263118 +0000 UTC m=+0.054874736 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:07:27 compute-1 nova_compute[182713]: 2026-01-22 00:07:27.657 182717 DEBUG nova.network.neutron [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Successfully created port: 38eb4aad-1fbd-420b-803e-69bb30d226c4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:07:28 compute-1 nova_compute[182713]: 2026-01-22 00:07:28.671 182717 DEBUG nova.network.neutron [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Successfully updated port: 38eb4aad-1fbd-420b-803e-69bb30d226c4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:07:28 compute-1 nova_compute[182713]: 2026-01-22 00:07:28.697 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquiring lock "refresh_cache-3c1aab4c-913f-4be7-a27b-a763fb7cec97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:28 compute-1 nova_compute[182713]: 2026-01-22 00:07:28.697 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquired lock "refresh_cache-3c1aab4c-913f-4be7-a27b-a763fb7cec97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:28 compute-1 nova_compute[182713]: 2026-01-22 00:07:28.698 182717 DEBUG nova.network.neutron [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:07:28 compute-1 nova_compute[182713]: 2026-01-22 00:07:28.741 182717 DEBUG nova.compute.manager [req-3b7b10a7-4c50-4f7b-a87b-6cfa5883665e req-fd0687a8-de4c-42fd-975c-b1c0cc184977 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Received event network-changed-38eb4aad-1fbd-420b-803e-69bb30d226c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:28 compute-1 nova_compute[182713]: 2026-01-22 00:07:28.742 182717 DEBUG nova.compute.manager [req-3b7b10a7-4c50-4f7b-a87b-6cfa5883665e req-fd0687a8-de4c-42fd-975c-b1c0cc184977 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Refreshing instance network info cache due to event network-changed-38eb4aad-1fbd-420b-803e-69bb30d226c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:07:28 compute-1 nova_compute[182713]: 2026-01-22 00:07:28.742 182717 DEBUG oslo_concurrency.lockutils [req-3b7b10a7-4c50-4f7b-a87b-6cfa5883665e req-fd0687a8-de4c-42fd-975c-b1c0cc184977 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3c1aab4c-913f-4be7-a27b-a763fb7cec97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:28 compute-1 nova_compute[182713]: 2026-01-22 00:07:28.854 182717 DEBUG nova.network.neutron [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.556 182717 DEBUG nova.network.neutron [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Updating instance_info_cache with network_info: [{"id": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "address": "fa:16:3e:25:fb:14", "network": {"id": "4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-989922694-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40c52c1a71294070a5448bbfd80c0e64", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38eb4aad-1f", "ovs_interfaceid": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.681 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Releasing lock "refresh_cache-3c1aab4c-913f-4be7-a27b-a763fb7cec97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.682 182717 DEBUG nova.compute.manager [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Instance network_info: |[{"id": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "address": "fa:16:3e:25:fb:14", "network": {"id": "4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-989922694-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40c52c1a71294070a5448bbfd80c0e64", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38eb4aad-1f", "ovs_interfaceid": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.682 182717 DEBUG oslo_concurrency.lockutils [req-3b7b10a7-4c50-4f7b-a87b-6cfa5883665e req-fd0687a8-de4c-42fd-975c-b1c0cc184977 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3c1aab4c-913f-4be7-a27b-a763fb7cec97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.683 182717 DEBUG nova.network.neutron [req-3b7b10a7-4c50-4f7b-a87b-6cfa5883665e req-fd0687a8-de4c-42fd-975c-b1c0cc184977 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Refreshing network info cache for port 38eb4aad-1fbd-420b-803e-69bb30d226c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.686 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Start _get_guest_xml network_info=[{"id": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "address": "fa:16:3e:25:fb:14", "network": {"id": "4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-989922694-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40c52c1a71294070a5448bbfd80c0e64", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38eb4aad-1f", "ovs_interfaceid": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.692 182717 WARNING nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.699 182717 DEBUG nova.virt.libvirt.host [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.700 182717 DEBUG nova.virt.libvirt.host [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.705 182717 DEBUG nova.virt.libvirt.host [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.706 182717 DEBUG nova.virt.libvirt.host [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.707 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.708 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.709 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.710 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.710 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.710 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.711 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.711 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.712 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.712 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.712 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.713 182717 DEBUG nova.virt.hardware [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.720 182717 DEBUG nova.virt.libvirt.vif [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:07:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-2118778170',display_name='tempest-ServerGroupTestJSON-server-2118778170',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-2118778170',id=103,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='40c52c1a71294070a5448bbfd80c0e64',ramdisk_id='',reservation_id='r-8j03ubbl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-249766086',owner_user_name='tempest-ServerGroupTestJSON-249766086-
project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:07:26Z,user_data=None,user_id='03ac6c0f8ea448daaee61484ec6b3408',uuid=3c1aab4c-913f-4be7-a27b-a763fb7cec97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "address": "fa:16:3e:25:fb:14", "network": {"id": "4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-989922694-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40c52c1a71294070a5448bbfd80c0e64", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38eb4aad-1f", "ovs_interfaceid": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.721 182717 DEBUG nova.network.os_vif_util [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Converting VIF {"id": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "address": "fa:16:3e:25:fb:14", "network": {"id": "4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-989922694-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40c52c1a71294070a5448bbfd80c0e64", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38eb4aad-1f", "ovs_interfaceid": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.723 182717 DEBUG nova.network.os_vif_util [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:fb:14,bridge_name='br-int',has_traffic_filtering=True,id=38eb4aad-1fbd-420b-803e-69bb30d226c4,network=Network(4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38eb4aad-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.725 182717 DEBUG nova.objects.instance [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3c1aab4c-913f-4be7-a27b-a763fb7cec97 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.750 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:07:29 compute-1 nova_compute[182713]:   <uuid>3c1aab4c-913f-4be7-a27b-a763fb7cec97</uuid>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   <name>instance-00000067</name>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerGroupTestJSON-server-2118778170</nova:name>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:07:29</nova:creationTime>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:07:29 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:07:29 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:07:29 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:07:29 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:07:29 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:07:29 compute-1 nova_compute[182713]:         <nova:user uuid="03ac6c0f8ea448daaee61484ec6b3408">tempest-ServerGroupTestJSON-249766086-project-member</nova:user>
Jan 22 00:07:29 compute-1 nova_compute[182713]:         <nova:project uuid="40c52c1a71294070a5448bbfd80c0e64">tempest-ServerGroupTestJSON-249766086</nova:project>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:07:29 compute-1 nova_compute[182713]:         <nova:port uuid="38eb4aad-1fbd-420b-803e-69bb30d226c4">
Jan 22 00:07:29 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <system>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <entry name="serial">3c1aab4c-913f-4be7-a27b-a763fb7cec97</entry>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <entry name="uuid">3c1aab4c-913f-4be7-a27b-a763fb7cec97</entry>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     </system>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   <os>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   </os>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   <features>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   </features>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk.config"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:25:fb:14"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <target dev="tap38eb4aad-1f"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/console.log" append="off"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <video>
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     </video>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:07:29 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:07:29 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:07:29 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:07:29 compute-1 nova_compute[182713]: </domain>
Jan 22 00:07:29 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.751 182717 DEBUG nova.compute.manager [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Preparing to wait for external event network-vif-plugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.751 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquiring lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.752 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.752 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.754 182717 DEBUG nova.virt.libvirt.vif [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:07:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-2118778170',display_name='tempest-ServerGroupTestJSON-server-2118778170',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-2118778170',id=103,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='40c52c1a71294070a5448bbfd80c0e64',ramdisk_id='',reservation_id='r-8j03ubbl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-249766086',owner_user_name='tempest-ServerGroupTestJSON-
249766086-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:07:26Z,user_data=None,user_id='03ac6c0f8ea448daaee61484ec6b3408',uuid=3c1aab4c-913f-4be7-a27b-a763fb7cec97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "address": "fa:16:3e:25:fb:14", "network": {"id": "4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-989922694-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40c52c1a71294070a5448bbfd80c0e64", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38eb4aad-1f", "ovs_interfaceid": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.754 182717 DEBUG nova.network.os_vif_util [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Converting VIF {"id": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "address": "fa:16:3e:25:fb:14", "network": {"id": "4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-989922694-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40c52c1a71294070a5448bbfd80c0e64", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38eb4aad-1f", "ovs_interfaceid": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.755 182717 DEBUG nova.network.os_vif_util [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:fb:14,bridge_name='br-int',has_traffic_filtering=True,id=38eb4aad-1fbd-420b-803e-69bb30d226c4,network=Network(4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38eb4aad-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.756 182717 DEBUG os_vif [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:fb:14,bridge_name='br-int',has_traffic_filtering=True,id=38eb4aad-1fbd-420b-803e-69bb30d226c4,network=Network(4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38eb4aad-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.757 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.758 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.759 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.764 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.765 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap38eb4aad-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.766 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap38eb4aad-1f, col_values=(('external_ids', {'iface-id': '38eb4aad-1fbd-420b-803e-69bb30d226c4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:25:fb:14', 'vm-uuid': '3c1aab4c-913f-4be7-a27b-a763fb7cec97'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:29 compute-1 NetworkManager[54952]: <info>  [1769040449.7696] manager: (tap38eb4aad-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.770 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.773 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.821 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.822 182717 INFO os_vif [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:fb:14,bridge_name='br-int',has_traffic_filtering=True,id=38eb4aad-1fbd-420b-803e-69bb30d226c4,network=Network(4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38eb4aad-1f')
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.918 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.918 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.919 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] No VIF found with MAC fa:16:3e:25:fb:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:07:29 compute-1 nova_compute[182713]: 2026-01-22 00:07:29.919 182717 INFO nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Using config drive
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.353 182717 INFO nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Creating config drive at /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk.config
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.362 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp044mq7fb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.499 182717 DEBUG oslo_concurrency.processutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp044mq7fb" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:30 compute-1 kernel: tap38eb4aad-1f: entered promiscuous mode
Jan 22 00:07:30 compute-1 NetworkManager[54952]: <info>  [1769040450.5790] manager: (tap38eb4aad-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/194)
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.578 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:30 compute-1 ovn_controller[94841]: 2026-01-22T00:07:30Z|00398|binding|INFO|Claiming lport 38eb4aad-1fbd-420b-803e-69bb30d226c4 for this chassis.
Jan 22 00:07:30 compute-1 ovn_controller[94841]: 2026-01-22T00:07:30Z|00399|binding|INFO|38eb4aad-1fbd-420b-803e-69bb30d226c4: Claiming fa:16:3e:25:fb:14 10.100.0.12
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.587 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:fb:14 10.100.0.12'], port_security=['fa:16:3e:25:fb:14 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3c1aab4c-913f-4be7-a27b-a763fb7cec97', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40c52c1a71294070a5448bbfd80c0e64', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c51487e9-5131-485f-b023-911e268f5378', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86a317d2-1b32-45a7-bad1-78a61cb0921a, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=38eb4aad-1fbd-420b-803e-69bb30d226c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.588 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 38eb4aad-1fbd-420b-803e-69bb30d226c4 in datapath 4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134 bound to our chassis
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.590 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134
Jan 22 00:07:30 compute-1 ovn_controller[94841]: 2026-01-22T00:07:30Z|00400|binding|INFO|Setting lport 38eb4aad-1fbd-420b-803e-69bb30d226c4 ovn-installed in OVS
Jan 22 00:07:30 compute-1 ovn_controller[94841]: 2026-01-22T00:07:30Z|00401|binding|INFO|Setting lport 38eb4aad-1fbd-420b-803e-69bb30d226c4 up in Southbound
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.594 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.596 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.604 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2794e3be-57c3-4926-a03f-743736c1cfed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.605 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4fec6f89-31 in ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.607 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4fec6f89-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.607 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3a7d1a99-52f7-49a6-bc81-fcf88dc2bb5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.608 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7954139c-ab94-4521-b2f2-c9581c8ecd56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 systemd-udevd[226905]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.626 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[a1684cfc-8bd3-4992-a7ad-8c589bdce5bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 systemd-machined[153970]: New machine qemu-45-instance-00000067.
Jan 22 00:07:30 compute-1 NetworkManager[54952]: <info>  [1769040450.6415] device (tap38eb4aad-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:07:30 compute-1 NetworkManager[54952]: <info>  [1769040450.6421] device (tap38eb4aad-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:07:30 compute-1 systemd[1]: Started Virtual Machine qemu-45-instance-00000067.
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.650 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6dbbd094-852e-41fe-a271-eab26b4fdf22]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.695 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[9d414e8f-24d9-462f-bb40-3b0776343ad8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.702 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[512a64f3-df74-4f11-82e5-dfa10e4d2254]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 NetworkManager[54952]: <info>  [1769040450.7045] manager: (tap4fec6f89-30): new Veth device (/org/freedesktop/NetworkManager/Devices/195)
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.740 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e53fdcce-ee19-4e30-9213-fd5db19d9d58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.744 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f521020e-241a-412a-9a1b-612e85041a1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 NetworkManager[54952]: <info>  [1769040450.7693] device (tap4fec6f89-30): carrier: link connected
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.778 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1ccdf8-0ddc-4c40-9527-174ec775c745]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.802 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[192bcc73-0b13-49a3-b639-f14dfff9da5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4fec6f89-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:9e:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502342, 'reachable_time': 37554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226937, 'error': None, 'target': 'ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.827 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0d6a9fd3-0b28-4210-a7f9-3934a74607f9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:9e16'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 502342, 'tstamp': 502342}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226938, 'error': None, 'target': 'ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.849 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e94b4c93-ddc1-4638-b7e4-8bc293fc2b87]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4fec6f89-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:9e:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502342, 'reachable_time': 37554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226939, 'error': None, 'target': 'ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.882 182717 DEBUG nova.compute.manager [req-1ced5c8d-3002-4d3a-b0ab-b6b50eeb6e29 req-ed811e85-7a88-4c2c-9dba-479c360d72e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Received event network-vif-plugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.882 182717 DEBUG oslo_concurrency.lockutils [req-1ced5c8d-3002-4d3a-b0ab-b6b50eeb6e29 req-ed811e85-7a88-4c2c-9dba-479c360d72e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.883 182717 DEBUG oslo_concurrency.lockutils [req-1ced5c8d-3002-4d3a-b0ab-b6b50eeb6e29 req-ed811e85-7a88-4c2c-9dba-479c360d72e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.883 182717 DEBUG oslo_concurrency.lockutils [req-1ced5c8d-3002-4d3a-b0ab-b6b50eeb6e29 req-ed811e85-7a88-4c2c-9dba-479c360d72e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.884 182717 DEBUG nova.compute.manager [req-1ced5c8d-3002-4d3a-b0ab-b6b50eeb6e29 req-ed811e85-7a88-4c2c-9dba-479c360d72e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Processing event network-vif-plugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.898 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4b84e10a-ed12-48c2-bac8-f5b2c9a7a14c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.977 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040450.9765277, 3c1aab4c-913f-4be7-a27b-a763fb7cec97 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.977 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] VM Started (Lifecycle Event)
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.979 182717 DEBUG nova.compute.manager [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.982 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.983 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[14bd0a82-bfc8-40f3-b3da-a0e25ab6186c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.984 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4fec6f89-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.984 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:07:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:30.985 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4fec6f89-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:30 compute-1 nova_compute[182713]: 2026-01-22 00:07:30.986 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:30 compute-1 NetworkManager[54952]: <info>  [1769040450.9871] manager: (tap4fec6f89-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/196)
Jan 22 00:07:30 compute-1 kernel: tap4fec6f89-30: entered promiscuous mode
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:31.002 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4fec6f89-30, col_values=(('external_ids', {'iface-id': '0c8719da-243f-480a-bf4f-471b440abe6c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.002 182717 INFO nova.virt.libvirt.driver [-] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Instance spawned successfully.
Jan 22 00:07:31 compute-1 ovn_controller[94841]: 2026-01-22T00:07:31Z|00402|binding|INFO|Releasing lport 0c8719da-243f-480a-bf4f-471b440abe6c from this chassis (sb_readonly=0)
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.005 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:31.005 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:31.006 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[427070f2-db70-413a-a0d1-02b3f772bb30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.006 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:31.007 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134.pid.haproxy
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:07:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:31.007 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134', 'env', 'PROCESS_TAG=haproxy-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.007 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.011 182717 DEBUG nova.network.neutron [req-3b7b10a7-4c50-4f7b-a87b-6cfa5883665e req-fd0687a8-de4c-42fd-975c-b1c0cc184977 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Updated VIF entry in instance network info cache for port 38eb4aad-1fbd-420b-803e-69bb30d226c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.011 182717 DEBUG nova.network.neutron [req-3b7b10a7-4c50-4f7b-a87b-6cfa5883665e req-fd0687a8-de4c-42fd-975c-b1c0cc184977 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Updating instance_info_cache with network_info: [{"id": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "address": "fa:16:3e:25:fb:14", "network": {"id": "4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-989922694-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40c52c1a71294070a5448bbfd80c0e64", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38eb4aad-1f", "ovs_interfaceid": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.014 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.016 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.051 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.052 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040450.9774892, 3c1aab4c-913f-4be7-a27b-a763fb7cec97 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.052 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] VM Paused (Lifecycle Event)
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.054 182717 DEBUG oslo_concurrency.lockutils [req-3b7b10a7-4c50-4f7b-a87b-6cfa5883665e req-fd0687a8-de4c-42fd-975c-b1c0cc184977 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3c1aab4c-913f-4be7-a27b-a763fb7cec97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.057 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.057 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.057 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.057 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.058 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.058 182717 DEBUG nova.virt.libvirt.driver [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.086 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.090 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040450.9815223, 3c1aab4c-913f-4be7-a27b-a763fb7cec97 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.090 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] VM Resumed (Lifecycle Event)
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.123 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.126 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.174 182717 INFO nova.compute.manager [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Took 4.29 seconds to spawn the instance on the hypervisor.
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.174 182717 DEBUG nova.compute.manager [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.181 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.270 182717 INFO nova.compute.manager [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Took 5.12 seconds to build instance.
Jan 22 00:07:31 compute-1 nova_compute[182713]: 2026-01-22 00:07:31.295 182717 DEBUG oslo_concurrency.lockutils [None req-a5bd4c09-3283-4130-b989-3b6a1760b2b4 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.492s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:31 compute-1 podman[226978]: 2026-01-22 00:07:31.398397448 +0000 UTC m=+0.047781422 container create 6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 00:07:31 compute-1 systemd[1]: Started libpod-conmon-6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4.scope.
Jan 22 00:07:31 compute-1 podman[226978]: 2026-01-22 00:07:31.371496171 +0000 UTC m=+0.020880165 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:07:31 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:07:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a5f9ddd84069896effb20f75a10ac772ed30c33cd110f0452d1286dfa0f45968/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:07:31 compute-1 podman[226978]: 2026-01-22 00:07:31.514747089 +0000 UTC m=+0.164131083 container init 6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:07:31 compute-1 podman[226978]: 2026-01-22 00:07:31.526829026 +0000 UTC m=+0.176213000 container start 6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 00:07:31 compute-1 neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134[226994]: [NOTICE]   (226998) : New worker (227000) forked
Jan 22 00:07:31 compute-1 neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134[226994]: [NOTICE]   (226998) : Loading success.
Jan 22 00:07:32 compute-1 nova_compute[182713]: 2026-01-22 00:07:32.202 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.173 182717 DEBUG nova.compute.manager [req-88372045-c0f1-42a6-98e4-b379540520f4 req-00896317-cabe-47d5-ac9e-3f8b023f006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Received event network-vif-plugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.174 182717 DEBUG oslo_concurrency.lockutils [req-88372045-c0f1-42a6-98e4-b379540520f4 req-00896317-cabe-47d5-ac9e-3f8b023f006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.174 182717 DEBUG oslo_concurrency.lockutils [req-88372045-c0f1-42a6-98e4-b379540520f4 req-00896317-cabe-47d5-ac9e-3f8b023f006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.174 182717 DEBUG oslo_concurrency.lockutils [req-88372045-c0f1-42a6-98e4-b379540520f4 req-00896317-cabe-47d5-ac9e-3f8b023f006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.175 182717 DEBUG nova.compute.manager [req-88372045-c0f1-42a6-98e4-b379540520f4 req-00896317-cabe-47d5-ac9e-3f8b023f006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] No waiting events found dispatching network-vif-plugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.175 182717 WARNING nova.compute.manager [req-88372045-c0f1-42a6-98e4-b379540520f4 req-00896317-cabe-47d5-ac9e-3f8b023f006a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Received unexpected event network-vif-plugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 for instance with vm_state active and task_state None.
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.302 182717 DEBUG oslo_concurrency.lockutils [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquiring lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.302 182717 DEBUG oslo_concurrency.lockutils [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.303 182717 DEBUG oslo_concurrency.lockutils [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquiring lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.303 182717 DEBUG oslo_concurrency.lockutils [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.303 182717 DEBUG oslo_concurrency.lockutils [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.315 182717 INFO nova.compute.manager [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Terminating instance
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.325 182717 DEBUG nova.compute.manager [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:07:33 compute-1 kernel: tap38eb4aad-1f (unregistering): left promiscuous mode
Jan 22 00:07:33 compute-1 NetworkManager[54952]: <info>  [1769040453.3425] device (tap38eb4aad-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:07:33 compute-1 ovn_controller[94841]: 2026-01-22T00:07:33Z|00403|binding|INFO|Releasing lport 38eb4aad-1fbd-420b-803e-69bb30d226c4 from this chassis (sb_readonly=0)
Jan 22 00:07:33 compute-1 ovn_controller[94841]: 2026-01-22T00:07:33Z|00404|binding|INFO|Setting lport 38eb4aad-1fbd-420b-803e-69bb30d226c4 down in Southbound
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.351 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-1 ovn_controller[94841]: 2026-01-22T00:07:33Z|00405|binding|INFO|Removing iface tap38eb4aad-1f ovn-installed in OVS
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.354 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.363 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.365 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:fb:14 10.100.0.12'], port_security=['fa:16:3e:25:fb:14 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3c1aab4c-913f-4be7-a27b-a763fb7cec97', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40c52c1a71294070a5448bbfd80c0e64', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c51487e9-5131-485f-b023-911e268f5378', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86a317d2-1b32-45a7-bad1-78a61cb0921a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=38eb4aad-1fbd-420b-803e-69bb30d226c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.370 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 38eb4aad-1fbd-420b-803e-69bb30d226c4 in datapath 4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134 unbound from our chassis
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.374 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.375 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6dd2cb90-5fd1-4c64-b597-ba0f3a670836]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.376 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134 namespace which is not needed anymore
Jan 22 00:07:33 compute-1 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000067.scope: Deactivated successfully.
Jan 22 00:07:33 compute-1 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000067.scope: Consumed 2.595s CPU time.
Jan 22 00:07:33 compute-1 systemd-machined[153970]: Machine qemu-45-instance-00000067 terminated.
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.554 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.559 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-1 neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134[226994]: [NOTICE]   (226998) : haproxy version is 2.8.14-c23fe91
Jan 22 00:07:33 compute-1 neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134[226994]: [NOTICE]   (226998) : path to executable is /usr/sbin/haproxy
Jan 22 00:07:33 compute-1 neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134[226994]: [WARNING]  (226998) : Exiting Master process...
Jan 22 00:07:33 compute-1 neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134[226994]: [ALERT]    (226998) : Current worker (227000) exited with code 143 (Terminated)
Jan 22 00:07:33 compute-1 neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134[226994]: [WARNING]  (226998) : All workers exited. Exiting... (0)
Jan 22 00:07:33 compute-1 systemd[1]: libpod-6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4.scope: Deactivated successfully.
Jan 22 00:07:33 compute-1 podman[227030]: 2026-01-22 00:07:33.57361137 +0000 UTC m=+0.075946066 container died 6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.617 182717 INFO nova.virt.libvirt.driver [-] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Instance destroyed successfully.
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.618 182717 DEBUG nova.objects.instance [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lazy-loading 'resources' on Instance uuid 3c1aab4c-913f-4be7-a27b-a763fb7cec97 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:33 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4-userdata-shm.mount: Deactivated successfully.
Jan 22 00:07:33 compute-1 systemd[1]: var-lib-containers-storage-overlay-a5f9ddd84069896effb20f75a10ac772ed30c33cd110f0452d1286dfa0f45968-merged.mount: Deactivated successfully.
Jan 22 00:07:33 compute-1 podman[227030]: 2026-01-22 00:07:33.634361334 +0000 UTC m=+0.136696030 container cleanup 6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.638 182717 DEBUG nova.virt.libvirt.vif [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:07:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-2118778170',display_name='tempest-ServerGroupTestJSON-server-2118778170',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-2118778170',id=103,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='40c52c1a71294070a5448bbfd80c0e64',ramdisk_id='',reservation_id='r-8j03ubbl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerGroupTestJSON-249766086',owner_user_name='tempest-ServerGroupTestJSON-249766086-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:07:31Z,user_data=None,user_id='03ac6c0f8ea448daaee61484ec6b3408',uuid=3c1aab4c-913f-4be7-a27b-a763fb7cec97,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "address": "fa:16:3e:25:fb:14", "network": {"id": "4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-989922694-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40c52c1a71294070a5448bbfd80c0e64", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38eb4aad-1f", "ovs_interfaceid": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.639 182717 DEBUG nova.network.os_vif_util [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Converting VIF {"id": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "address": "fa:16:3e:25:fb:14", "network": {"id": "4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-989922694-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40c52c1a71294070a5448bbfd80c0e64", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38eb4aad-1f", "ovs_interfaceid": "38eb4aad-1fbd-420b-803e-69bb30d226c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.640 182717 DEBUG nova.network.os_vif_util [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:fb:14,bridge_name='br-int',has_traffic_filtering=True,id=38eb4aad-1fbd-420b-803e-69bb30d226c4,network=Network(4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38eb4aad-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:33 compute-1 systemd[1]: libpod-conmon-6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4.scope: Deactivated successfully.
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.641 182717 DEBUG os_vif [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:fb:14,bridge_name='br-int',has_traffic_filtering=True,id=38eb4aad-1fbd-420b-803e-69bb30d226c4,network=Network(4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38eb4aad-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.643 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.644 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38eb4aad-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.646 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.649 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.653 182717 INFO os_vif [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:fb:14,bridge_name='br-int',has_traffic_filtering=True,id=38eb4aad-1fbd-420b-803e-69bb30d226c4,network=Network(4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38eb4aad-1f')
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.654 182717 INFO nova.virt.libvirt.driver [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Deleting instance files /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97_del
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.655 182717 INFO nova.virt.libvirt.driver [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Deletion of /var/lib/nova/instances/3c1aab4c-913f-4be7-a27b-a763fb7cec97_del complete
Jan 22 00:07:33 compute-1 podman[227074]: 2026-01-22 00:07:33.704921635 +0000 UTC m=+0.040560362 container remove 6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.709 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[987d2ce5-a498-49a6-bca7-5f5b3a697084]: (4, ('Thu Jan 22 12:07:33 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134 (6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4)\n6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4\nThu Jan 22 12:07:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134 (6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4)\n6d33f4919d511ee8d858ec17430948982980e692339d5256566e9432d3d7dcf4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.711 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[44f22719-7e19-4926-a3bf-f1b505c0934c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.712 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4fec6f89-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.714 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-1 kernel: tap4fec6f89-30: left promiscuous mode
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.716 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.718 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b018a256-18d8-47c7-85d8-10bf1bf40f10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.728 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.733 182717 INFO nova.compute.manager [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.734 182717 DEBUG oslo.service.loopingcall [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.735 182717 DEBUG nova.compute.manager [-] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:07:33 compute-1 nova_compute[182713]: 2026-01-22 00:07:33.735 182717 DEBUG nova.network.neutron [-] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.737 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4bbf5dae-4444-4e18-af73-3fcba9b0927b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.738 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf1bff3-22fc-4749-b3cf-65aa82eb95a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.755 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e187e4a7-aba5-495c-afcc-40fed020855d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502334, 'reachable_time': 42299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227089, 'error': None, 'target': 'ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:33 compute-1 systemd[1]: run-netns-ovnmeta\x2d4fec6f89\x2d3f21\x2d4bdf\x2da7ae\x2d3fa3d94f6134.mount: Deactivated successfully.
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.758 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4fec6f89-3f21-4bdf-a7ae-3fa3d94f6134 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:07:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:07:33.758 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[0a3e4b28-fa42-4384-8bb8-0ac2a682c9d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:34 compute-1 nova_compute[182713]: 2026-01-22 00:07:34.436 182717 DEBUG nova.network.neutron [-] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:34 compute-1 nova_compute[182713]: 2026-01-22 00:07:34.463 182717 INFO nova.compute.manager [-] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Took 0.73 seconds to deallocate network for instance.
Jan 22 00:07:34 compute-1 nova_compute[182713]: 2026-01-22 00:07:34.531 182717 DEBUG nova.compute.manager [req-8e49fc53-313b-4945-833b-25c857c6e52e req-c472f809-0bec-4ae9-9a73-b144ab8f4cd0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Received event network-vif-deleted-38eb4aad-1fbd-420b-803e-69bb30d226c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:34 compute-1 nova_compute[182713]: 2026-01-22 00:07:34.628 182717 DEBUG oslo_concurrency.lockutils [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:34 compute-1 nova_compute[182713]: 2026-01-22 00:07:34.629 182717 DEBUG oslo_concurrency.lockutils [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:34 compute-1 nova_compute[182713]: 2026-01-22 00:07:34.711 182717 DEBUG nova.compute.provider_tree [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:07:34 compute-1 nova_compute[182713]: 2026-01-22 00:07:34.738 182717 DEBUG nova.scheduler.client.report [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:07:34 compute-1 nova_compute[182713]: 2026-01-22 00:07:34.788 182717 DEBUG oslo_concurrency.lockutils [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:34 compute-1 nova_compute[182713]: 2026-01-22 00:07:34.825 182717 INFO nova.scheduler.client.report [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Deleted allocations for instance 3c1aab4c-913f-4be7-a27b-a763fb7cec97
Jan 22 00:07:34 compute-1 nova_compute[182713]: 2026-01-22 00:07:34.918 182717 DEBUG oslo_concurrency.lockutils [None req-b4f702bf-1fe3-4ae6-b3f4-710924636066 03ac6c0f8ea448daaee61484ec6b3408 40c52c1a71294070a5448bbfd80c0e64 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.539 182717 DEBUG nova.compute.manager [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Received event network-vif-unplugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.540 182717 DEBUG oslo_concurrency.lockutils [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.540 182717 DEBUG oslo_concurrency.lockutils [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.541 182717 DEBUG oslo_concurrency.lockutils [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.541 182717 DEBUG nova.compute.manager [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] No waiting events found dispatching network-vif-unplugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.541 182717 WARNING nova.compute.manager [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Received unexpected event network-vif-unplugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 for instance with vm_state deleted and task_state None.
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.541 182717 DEBUG nova.compute.manager [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Received event network-vif-plugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.542 182717 DEBUG oslo_concurrency.lockutils [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.542 182717 DEBUG oslo_concurrency.lockutils [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.542 182717 DEBUG oslo_concurrency.lockutils [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c1aab4c-913f-4be7-a27b-a763fb7cec97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.543 182717 DEBUG nova.compute.manager [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] No waiting events found dispatching network-vif-plugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:35 compute-1 nova_compute[182713]: 2026-01-22 00:07:35.543 182717 WARNING nova.compute.manager [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Received unexpected event network-vif-plugged-38eb4aad-1fbd-420b-803e-69bb30d226c4 for instance with vm_state deleted and task_state None.
Jan 22 00:07:36 compute-1 nova_compute[182713]: 2026-01-22 00:07:36.713 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:36 compute-1 nova_compute[182713]: 2026-01-22 00:07:36.918 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:37 compute-1 nova_compute[182713]: 2026-01-22 00:07:37.205 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:37 compute-1 nova_compute[182713]: 2026-01-22 00:07:37.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:37 compute-1 nova_compute[182713]: 2026-01-22 00:07:37.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:37 compute-1 nova_compute[182713]: 2026-01-22 00:07:37.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:37 compute-1 nova_compute[182713]: 2026-01-22 00:07:37.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:37 compute-1 nova_compute[182713]: 2026-01-22 00:07:37.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:07:38 compute-1 nova_compute[182713]: 2026-01-22 00:07:38.646 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:38 compute-1 nova_compute[182713]: 2026-01-22 00:07:38.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:39 compute-1 podman[227090]: 2026-01-22 00:07:39.584369758 +0000 UTC m=+0.075847613 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:07:39 compute-1 nova_compute[182713]: 2026-01-22 00:07:39.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:41 compute-1 nova_compute[182713]: 2026-01-22 00:07:41.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:41 compute-1 nova_compute[182713]: 2026-01-22 00:07:41.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:07:41 compute-1 nova_compute[182713]: 2026-01-22 00:07:41.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:07:42 compute-1 nova_compute[182713]: 2026-01-22 00:07:42.204 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:42 compute-1 nova_compute[182713]: 2026-01-22 00:07:42.205 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:42 compute-1 nova_compute[182713]: 2026-01-22 00:07:42.205 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:07:42 compute-1 nova_compute[182713]: 2026-01-22 00:07:42.207 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 21970bbd-36b4-495d-8819-49ef2276a912 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:42 compute-1 nova_compute[182713]: 2026-01-22 00:07:42.253 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:42 compute-1 podman[227111]: 2026-01-22 00:07:42.622129151 +0000 UTC m=+0.116984112 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1755695350, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, architecture=x86_64, vcs-type=git)
Jan 22 00:07:43 compute-1 nova_compute[182713]: 2026-01-22 00:07:43.649 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:44 compute-1 ovn_controller[94841]: 2026-01-22T00:07:44Z|00406|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.359 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Updating instance_info_cache with network_info: [{"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.377 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.392 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.393 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.394 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.394 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.426 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.427 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.428 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.428 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.539 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.611 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.612 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.682 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.818 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.820 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5507MB free_disk=73.27495956420898GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.820 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:44 compute-1 nova_compute[182713]: 2026-01-22 00:07:44.820 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:45 compute-1 nova_compute[182713]: 2026-01-22 00:07:45.110 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 21970bbd-36b4-495d-8819-49ef2276a912 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:07:45 compute-1 nova_compute[182713]: 2026-01-22 00:07:45.111 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:07:45 compute-1 nova_compute[182713]: 2026-01-22 00:07:45.111 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:07:45 compute-1 nova_compute[182713]: 2026-01-22 00:07:45.160 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:07:45 compute-1 nova_compute[182713]: 2026-01-22 00:07:45.176 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:07:45 compute-1 nova_compute[182713]: 2026-01-22 00:07:45.202 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:07:45 compute-1 nova_compute[182713]: 2026-01-22 00:07:45.203 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:45 compute-1 nova_compute[182713]: 2026-01-22 00:07:45.204 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:45 compute-1 nova_compute[182713]: 2026-01-22 00:07:45.204 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:07:47 compute-1 nova_compute[182713]: 2026-01-22 00:07:47.257 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:48 compute-1 nova_compute[182713]: 2026-01-22 00:07:48.616 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040453.6152358, 3c1aab4c-913f-4be7-a27b-a763fb7cec97 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:48 compute-1 nova_compute[182713]: 2026-01-22 00:07:48.616 182717 INFO nova.compute.manager [-] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] VM Stopped (Lifecycle Event)
Jan 22 00:07:48 compute-1 nova_compute[182713]: 2026-01-22 00:07:48.641 182717 DEBUG nova.compute.manager [None req-c0f09647-5745-447f-a5fb-25520c7fd063 - - - - - -] [instance: 3c1aab4c-913f-4be7-a27b-a763fb7cec97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:48 compute-1 nova_compute[182713]: 2026-01-22 00:07:48.652 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:52 compute-1 nova_compute[182713]: 2026-01-22 00:07:52.260 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:53 compute-1 podman[227141]: 2026-01-22 00:07:53.629821966 +0000 UTC m=+0.095547891 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:07:53 compute-1 podman[227140]: 2026-01-22 00:07:53.639642374 +0000 UTC m=+0.112028131 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:07:53 compute-1 nova_compute[182713]: 2026-01-22 00:07:53.654 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:55 compute-1 nova_compute[182713]: 2026-01-22 00:07:55.138 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:55 compute-1 ovn_controller[94841]: 2026-01-22T00:07:55Z|00407|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 22 00:07:55 compute-1 nova_compute[182713]: 2026-01-22 00:07:55.368 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:57 compute-1 nova_compute[182713]: 2026-01-22 00:07:57.299 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:58 compute-1 nova_compute[182713]: 2026-01-22 00:07:58.442 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:58 compute-1 podman[227188]: 2026-01-22 00:07:58.597571207 +0000 UTC m=+0.072209292 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:07:58 compute-1 podman[227187]: 2026-01-22 00:07:58.607489079 +0000 UTC m=+0.091538540 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:07:58 compute-1 nova_compute[182713]: 2026-01-22 00:07:58.656 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.273 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Acquiring lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.274 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.303 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.307 182717 DEBUG nova.compute.manager [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.447 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.448 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.454 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.454 182717 INFO nova.compute.claims [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.727 182717 DEBUG nova.compute.provider_tree [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.747 182717 DEBUG nova.scheduler.client.report [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.775 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.777 182717 DEBUG nova.compute.manager [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.855 182717 DEBUG nova.compute.manager [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.856 182717 DEBUG nova.network.neutron [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.889 182717 INFO nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:08:02 compute-1 nova_compute[182713]: 2026-01-22 00:08:02.910 182717 DEBUG nova.compute.manager [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:08:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:03.015 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:03.016 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:03.017 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.033 182717 DEBUG nova.compute.manager [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.035 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.035 182717 INFO nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Creating image(s)
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.036 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Acquiring lock "/var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.036 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "/var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.037 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "/var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.056 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.116 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.117 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.118 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.129 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.187 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.188 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.222 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.223 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.224 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.245 182717 DEBUG nova.policy [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ec14075ca8f491cb7e481e95011074a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '356ef6d1220340c1a290e92a77db430d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.283 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.283 182717 DEBUG nova.virt.disk.api [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Checking if we can resize image /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.284 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.339 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.340 182717 DEBUG nova.virt.disk.api [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Cannot resize image /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.340 182717 DEBUG nova.objects.instance [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lazy-loading 'migration_context' on Instance uuid c8d10cc4-7d81-41cd-8ccd-393ca4932d92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.361 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.361 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Ensure instance console log exists: /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.362 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.362 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.362 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:03 compute-1 nova_compute[182713]: 2026-01-22 00:08:03.659 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:04 compute-1 nova_compute[182713]: 2026-01-22 00:08:04.290 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:05 compute-1 nova_compute[182713]: 2026-01-22 00:08:05.428 182717 DEBUG nova.network.neutron [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Successfully created port: 867caf1d-7131-4085-9939-8b72e2f5d448 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:08:06 compute-1 ovn_controller[94841]: 2026-01-22T00:08:06Z|00408|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 22 00:08:06 compute-1 nova_compute[182713]: 2026-01-22 00:08:06.662 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:07 compute-1 nova_compute[182713]: 2026-01-22 00:08:07.016 182717 DEBUG nova.network.neutron [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Successfully updated port: 867caf1d-7131-4085-9939-8b72e2f5d448 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:08:07 compute-1 nova_compute[182713]: 2026-01-22 00:08:07.069 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Acquiring lock "refresh_cache-c8d10cc4-7d81-41cd-8ccd-393ca4932d92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:08:07 compute-1 nova_compute[182713]: 2026-01-22 00:08:07.069 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Acquired lock "refresh_cache-c8d10cc4-7d81-41cd-8ccd-393ca4932d92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:08:07 compute-1 nova_compute[182713]: 2026-01-22 00:08:07.070 182717 DEBUG nova.network.neutron [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:08:07 compute-1 nova_compute[182713]: 2026-01-22 00:08:07.154 182717 DEBUG nova.compute.manager [req-bb618c44-4187-4fa8-9ad7-d24199e370d9 req-e77f8038-e482-4e2b-903a-e5312048dafd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Received event network-changed-867caf1d-7131-4085-9939-8b72e2f5d448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:07 compute-1 nova_compute[182713]: 2026-01-22 00:08:07.154 182717 DEBUG nova.compute.manager [req-bb618c44-4187-4fa8-9ad7-d24199e370d9 req-e77f8038-e482-4e2b-903a-e5312048dafd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Refreshing instance network info cache due to event network-changed-867caf1d-7131-4085-9939-8b72e2f5d448. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:08:07 compute-1 nova_compute[182713]: 2026-01-22 00:08:07.155 182717 DEBUG oslo_concurrency.lockutils [req-bb618c44-4187-4fa8-9ad7-d24199e370d9 req-e77f8038-e482-4e2b-903a-e5312048dafd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c8d10cc4-7d81-41cd-8ccd-393ca4932d92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:08:07 compute-1 nova_compute[182713]: 2026-01-22 00:08:07.289 182717 DEBUG nova.network.neutron [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:08:07 compute-1 nova_compute[182713]: 2026-01-22 00:08:07.343 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.404 182717 DEBUG nova.network.neutron [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Updating instance_info_cache with network_info: [{"id": "867caf1d-7131-4085-9939-8b72e2f5d448", "address": "fa:16:3e:ce:92:09", "network": {"id": "220179ff-1200-4907-bb65-b64dfb008873", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1717445310-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "356ef6d1220340c1a290e92a77db430d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap867caf1d-71", "ovs_interfaceid": "867caf1d-7131-4085-9939-8b72e2f5d448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.436 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Releasing lock "refresh_cache-c8d10cc4-7d81-41cd-8ccd-393ca4932d92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.437 182717 DEBUG nova.compute.manager [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Instance network_info: |[{"id": "867caf1d-7131-4085-9939-8b72e2f5d448", "address": "fa:16:3e:ce:92:09", "network": {"id": "220179ff-1200-4907-bb65-b64dfb008873", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1717445310-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "356ef6d1220340c1a290e92a77db430d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap867caf1d-71", "ovs_interfaceid": "867caf1d-7131-4085-9939-8b72e2f5d448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.437 182717 DEBUG oslo_concurrency.lockutils [req-bb618c44-4187-4fa8-9ad7-d24199e370d9 req-e77f8038-e482-4e2b-903a-e5312048dafd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c8d10cc4-7d81-41cd-8ccd-393ca4932d92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.437 182717 DEBUG nova.network.neutron [req-bb618c44-4187-4fa8-9ad7-d24199e370d9 req-e77f8038-e482-4e2b-903a-e5312048dafd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Refreshing network info cache for port 867caf1d-7131-4085-9939-8b72e2f5d448 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.442 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Start _get_guest_xml network_info=[{"id": "867caf1d-7131-4085-9939-8b72e2f5d448", "address": "fa:16:3e:ce:92:09", "network": {"id": "220179ff-1200-4907-bb65-b64dfb008873", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1717445310-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "356ef6d1220340c1a290e92a77db430d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap867caf1d-71", "ovs_interfaceid": "867caf1d-7131-4085-9939-8b72e2f5d448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.447 182717 WARNING nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.452 182717 DEBUG nova.virt.libvirt.host [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.453 182717 DEBUG nova.virt.libvirt.host [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.456 182717 DEBUG nova.virt.libvirt.host [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.457 182717 DEBUG nova.virt.libvirt.host [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.459 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.459 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.460 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.460 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.461 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.461 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.461 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.462 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.462 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.462 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.463 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.463 182717 DEBUG nova.virt.hardware [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.469 182717 DEBUG nova.virt.libvirt.vif [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:08:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1209954763',display_name='tempest-ServerMetadataTestJSON-server-1209954763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1209954763',id=106,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='356ef6d1220340c1a290e92a77db430d',ramdisk_id='',reservation_id='r-hbm03n0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-1172998385',owner_user_name='tempest-ServerMetadataTes
tJSON-1172998385-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:02Z,user_data=None,user_id='4ec14075ca8f491cb7e481e95011074a',uuid=c8d10cc4-7d81-41cd-8ccd-393ca4932d92,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "867caf1d-7131-4085-9939-8b72e2f5d448", "address": "fa:16:3e:ce:92:09", "network": {"id": "220179ff-1200-4907-bb65-b64dfb008873", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1717445310-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "356ef6d1220340c1a290e92a77db430d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap867caf1d-71", "ovs_interfaceid": "867caf1d-7131-4085-9939-8b72e2f5d448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.470 182717 DEBUG nova.network.os_vif_util [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Converting VIF {"id": "867caf1d-7131-4085-9939-8b72e2f5d448", "address": "fa:16:3e:ce:92:09", "network": {"id": "220179ff-1200-4907-bb65-b64dfb008873", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1717445310-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "356ef6d1220340c1a290e92a77db430d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap867caf1d-71", "ovs_interfaceid": "867caf1d-7131-4085-9939-8b72e2f5d448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.471 182717 DEBUG nova.network.os_vif_util [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:92:09,bridge_name='br-int',has_traffic_filtering=True,id=867caf1d-7131-4085-9939-8b72e2f5d448,network=Network(220179ff-1200-4907-bb65-b64dfb008873),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap867caf1d-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.473 182717 DEBUG nova.objects.instance [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lazy-loading 'pci_devices' on Instance uuid c8d10cc4-7d81-41cd-8ccd-393ca4932d92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.493 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:08:08 compute-1 nova_compute[182713]:   <uuid>c8d10cc4-7d81-41cd-8ccd-393ca4932d92</uuid>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   <name>instance-0000006a</name>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerMetadataTestJSON-server-1209954763</nova:name>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:08:08</nova:creationTime>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:08:08 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:08:08 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:08:08 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:08:08 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:08:08 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:08:08 compute-1 nova_compute[182713]:         <nova:user uuid="4ec14075ca8f491cb7e481e95011074a">tempest-ServerMetadataTestJSON-1172998385-project-member</nova:user>
Jan 22 00:08:08 compute-1 nova_compute[182713]:         <nova:project uuid="356ef6d1220340c1a290e92a77db430d">tempest-ServerMetadataTestJSON-1172998385</nova:project>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:08:08 compute-1 nova_compute[182713]:         <nova:port uuid="867caf1d-7131-4085-9939-8b72e2f5d448">
Jan 22 00:08:08 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <system>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <entry name="serial">c8d10cc4-7d81-41cd-8ccd-393ca4932d92</entry>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <entry name="uuid">c8d10cc4-7d81-41cd-8ccd-393ca4932d92</entry>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     </system>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   <os>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   </os>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   <features>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   </features>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk.config"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:ce:92:09"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <target dev="tap867caf1d-71"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/console.log" append="off"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <video>
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     </video>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:08:08 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:08:08 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:08:08 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:08:08 compute-1 nova_compute[182713]: </domain>
Jan 22 00:08:08 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.494 182717 DEBUG nova.compute.manager [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Preparing to wait for external event network-vif-plugged-867caf1d-7131-4085-9939-8b72e2f5d448 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.494 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Acquiring lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.495 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.495 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.496 182717 DEBUG nova.virt.libvirt.vif [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:08:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1209954763',display_name='tempest-ServerMetadataTestJSON-server-1209954763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1209954763',id=106,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='356ef6d1220340c1a290e92a77db430d',ramdisk_id='',reservation_id='r-hbm03n0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-1172998385',owner_user_name='tempest-ServerM
etadataTestJSON-1172998385-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:02Z,user_data=None,user_id='4ec14075ca8f491cb7e481e95011074a',uuid=c8d10cc4-7d81-41cd-8ccd-393ca4932d92,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "867caf1d-7131-4085-9939-8b72e2f5d448", "address": "fa:16:3e:ce:92:09", "network": {"id": "220179ff-1200-4907-bb65-b64dfb008873", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1717445310-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "356ef6d1220340c1a290e92a77db430d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap867caf1d-71", "ovs_interfaceid": "867caf1d-7131-4085-9939-8b72e2f5d448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.496 182717 DEBUG nova.network.os_vif_util [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Converting VIF {"id": "867caf1d-7131-4085-9939-8b72e2f5d448", "address": "fa:16:3e:ce:92:09", "network": {"id": "220179ff-1200-4907-bb65-b64dfb008873", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1717445310-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "356ef6d1220340c1a290e92a77db430d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap867caf1d-71", "ovs_interfaceid": "867caf1d-7131-4085-9939-8b72e2f5d448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.497 182717 DEBUG nova.network.os_vif_util [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:92:09,bridge_name='br-int',has_traffic_filtering=True,id=867caf1d-7131-4085-9939-8b72e2f5d448,network=Network(220179ff-1200-4907-bb65-b64dfb008873),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap867caf1d-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.498 182717 DEBUG os_vif [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:92:09,bridge_name='br-int',has_traffic_filtering=True,id=867caf1d-7131-4085-9939-8b72e2f5d448,network=Network(220179ff-1200-4907-bb65-b64dfb008873),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap867caf1d-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.498 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.499 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.499 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.504 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.504 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap867caf1d-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.505 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap867caf1d-71, col_values=(('external_ids', {'iface-id': '867caf1d-7131-4085-9939-8b72e2f5d448', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:92:09', 'vm-uuid': 'c8d10cc4-7d81-41cd-8ccd-393ca4932d92'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.509 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:08 compute-1 NetworkManager[54952]: <info>  [1769040488.5109] manager: (tap867caf1d-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.514 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.516 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.518 182717 INFO os_vif [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:92:09,bridge_name='br-int',has_traffic_filtering=True,id=867caf1d-7131-4085-9939-8b72e2f5d448,network=Network(220179ff-1200-4907-bb65-b64dfb008873),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap867caf1d-71')
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.583 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.584 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.584 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] No VIF found with MAC fa:16:3e:ce:92:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:08:08 compute-1 nova_compute[182713]: 2026-01-22 00:08:08.585 182717 INFO nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Using config drive
Jan 22 00:08:09 compute-1 nova_compute[182713]: 2026-01-22 00:08:09.584 182717 INFO nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Creating config drive at /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk.config
Jan 22 00:08:09 compute-1 nova_compute[182713]: 2026-01-22 00:08:09.593 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfb04_3g9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:09 compute-1 nova_compute[182713]: 2026-01-22 00:08:09.736 182717 DEBUG oslo_concurrency.processutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfb04_3g9" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:09 compute-1 kernel: tap867caf1d-71: entered promiscuous mode
Jan 22 00:08:09 compute-1 NetworkManager[54952]: <info>  [1769040489.8356] manager: (tap867caf1d-71): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Jan 22 00:08:09 compute-1 ovn_controller[94841]: 2026-01-22T00:08:09Z|00409|binding|INFO|Claiming lport 867caf1d-7131-4085-9939-8b72e2f5d448 for this chassis.
Jan 22 00:08:09 compute-1 ovn_controller[94841]: 2026-01-22T00:08:09Z|00410|binding|INFO|867caf1d-7131-4085-9939-8b72e2f5d448: Claiming fa:16:3e:ce:92:09 10.100.0.3
Jan 22 00:08:09 compute-1 nova_compute[182713]: 2026-01-22 00:08:09.839 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.850 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:92:09 10.100.0.3'], port_security=['fa:16:3e:ce:92:09 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c8d10cc4-7d81-41cd-8ccd-393ca4932d92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-220179ff-1200-4907-bb65-b64dfb008873', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '356ef6d1220340c1a290e92a77db430d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '03a68014-fe35-445e-b433-812230af64d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e13595a0-4fe4-42e9-9972-52aeaf08e79a, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=867caf1d-7131-4085-9939-8b72e2f5d448) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.852 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 867caf1d-7131-4085-9939-8b72e2f5d448 in datapath 220179ff-1200-4907-bb65-b64dfb008873 bound to our chassis
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.855 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 220179ff-1200-4907-bb65-b64dfb008873
Jan 22 00:08:09 compute-1 ovn_controller[94841]: 2026-01-22T00:08:09Z|00411|binding|INFO|Setting lport 867caf1d-7131-4085-9939-8b72e2f5d448 ovn-installed in OVS
Jan 22 00:08:09 compute-1 ovn_controller[94841]: 2026-01-22T00:08:09Z|00412|binding|INFO|Setting lport 867caf1d-7131-4085-9939-8b72e2f5d448 up in Southbound
Jan 22 00:08:09 compute-1 nova_compute[182713]: 2026-01-22 00:08:09.869 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.876 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6d1b4942-470f-46f9-ad09-60d033647d34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.879 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap220179ff-11 in ovnmeta-220179ff-1200-4907-bb65-b64dfb008873 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.883 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap220179ff-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.883 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ec30b6cf-cb66-4843-aeea-a5adaab6d74f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.884 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[442d4c26-b79b-4c0c-9a33-cd9fcefa8f77]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:09 compute-1 systemd-udevd[227286]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.901 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[22e88e04-2311-433f-95ae-3f0f652dd88a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:09 compute-1 systemd-machined[153970]: New machine qemu-46-instance-0000006a.
Jan 22 00:08:09 compute-1 NetworkManager[54952]: <info>  [1769040489.9137] device (tap867caf1d-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:08:09 compute-1 NetworkManager[54952]: <info>  [1769040489.9150] device (tap867caf1d-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:08:09 compute-1 systemd[1]: Started Virtual Machine qemu-46-instance-0000006a.
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.926 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ee7b9e5b-9438-40e4-a991-35729c595002]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:09 compute-1 podman[227268]: 2026-01-22 00:08:09.938813228 +0000 UTC m=+0.108251086 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.961 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b5436d88-93c3-45e4-b417-9f622d7aba73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:09.969 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[232d3bf3-b21e-421a-af8c-bbddc7f44453]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:09 compute-1 NetworkManager[54952]: <info>  [1769040489.9700] manager: (tap220179ff-10): new Veth device (/org/freedesktop/NetworkManager/Devices/199)
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.012 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[07c0a0dc-fae7-4b4d-8225-9422a1e1bbfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.015 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[17a7c2b7-a40a-4de7-b8e0-d4c755661e47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:10 compute-1 nova_compute[182713]: 2026-01-22 00:08:10.033 182717 DEBUG nova.network.neutron [req-bb618c44-4187-4fa8-9ad7-d24199e370d9 req-e77f8038-e482-4e2b-903a-e5312048dafd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Updated VIF entry in instance network info cache for port 867caf1d-7131-4085-9939-8b72e2f5d448. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:08:10 compute-1 nova_compute[182713]: 2026-01-22 00:08:10.033 182717 DEBUG nova.network.neutron [req-bb618c44-4187-4fa8-9ad7-d24199e370d9 req-e77f8038-e482-4e2b-903a-e5312048dafd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Updating instance_info_cache with network_info: [{"id": "867caf1d-7131-4085-9939-8b72e2f5d448", "address": "fa:16:3e:ce:92:09", "network": {"id": "220179ff-1200-4907-bb65-b64dfb008873", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1717445310-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "356ef6d1220340c1a290e92a77db430d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap867caf1d-71", "ovs_interfaceid": "867caf1d-7131-4085-9939-8b72e2f5d448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:10 compute-1 NetworkManager[54952]: <info>  [1769040490.0444] device (tap220179ff-10): carrier: link connected
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.052 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[105af841-2830-40a6-ab89-2769e9b7753f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:10 compute-1 nova_compute[182713]: 2026-01-22 00:08:10.064 182717 DEBUG oslo_concurrency.lockutils [req-bb618c44-4187-4fa8-9ad7-d24199e370d9 req-e77f8038-e482-4e2b-903a-e5312048dafd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c8d10cc4-7d81-41cd-8ccd-393ca4932d92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.070 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[107ee131-acdb-4679-97cc-90ffc1a9d5ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap220179ff-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:36:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506269, 'reachable_time': 37558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227328, 'error': None, 'target': 'ovnmeta-220179ff-1200-4907-bb65-b64dfb008873', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.089 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[89391d83-1ee6-4857-a585-3f3eb520eda7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe11:36ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506269, 'tstamp': 506269}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227329, 'error': None, 'target': 'ovnmeta-220179ff-1200-4907-bb65-b64dfb008873', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.110 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[419cb394-08d1-4181-8b5c-19a82b179bfb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap220179ff-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:36:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506269, 'reachable_time': 37558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227330, 'error': None, 'target': 'ovnmeta-220179ff-1200-4907-bb65-b64dfb008873', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.158 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[67c62f60-cfff-4536-b228-c3fa62d4afe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.249 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0690a66d-dd06-4891-be62-4acb29ce1d00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.250 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap220179ff-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.251 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.251 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap220179ff-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:10 compute-1 NetworkManager[54952]: <info>  [1769040490.2542] manager: (tap220179ff-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Jan 22 00:08:10 compute-1 nova_compute[182713]: 2026-01-22 00:08:10.253 182717 DEBUG nova.compute.manager [req-bfc5198e-99bf-463d-a459-41cf251f85f2 req-986cd1d9-dc69-4b7d-842a-9b18f829489e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Received event network-vif-plugged-867caf1d-7131-4085-9939-8b72e2f5d448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:10 compute-1 kernel: tap220179ff-10: entered promiscuous mode
Jan 22 00:08:10 compute-1 nova_compute[182713]: 2026-01-22 00:08:10.255 182717 DEBUG oslo_concurrency.lockutils [req-bfc5198e-99bf-463d-a459-41cf251f85f2 req-986cd1d9-dc69-4b7d-842a-9b18f829489e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:10 compute-1 nova_compute[182713]: 2026-01-22 00:08:10.255 182717 DEBUG oslo_concurrency.lockutils [req-bfc5198e-99bf-463d-a459-41cf251f85f2 req-986cd1d9-dc69-4b7d-842a-9b18f829489e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:10 compute-1 nova_compute[182713]: 2026-01-22 00:08:10.256 182717 DEBUG oslo_concurrency.lockutils [req-bfc5198e-99bf-463d-a459-41cf251f85f2 req-986cd1d9-dc69-4b7d-842a-9b18f829489e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.256 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap220179ff-10, col_values=(('external_ids', {'iface-id': '0c6af779-530d-46c2-8509-0d71d9db29bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:10 compute-1 nova_compute[182713]: 2026-01-22 00:08:10.257 182717 DEBUG nova.compute.manager [req-bfc5198e-99bf-463d-a459-41cf251f85f2 req-986cd1d9-dc69-4b7d-842a-9b18f829489e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Processing event network-vif-plugged-867caf1d-7131-4085-9939-8b72e2f5d448 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:08:10 compute-1 ovn_controller[94841]: 2026-01-22T00:08:10Z|00413|binding|INFO|Releasing lport 0c6af779-530d-46c2-8509-0d71d9db29bf from this chassis (sb_readonly=0)
Jan 22 00:08:10 compute-1 nova_compute[182713]: 2026-01-22 00:08:10.258 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.261 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/220179ff-1200-4907-bb65-b64dfb008873.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/220179ff-1200-4907-bb65-b64dfb008873.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.262 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fc825c3f-7172-4b23-af3c-d5f9ffb675c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.263 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-220179ff-1200-4907-bb65-b64dfb008873
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/220179ff-1200-4907-bb65-b64dfb008873.pid.haproxy
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 220179ff-1200-4907-bb65-b64dfb008873
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:08:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:10.263 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-220179ff-1200-4907-bb65-b64dfb008873', 'env', 'PROCESS_TAG=haproxy-220179ff-1200-4907-bb65-b64dfb008873', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/220179ff-1200-4907-bb65-b64dfb008873.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:08:10 compute-1 nova_compute[182713]: 2026-01-22 00:08:10.275 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:10 compute-1 podman[227362]: 2026-01-22 00:08:10.684887994 +0000 UTC m=+0.070643886 container create cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 00:08:10 compute-1 systemd[1]: Started libpod-conmon-cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765.scope.
Jan 22 00:08:10 compute-1 podman[227362]: 2026-01-22 00:08:10.658151462 +0000 UTC m=+0.043907334 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:08:10 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:08:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fa31bf811e2149135442f6a0609e67077e939b499106318f1d41cb668c73e75/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:08:10 compute-1 podman[227362]: 2026-01-22 00:08:10.803012078 +0000 UTC m=+0.188768020 container init cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:08:10 compute-1 podman[227362]: 2026-01-22 00:08:10.812961561 +0000 UTC m=+0.198717443 container start cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:08:10 compute-1 neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873[227378]: [NOTICE]   (227382) : New worker (227384) forked
Jan 22 00:08:10 compute-1 neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873[227378]: [NOTICE]   (227382) : Loading success.
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.526 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.811 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:11.812 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:08:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:11.813 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.829 182717 DEBUG nova.compute.manager [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.830 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040491.83004, c8d10cc4-7d81-41cd-8ccd-393ca4932d92 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.831 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] VM Started (Lifecycle Event)
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.836 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.840 182717 INFO nova.virt.libvirt.driver [-] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Instance spawned successfully.
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.841 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.870 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.878 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.883 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.883 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.884 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.885 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.885 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.886 182717 DEBUG nova.virt.libvirt.driver [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.909 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.909 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040491.8302915, c8d10cc4-7d81-41cd-8ccd-393ca4932d92 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.910 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] VM Paused (Lifecycle Event)
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.939 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.943 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040491.8344429, c8d10cc4-7d81-41cd-8ccd-393ca4932d92 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.944 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] VM Resumed (Lifecycle Event)
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.965 182717 INFO nova.compute.manager [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Took 8.93 seconds to spawn the instance on the hypervisor.
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.965 182717 DEBUG nova.compute.manager [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.968 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:11 compute-1 nova_compute[182713]: 2026-01-22 00:08:11.974 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:08:12 compute-1 nova_compute[182713]: 2026-01-22 00:08:12.039 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:08:12 compute-1 nova_compute[182713]: 2026-01-22 00:08:12.103 182717 INFO nova.compute.manager [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Took 9.71 seconds to build instance.
Jan 22 00:08:12 compute-1 nova_compute[182713]: 2026-01-22 00:08:12.127 182717 DEBUG oslo_concurrency.lockutils [None req-120fe374-dc82-48bb-aea5-8d624e4d515a 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.853s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:12 compute-1 nova_compute[182713]: 2026-01-22 00:08:12.346 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:12 compute-1 nova_compute[182713]: 2026-01-22 00:08:12.932 182717 DEBUG nova.compute.manager [req-bbbf2ff5-d549-4590-a715-1f51b1ce55f6 req-e7cd7b59-aa0f-4a58-8686-0d1360e4d3d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Received event network-vif-plugged-867caf1d-7131-4085-9939-8b72e2f5d448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:12 compute-1 nova_compute[182713]: 2026-01-22 00:08:12.934 182717 DEBUG oslo_concurrency.lockutils [req-bbbf2ff5-d549-4590-a715-1f51b1ce55f6 req-e7cd7b59-aa0f-4a58-8686-0d1360e4d3d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:12 compute-1 nova_compute[182713]: 2026-01-22 00:08:12.935 182717 DEBUG oslo_concurrency.lockutils [req-bbbf2ff5-d549-4590-a715-1f51b1ce55f6 req-e7cd7b59-aa0f-4a58-8686-0d1360e4d3d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:12 compute-1 nova_compute[182713]: 2026-01-22 00:08:12.935 182717 DEBUG oslo_concurrency.lockutils [req-bbbf2ff5-d549-4590-a715-1f51b1ce55f6 req-e7cd7b59-aa0f-4a58-8686-0d1360e4d3d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:12 compute-1 nova_compute[182713]: 2026-01-22 00:08:12.936 182717 DEBUG nova.compute.manager [req-bbbf2ff5-d549-4590-a715-1f51b1ce55f6 req-e7cd7b59-aa0f-4a58-8686-0d1360e4d3d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] No waiting events found dispatching network-vif-plugged-867caf1d-7131-4085-9939-8b72e2f5d448 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:08:12 compute-1 nova_compute[182713]: 2026-01-22 00:08:12.936 182717 WARNING nova.compute.manager [req-bbbf2ff5-d549-4590-a715-1f51b1ce55f6 req-e7cd7b59-aa0f-4a58-8686-0d1360e4d3d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Received unexpected event network-vif-plugged-867caf1d-7131-4085-9939-8b72e2f5d448 for instance with vm_state active and task_state None.
Jan 22 00:08:13 compute-1 nova_compute[182713]: 2026-01-22 00:08:13.510 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:13 compute-1 podman[227400]: 2026-01-22 00:08:13.640576925 +0000 UTC m=+0.116208649 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter)
Jan 22 00:08:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:15.815 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:16 compute-1 nova_compute[182713]: 2026-01-22 00:08:16.061 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:17 compute-1 nova_compute[182713]: 2026-01-22 00:08:17.348 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.561 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.562 182717 DEBUG oslo_concurrency.lockutils [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Acquiring lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.563 182717 DEBUG oslo_concurrency.lockutils [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.563 182717 DEBUG oslo_concurrency.lockutils [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Acquiring lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.564 182717 DEBUG oslo_concurrency.lockutils [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.564 182717 DEBUG oslo_concurrency.lockutils [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.575 182717 INFO nova.compute.manager [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Terminating instance
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.583 182717 DEBUG nova.compute.manager [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:08:18 compute-1 kernel: tap867caf1d-71 (unregistering): left promiscuous mode
Jan 22 00:08:18 compute-1 NetworkManager[54952]: <info>  [1769040498.6032] device (tap867caf1d-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:08:18 compute-1 ovn_controller[94841]: 2026-01-22T00:08:18Z|00414|binding|INFO|Releasing lport 867caf1d-7131-4085-9939-8b72e2f5d448 from this chassis (sb_readonly=0)
Jan 22 00:08:18 compute-1 ovn_controller[94841]: 2026-01-22T00:08:18Z|00415|binding|INFO|Setting lport 867caf1d-7131-4085-9939-8b72e2f5d448 down in Southbound
Jan 22 00:08:18 compute-1 ovn_controller[94841]: 2026-01-22T00:08:18Z|00416|binding|INFO|Removing iface tap867caf1d-71 ovn-installed in OVS
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.655 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.665 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:18.670 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:92:09 10.100.0.3'], port_security=['fa:16:3e:ce:92:09 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c8d10cc4-7d81-41cd-8ccd-393ca4932d92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-220179ff-1200-4907-bb65-b64dfb008873', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '356ef6d1220340c1a290e92a77db430d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '03a68014-fe35-445e-b433-812230af64d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e13595a0-4fe4-42e9-9972-52aeaf08e79a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=867caf1d-7131-4085-9939-8b72e2f5d448) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:08:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:18.673 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 867caf1d-7131-4085-9939-8b72e2f5d448 in datapath 220179ff-1200-4907-bb65-b64dfb008873 unbound from our chassis
Jan 22 00:08:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:18.677 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 220179ff-1200-4907-bb65-b64dfb008873, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:08:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:18.680 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fb817750-c4d3-4ba6-847e-9d2550b1fa06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:18.681 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-220179ff-1200-4907-bb65-b64dfb008873 namespace which is not needed anymore
Jan 22 00:08:18 compute-1 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Jan 22 00:08:18 compute-1 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006a.scope: Consumed 8.788s CPU time.
Jan 22 00:08:18 compute-1 systemd-machined[153970]: Machine qemu-46-instance-0000006a terminated.
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.834 182717 INFO nova.virt.libvirt.driver [-] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Instance destroyed successfully.
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.834 182717 DEBUG nova.objects.instance [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lazy-loading 'resources' on Instance uuid c8d10cc4-7d81-41cd-8ccd-393ca4932d92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:18 compute-1 neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873[227378]: [NOTICE]   (227382) : haproxy version is 2.8.14-c23fe91
Jan 22 00:08:18 compute-1 neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873[227378]: [NOTICE]   (227382) : path to executable is /usr/sbin/haproxy
Jan 22 00:08:18 compute-1 neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873[227378]: [WARNING]  (227382) : Exiting Master process...
Jan 22 00:08:18 compute-1 neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873[227378]: [ALERT]    (227382) : Current worker (227384) exited with code 143 (Terminated)
Jan 22 00:08:18 compute-1 neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873[227378]: [WARNING]  (227382) : All workers exited. Exiting... (0)
Jan 22 00:08:18 compute-1 systemd[1]: libpod-cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765.scope: Deactivated successfully.
Jan 22 00:08:18 compute-1 podman[227444]: 2026-01-22 00:08:18.8499704 +0000 UTC m=+0.051997059 container died cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:08:18 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765-userdata-shm.mount: Deactivated successfully.
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.902 182717 DEBUG nova.compute.manager [req-a73701fe-f8c8-4855-92e1-a8cbcfd7d932 req-551370a7-88f5-45ab-94e1-d913820304aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Received event network-vif-unplugged-867caf1d-7131-4085-9939-8b72e2f5d448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.903 182717 DEBUG oslo_concurrency.lockutils [req-a73701fe-f8c8-4855-92e1-a8cbcfd7d932 req-551370a7-88f5-45ab-94e1-d913820304aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.903 182717 DEBUG oslo_concurrency.lockutils [req-a73701fe-f8c8-4855-92e1-a8cbcfd7d932 req-551370a7-88f5-45ab-94e1-d913820304aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.904 182717 DEBUG oslo_concurrency.lockutils [req-a73701fe-f8c8-4855-92e1-a8cbcfd7d932 req-551370a7-88f5-45ab-94e1-d913820304aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.905 182717 DEBUG nova.compute.manager [req-a73701fe-f8c8-4855-92e1-a8cbcfd7d932 req-551370a7-88f5-45ab-94e1-d913820304aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] No waiting events found dispatching network-vif-unplugged-867caf1d-7131-4085-9939-8b72e2f5d448 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.905 182717 DEBUG nova.compute.manager [req-a73701fe-f8c8-4855-92e1-a8cbcfd7d932 req-551370a7-88f5-45ab-94e1-d913820304aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Received event network-vif-unplugged-867caf1d-7131-4085-9939-8b72e2f5d448 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:08:18 compute-1 systemd[1]: var-lib-containers-storage-overlay-4fa31bf811e2149135442f6a0609e67077e939b499106318f1d41cb668c73e75-merged.mount: Deactivated successfully.
Jan 22 00:08:18 compute-1 podman[227444]: 2026-01-22 00:08:18.915706026 +0000 UTC m=+0.117732685 container cleanup cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 00:08:18 compute-1 systemd[1]: libpod-conmon-cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765.scope: Deactivated successfully.
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.923 182717 DEBUG nova.virt.libvirt.vif [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:08:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1209954763',display_name='tempest-ServerMetadataTestJSON-server-1209954763',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1209954763',id=106,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:08:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={key1='alt1',key2='value2',key3='value3'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='356ef6d1220340c1a290e92a77db430d',ramdisk_id='',reservation_id='r-hbm03n0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataTestJSON-1172998385',owner_user_name='tempest-ServerMetadataTestJSON-1172998385-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:08:17Z,user_data=None,user_id='4ec14075ca8f491cb7e481e95011074a',uuid=c8d10cc4-7d81-41cd-8ccd-393ca4932d92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "867caf1d-7131-4085-9939-8b72e2f5d448", "address": "fa:16:3e:ce:92:09", "network": {"id": "220179ff-1200-4907-bb65-b64dfb008873", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1717445310-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "356ef6d1220340c1a290e92a77db430d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap867caf1d-71", "ovs_interfaceid": "867caf1d-7131-4085-9939-8b72e2f5d448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.924 182717 DEBUG nova.network.os_vif_util [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Converting VIF {"id": "867caf1d-7131-4085-9939-8b72e2f5d448", "address": "fa:16:3e:ce:92:09", "network": {"id": "220179ff-1200-4907-bb65-b64dfb008873", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1717445310-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "356ef6d1220340c1a290e92a77db430d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap867caf1d-71", "ovs_interfaceid": "867caf1d-7131-4085-9939-8b72e2f5d448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.925 182717 DEBUG nova.network.os_vif_util [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:92:09,bridge_name='br-int',has_traffic_filtering=True,id=867caf1d-7131-4085-9939-8b72e2f5d448,network=Network(220179ff-1200-4907-bb65-b64dfb008873),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap867caf1d-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.925 182717 DEBUG os_vif [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:92:09,bridge_name='br-int',has_traffic_filtering=True,id=867caf1d-7131-4085-9939-8b72e2f5d448,network=Network(220179ff-1200-4907-bb65-b64dfb008873),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap867caf1d-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.927 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.928 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap867caf1d-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.931 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.931 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.935 182717 INFO os_vif [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:92:09,bridge_name='br-int',has_traffic_filtering=True,id=867caf1d-7131-4085-9939-8b72e2f5d448,network=Network(220179ff-1200-4907-bb65-b64dfb008873),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap867caf1d-71')
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.936 182717 INFO nova.virt.libvirt.driver [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Deleting instance files /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92_del
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.936 182717 INFO nova.virt.libvirt.driver [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Deletion of /var/lib/nova/instances/c8d10cc4-7d81-41cd-8ccd-393ca4932d92_del complete
Jan 22 00:08:18 compute-1 podman[227492]: 2026-01-22 00:08:18.983617106 +0000 UTC m=+0.042037687 container remove cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:08:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:18.990 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7c863e8f-4890-4546-92cf-26ec0d45bed2]: (4, ('Thu Jan 22 12:08:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873 (cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765)\ncb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765\nThu Jan 22 12:08:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-220179ff-1200-4907-bb65-b64dfb008873 (cb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765)\ncb36c7b997e9483f1b89391d3d18c5a7a2f904f7bdf64fe0ba1f9bd911292765\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:18.992 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f49e9062-f9ce-4f5c-831e-3717080cc5e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:18.993 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap220179ff-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.994 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:18 compute-1 kernel: tap220179ff-10: left promiscuous mode
Jan 22 00:08:18 compute-1 nova_compute[182713]: 2026-01-22 00:08:18.998 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:19.000 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[33c5a86d-5e0e-4060-a262-17be18cdb420]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:19 compute-1 nova_compute[182713]: 2026-01-22 00:08:19.016 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:19.026 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f27ac2ab-26ef-4a45-bdc3-7c7a604b219b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:19.027 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c794ceeb-ef3a-4810-964d-1d31e8a0f339]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:19 compute-1 nova_compute[182713]: 2026-01-22 00:08:19.028 182717 INFO nova.compute.manager [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Took 0.44 seconds to destroy the instance on the hypervisor.
Jan 22 00:08:19 compute-1 nova_compute[182713]: 2026-01-22 00:08:19.028 182717 DEBUG oslo.service.loopingcall [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:08:19 compute-1 nova_compute[182713]: 2026-01-22 00:08:19.029 182717 DEBUG nova.compute.manager [-] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:08:19 compute-1 nova_compute[182713]: 2026-01-22 00:08:19.029 182717 DEBUG nova.network.neutron [-] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:08:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:19.046 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[591918fa-3270-4b11-bc14-147d53dc4946]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506260, 'reachable_time': 16359, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227507, 'error': None, 'target': 'ovnmeta-220179ff-1200-4907-bb65-b64dfb008873', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:19.050 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-220179ff-1200-4907-bb65-b64dfb008873 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:08:19 compute-1 systemd[1]: run-netns-ovnmeta\x2d220179ff\x2d1200\x2d4907\x2dbb65\x2db64dfb008873.mount: Deactivated successfully.
Jan 22 00:08:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:19.051 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[16570b55-f25b-4f48-94cb-40563a081505]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:20 compute-1 nova_compute[182713]: 2026-01-22 00:08:20.660 182717 DEBUG nova.network.neutron [-] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:20 compute-1 nova_compute[182713]: 2026-01-22 00:08:20.694 182717 INFO nova.compute.manager [-] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Took 1.66 seconds to deallocate network for instance.
Jan 22 00:08:20 compute-1 nova_compute[182713]: 2026-01-22 00:08:20.777 182717 DEBUG nova.compute.manager [req-a9e7ab66-5452-4f6f-8048-84ae0b1db75d req-fd932843-f115-481b-89aa-ddc9659571d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Received event network-vif-deleted-867caf1d-7131-4085-9939-8b72e2f5d448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:20 compute-1 nova_compute[182713]: 2026-01-22 00:08:20.792 182717 DEBUG oslo_concurrency.lockutils [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:20 compute-1 nova_compute[182713]: 2026-01-22 00:08:20.793 182717 DEBUG oslo_concurrency.lockutils [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:20 compute-1 nova_compute[182713]: 2026-01-22 00:08:20.903 182717 DEBUG nova.compute.provider_tree [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:08:20 compute-1 nova_compute[182713]: 2026-01-22 00:08:20.928 182717 DEBUG nova.scheduler.client.report [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:08:20 compute-1 nova_compute[182713]: 2026-01-22 00:08:20.960 182717 DEBUG oslo_concurrency.lockutils [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:21 compute-1 nova_compute[182713]: 2026-01-22 00:08:21.007 182717 INFO nova.scheduler.client.report [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Deleted allocations for instance c8d10cc4-7d81-41cd-8ccd-393ca4932d92
Jan 22 00:08:21 compute-1 nova_compute[182713]: 2026-01-22 00:08:21.024 182717 DEBUG nova.compute.manager [req-56c0ec40-c9df-4c6e-837d-1336a3b9c022 req-53ba06a2-aa6f-4a0f-bc33-522cb40e7f14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Received event network-vif-plugged-867caf1d-7131-4085-9939-8b72e2f5d448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:21 compute-1 nova_compute[182713]: 2026-01-22 00:08:21.024 182717 DEBUG oslo_concurrency.lockutils [req-56c0ec40-c9df-4c6e-837d-1336a3b9c022 req-53ba06a2-aa6f-4a0f-bc33-522cb40e7f14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:21 compute-1 nova_compute[182713]: 2026-01-22 00:08:21.024 182717 DEBUG oslo_concurrency.lockutils [req-56c0ec40-c9df-4c6e-837d-1336a3b9c022 req-53ba06a2-aa6f-4a0f-bc33-522cb40e7f14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:21 compute-1 nova_compute[182713]: 2026-01-22 00:08:21.025 182717 DEBUG oslo_concurrency.lockutils [req-56c0ec40-c9df-4c6e-837d-1336a3b9c022 req-53ba06a2-aa6f-4a0f-bc33-522cb40e7f14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:21 compute-1 nova_compute[182713]: 2026-01-22 00:08:21.025 182717 DEBUG nova.compute.manager [req-56c0ec40-c9df-4c6e-837d-1336a3b9c022 req-53ba06a2-aa6f-4a0f-bc33-522cb40e7f14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] No waiting events found dispatching network-vif-plugged-867caf1d-7131-4085-9939-8b72e2f5d448 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:08:21 compute-1 nova_compute[182713]: 2026-01-22 00:08:21.025 182717 WARNING nova.compute.manager [req-56c0ec40-c9df-4c6e-837d-1336a3b9c022 req-53ba06a2-aa6f-4a0f-bc33-522cb40e7f14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Received unexpected event network-vif-plugged-867caf1d-7131-4085-9939-8b72e2f5d448 for instance with vm_state deleted and task_state None.
Jan 22 00:08:21 compute-1 nova_compute[182713]: 2026-01-22 00:08:21.108 182717 DEBUG oslo_concurrency.lockutils [None req-ce37c11e-50eb-4978-aaa9-e7ed14e09341 4ec14075ca8f491cb7e481e95011074a 356ef6d1220340c1a290e92a77db430d - - default default] Lock "c8d10cc4-7d81-41cd-8ccd-393ca4932d92" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.545s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:22 compute-1 nova_compute[182713]: 2026-01-22 00:08:22.380 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.879 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '21970bbd-36b4-495d-8819-49ef2276a912', 'name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000060', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'hostId': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.881 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.899 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.900 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '305cbb9e-9ddf-45ac-b850-1ff31fcd92ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-vda', 'timestamp': '2026-01-22T00:08:22.881781', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7730fbc4-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.589166146, 'message_signature': 'dbafe7b1ceb11c6016c86fa8f2d2a6463bb2e8ffe3856ee3f00002d0478ff9f3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-sda', 'timestamp': '2026-01-22T00:08:22.881781', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77310b82-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.589166146, 'message_signature': '6db23a93070f20fd7883f0ca808c2344ed1ab8384a0973f51b41f2941e1033a3'}]}, 'timestamp': '2026-01-22 00:08:22.900794', '_unique_id': '65b8e4674bd04d9d906b4079b1bcda15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.902 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.904 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.905 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 21970bbd-36b4-495d-8819-49ef2276a912 / tap06125da7-7a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a63d3b4-e1c0-475e-b7f4-34362d843094', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000060-21970bbd-36b4-495d-8819-49ef2276a912-tap06125da7-7a', 'timestamp': '2026-01-22T00:08:22.904169', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'tap06125da7-7a', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:99:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06125da7-7a'}, 'message_id': '7731e50c-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.611393721, 'message_signature': 'cfb0b31f73105e556b57e930c8dc32bf42fde80e348ecedf12ac4b203af83e12'}]}, 'timestamp': '2026-01-22 00:08:22.906308', '_unique_id': '332172a0105740f4b6887113829ec0b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.906 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.907 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.938 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.write.bytes volume: 72998912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.938 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '890e5ac4-0232-4fae-8063-c122aa5dbf78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72998912, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-vda', 'timestamp': '2026-01-22T00:08:22.907482', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7736d788-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': 'af0ea5a9b74f8406d5a6cb63029ed580740a6c918b3dc090346a43af56f2983b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-sda', 'timestamp': '2026-01-22T00:08:22.907482', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7736e20a-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': 'a0574607185ccdf2e2656c7770a1aacb4b8323431c33aa6c59be7169faea1c2f'}]}, 'timestamp': '2026-01-22 00:08:22.939018', '_unique_id': '74d6e2662e0848a299fcd25e668f0f4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.939 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.940 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.940 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.940 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1249936959>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1249936959>]
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.940 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.941 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.941 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fbe73ee-e80a-4d70-9b97-4373a0e293ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-vda', 'timestamp': '2026-01-22T00:08:22.941037', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77373c28-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.589166146, 'message_signature': '49422a95c08b3216ea9136d13999ae3502bc3e41db8e11b8f0520e73d692fad6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-sda', 'timestamp': '2026-01-22T00:08:22.941037', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77374696-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.589166146, 'message_signature': '745b2d523b875075645a9075b2a6d5e580a01a7e99fcdc65486a5c3ed9b34bb1'}]}, 'timestamp': '2026-01-22 00:08:22.941569', '_unique_id': 'f45ddf66392b478c9cf8ecd9e953b8dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.942 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '072f2210-531c-4c15-b836-256b4a1b8025', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000060-21970bbd-36b4-495d-8819-49ef2276a912-tap06125da7-7a', 'timestamp': '2026-01-22T00:08:22.942762', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'tap06125da7-7a', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:99:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06125da7-7a'}, 'message_id': '77377f8a-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.611393721, 'message_signature': '5496f2a7b4cb6b472d41b48535cb78e434253285efa1b86948cfd0a358b7d8c9'}]}, 'timestamp': '2026-01-22 00:08:22.943023', '_unique_id': '52c08b4010c7419c90c1d53647f92466'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.943 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.944 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.read.bytes volume: 30517760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.944 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75554756-2a80-4ca8-860c-2a8f2d0c84fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30517760, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-vda', 'timestamp': '2026-01-22T00:08:22.944232', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7737b7ac-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': '3de52aeff22c6e787f2b656ec43acfb218bca99c772998ae1bc7b50830db369a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-sda', 'timestamp': '2026-01-22T00:08:22.944232', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7737bf72-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': '7f60305924fafdbcc1c075836ed45921c45885aeed7b4e3702dd61509e7bd5fc'}]}, 'timestamp': '2026-01-22 00:08:22.944643', '_unique_id': 'de2a4daa664741a1bae2b3cf9942672a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.945 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '994a3a71-46a1-4dae-841f-ea172c6f6da7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000060-21970bbd-36b4-495d-8819-49ef2276a912-tap06125da7-7a', 'timestamp': '2026-01-22T00:08:22.945891', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'tap06125da7-7a', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:99:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06125da7-7a'}, 'message_id': '7737f8d4-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.611393721, 'message_signature': 'c4d85829d173b60283b0e97ec5d70cd424bc87f32b1d942975bb21e7aa0a5773'}]}, 'timestamp': '2026-01-22 00:08:22.946149', '_unique_id': 'cd2269d3f0394151a4f8047c5c2335dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.946 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.947 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.947 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.read.requests volume: 1095 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.947 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31bb0b5c-44c6-434f-8d18-c7a179fd90bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1095, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-vda', 'timestamp': '2026-01-22T00:08:22.947245', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77382db8-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': '97f343421371cbff5a4898fa14f1032660930dd410dc385463fffe928ec06364'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-sda', 'timestamp': '2026-01-22T00:08:22.947245', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77383588-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': '4704f8ebad9facd3b1d87d7e78bad1a0619b827e83839836730d22a5db894736'}]}, 'timestamp': '2026-01-22 00:08:22.947668', '_unique_id': '08c2095c07e1489998823a74a8ca17e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.948 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ce3f3d8-ffc9-4c44-913c-27da145f27b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000060-21970bbd-36b4-495d-8819-49ef2276a912-tap06125da7-7a', 'timestamp': '2026-01-22T00:08:22.948732', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'tap06125da7-7a', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:99:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06125da7-7a'}, 'message_id': '7738680a-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.611393721, 'message_signature': '6d7756e0604391bcddad6a267448d46593dc24816bd9cc9a97fe15220465d6ea'}]}, 'timestamp': '2026-01-22 00:08:22.948975', '_unique_id': 'a301304bd41d45f6abd41666413762e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.949 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.950 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.968 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/cpu volume: 11860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64e99e32-05a1-4abf-a2ab-5a626f6d318d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11860000000, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'timestamp': '2026-01-22T00:08:22.950107', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '773b5c04-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.675114165, 'message_signature': 'a9d150fc36301fc5c1d8cbc5117746fe79fa7663fa6c2046840576ebf09bc926'}]}, 'timestamp': '2026-01-22 00:08:22.968351', '_unique_id': '5148b19b88d34beab7ce776b3b27070f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.969 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.read.latency volume: 236802500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.read.latency volume: 26551525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7da51dc5-bbed-478f-ac5e-62fad4a4d5d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 236802500, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-vda', 'timestamp': '2026-01-22T00:08:22.969898', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '773ba286-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': '556b871909a9ae3726420f151f06a37ac80d15508d9af242125fee323acbee30'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26551525, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': 
None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-sda', 'timestamp': '2026-01-22T00:08:22.969898', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '773babaa-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': 'de8e3444fcea2e3990f49d7e874603553c31e271406f8ce8dccf3aa872f2ded9'}]}, 'timestamp': '2026-01-22 00:08:22.970377', '_unique_id': 'd4afa4c45c7047b39df4d69f50b5236a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.970 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.971 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.971 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.write.requests volume: 324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.971 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d84641f-dd83-4317-b45f-8c47a092698b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 324, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-vda', 'timestamp': '2026-01-22T00:08:22.971630', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '773be6c4-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': '9315e3801d1481f051ada8280e1b8075c670b44623e55c52e6dc32ddf49e433d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': 
None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-sda', 'timestamp': '2026-01-22T00:08:22.971630', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '773bf11e-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': '8ce15ff62ddc4f9b6ff831a2a92fa97ff052ec8f93afd031de04fc7bbd208090'}]}, 'timestamp': '2026-01-22 00:08:22.972131', '_unique_id': '86d58de63599443d8bfe94c470624724'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.972 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e97adaef-7bc8-4a0d-9ffc-3e07bde1d1bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000060-21970bbd-36b4-495d-8819-49ef2276a912-tap06125da7-7a', 'timestamp': '2026-01-22T00:08:22.973196', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'tap06125da7-7a', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:99:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06125da7-7a'}, 'message_id': '773c2332-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.611393721, 'message_signature': 'a74cb32471d6ed1b75c33e9097ec6bb37a991e87a39950ab292abd84bc891ca3'}]}, 'timestamp': '2026-01-22 00:08:22.973425', '_unique_id': '429ca0b5062c4492aa760557eceba3b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.973 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.974 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.974 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10b6b027-2534-48b1-a339-5d8542518348', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000060-21970bbd-36b4-495d-8819-49ef2276a912-tap06125da7-7a', 'timestamp': '2026-01-22T00:08:22.974610', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'tap06125da7-7a', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:99:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06125da7-7a'}, 'message_id': '773c5a46-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.611393721, 'message_signature': '4f86b85e5f4de1c421b59dac086a62ee3e3613f8da860b149795e38ed4d0ee71'}]}, 'timestamp': '2026-01-22 00:08:22.974836', '_unique_id': '9f84446d84004067bf6867a514a261be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/network.incoming.bytes volume: 1514 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6ec3187-78fa-4b95-85c8-31dc85445d99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1514, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000060-21970bbd-36b4-495d-8819-49ef2276a912-tap06125da7-7a', 'timestamp': '2026-01-22T00:08:22.975977', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'tap06125da7-7a', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:99:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06125da7-7a'}, 'message_id': '773c90ec-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.611393721, 'message_signature': '8c9d44020aeebf49c0e58533bae9e5ce69ae58af790d3064f7450fa34b66384c'}]}, 'timestamp': '2026-01-22 00:08:22.976275', '_unique_id': 'e7b0a63669c54e3d9ff13c1949322830'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.976 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.977 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.977 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aee6f2bf-53ab-404c-a441-3a597abf02f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000060-21970bbd-36b4-495d-8819-49ef2276a912-tap06125da7-7a', 'timestamp': '2026-01-22T00:08:22.977590', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'tap06125da7-7a', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:99:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06125da7-7a'}, 'message_id': '773cceae-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.611393721, 'message_signature': '0ef418fd1e54ac5bcfd6fb82ba1446c52cdaa6ce2a49488028a391db44d6cd5e'}]}, 'timestamp': '2026-01-22 00:08:22.977840', '_unique_id': 'c95855c0728c4739816e3b275cd2908c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.978 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.979 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.979 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.979 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1249936959>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1249936959>]
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.979 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.979 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.979 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1249936959>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1249936959>]
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.979 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.979 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.980 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f1a1959-5ff2-4432-b1aa-9d6f75a1ab1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-vda', 'timestamp': '2026-01-22T00:08:22.979969', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '773d2d0e-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.589166146, 'message_signature': '499310257b690779b81bef62babbe6949dc6924d850a5b40cc1a328f1cd94323'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-sda', 'timestamp': '2026-01-22T00:08:22.979969', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '773d37c2-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.589166146, 'message_signature': '81248897bdb8ad9d0ef9b0bd7bdd91feb9392d8293e6f6b6423a7e1c5e17991b'}]}, 'timestamp': '2026-01-22 00:08:22.980532', '_unique_id': 'a477f4c7bc62408993835031b91113b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.981 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/memory.usage volume: 42.6796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdf82c1a-5579-47a6-ad78-9e6bc4a6f903', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.6796875, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'timestamp': '2026-01-22T00:08:22.982105', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '773d7f2a-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.675114165, 'message_signature': '2530ae2c01070119123fb2fc4e1e6d1f7f2e6f539845d0144a9b1b13b7360bee'}]}, 'timestamp': '2026-01-22 00:08:22.982326', '_unique_id': 'c14446ae86904055a198819b7e3337cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.982 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.983 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.983 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.write.latency volume: 1555823262 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.983 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9ccef64-58e0-4216-a732-2d9b0fb3bffd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1555823262, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-vda', 'timestamp': '2026-01-22T00:08:22.983555', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '773db904-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': '1233ac9c4729bfb8a98afaff19e7e955dc800ce7f442c8a18c91a88a10b9e260'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '21970bbd-36b4-495d-8819-49ef2276a912-sda', 'timestamp': '2026-01-22T00:08:22.983555', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'instance-00000060', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '773dc2b4-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.61467368, 'message_signature': '522d8d6803cff2b6c660ff6606ce51efda6d841f287a0d633aaf3a64748b0b90'}]}, 'timestamp': '2026-01-22 00:08:22.984050', '_unique_id': '58abca40f42242f3a09bf0e53ede93ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.984 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.985 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.985 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.985 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1249936959>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1249936959>]
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.985 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.985 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f21e837d-13dd-4f26-914e-7b85dad13995', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000060-21970bbd-36b4-495d-8819-49ef2276a912-tap06125da7-7a', 'timestamp': '2026-01-22T00:08:22.985408', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'tap06125da7-7a', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:99:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06125da7-7a'}, 'message_id': '773e0008-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.611393721, 'message_signature': '7d529386ef90135033a9a900e6d56ee09c3924773ddb45ad8b5e39ef58feffa0'}]}, 'timestamp': '2026-01-22 00:08:22.985633', '_unique_id': '7b8bb85e9e934b1694284364c4ca7b6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.986 12 DEBUG ceilometer.compute.pollsters [-] 21970bbd-36b4-495d-8819-49ef2276a912/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6283656a-bdc7-475d-a2ed-0a0a1e755141', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000060-21970bbd-36b4-495d-8819-49ef2276a912-tap06125da7-7a', 'timestamp': '2026-01-22T00:08:22.986703', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1249936959', 'name': 'tap06125da7-7a', 'instance_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'instance_type': 'm1.nano', 'host': '6538bf0928604205d3543d596640848620246511a60cfb5dc1884224', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:99:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06125da7-7a'}, 'message_id': '773e3294-f726-11f0-a0a4-fa163e934844', 'monotonic_time': 5075.611393721, 'message_signature': '67e6b06ada52352427cca0efce34cdbf6a9a5c2457749bec271f6290292d79f4'}]}, 'timestamp': '2026-01-22 00:08:22.986947', '_unique_id': '3767ff7c012e4c28866865fcb75ac6ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:08:22.987 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.144 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.145 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.166 182717 DEBUG nova.compute.manager [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.265 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.266 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.274 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.275 182717 INFO nova.compute.claims [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.451 182717 DEBUG nova.compute.provider_tree [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.471 182717 DEBUG nova.scheduler.client.report [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.492 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.493 182717 DEBUG nova.compute.manager [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.558 182717 DEBUG nova.compute.manager [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.559 182717 DEBUG nova.network.neutron [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.591 182717 INFO nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.609 182717 DEBUG nova.compute.manager [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.743 182717 DEBUG nova.compute.manager [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.745 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.745 182717 INFO nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Creating image(s)
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.746 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.747 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.748 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.775 182717 DEBUG nova.policy [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.779 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.878 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.880 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.881 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.906 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.931 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.982 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:23 compute-1 nova_compute[182713]: 2026-01-22 00:08:23.983 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.022 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.023 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.024 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.087 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.088 182717 DEBUG nova.virt.disk.api [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.088 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.162 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.163 182717 DEBUG nova.virt.disk.api [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.164 182717 DEBUG nova.objects.instance [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 6bdc91c7-e39d-4d01-9496-49ceb58c3389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.182 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.183 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Ensure instance console log exists: /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.184 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.184 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.184 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:24 compute-1 nova_compute[182713]: 2026-01-22 00:08:24.593 182717 DEBUG nova.network.neutron [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Successfully created port: 65b75da3-429b-440e-aab6-df04e1794c70 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:08:24 compute-1 podman[227524]: 2026-01-22 00:08:24.609163062 +0000 UTC m=+0.096130749 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:08:24 compute-1 podman[227523]: 2026-01-22 00:08:24.695517613 +0000 UTC m=+0.183234942 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller)
Jan 22 00:08:25 compute-1 nova_compute[182713]: 2026-01-22 00:08:25.686 182717 DEBUG nova.network.neutron [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Successfully updated port: 65b75da3-429b-440e-aab6-df04e1794c70 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:08:25 compute-1 nova_compute[182713]: 2026-01-22 00:08:25.711 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:08:25 compute-1 nova_compute[182713]: 2026-01-22 00:08:25.711 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:08:25 compute-1 nova_compute[182713]: 2026-01-22 00:08:25.712 182717 DEBUG nova.network.neutron [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:08:25 compute-1 nova_compute[182713]: 2026-01-22 00:08:25.830 182717 DEBUG nova.compute.manager [req-e7c9a580-18fd-4c51-8a4c-6a1905dcd8e3 req-da15f95b-2437-4987-a99a-43003396e3a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-changed-65b75da3-429b-440e-aab6-df04e1794c70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:25 compute-1 nova_compute[182713]: 2026-01-22 00:08:25.831 182717 DEBUG nova.compute.manager [req-e7c9a580-18fd-4c51-8a4c-6a1905dcd8e3 req-da15f95b-2437-4987-a99a-43003396e3a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Refreshing instance network info cache due to event network-changed-65b75da3-429b-440e-aab6-df04e1794c70. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:08:25 compute-1 nova_compute[182713]: 2026-01-22 00:08:25.831 182717 DEBUG oslo_concurrency.lockutils [req-e7c9a580-18fd-4c51-8a4c-6a1905dcd8e3 req-da15f95b-2437-4987-a99a-43003396e3a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:08:25 compute-1 nova_compute[182713]: 2026-01-22 00:08:25.966 182717 DEBUG nova.network.neutron [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.383 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:27 compute-1 ovn_controller[94841]: 2026-01-22T00:08:27Z|00417|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.888 182717 DEBUG nova.network.neutron [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updating instance_info_cache with network_info: [{"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.931 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.931 182717 DEBUG nova.compute.manager [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Instance network_info: |[{"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.931 182717 DEBUG oslo_concurrency.lockutils [req-e7c9a580-18fd-4c51-8a4c-6a1905dcd8e3 req-da15f95b-2437-4987-a99a-43003396e3a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.932 182717 DEBUG nova.network.neutron [req-e7c9a580-18fd-4c51-8a4c-6a1905dcd8e3 req-da15f95b-2437-4987-a99a-43003396e3a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Refreshing network info cache for port 65b75da3-429b-440e-aab6-df04e1794c70 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.935 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Start _get_guest_xml network_info=[{"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.939 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.941 182717 WARNING nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.960 182717 DEBUG nova.virt.libvirt.host [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.961 182717 DEBUG nova.virt.libvirt.host [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.965 182717 DEBUG nova.virt.libvirt.host [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.965 182717 DEBUG nova.virt.libvirt.host [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.966 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.966 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.967 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.967 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.967 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.967 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.967 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.967 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.968 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.968 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.968 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.968 182717 DEBUG nova.virt.hardware [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.971 182717 DEBUG nova.virt.libvirt.vif [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:08:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-188093305',display_name='tempest-TestNetworkBasicOps-server-188093305',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-188093305',id=108,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHkFZnEq7NNmxdySRV5/1EgpWeAUvw5OFcMLn6aXM7GCY8P0GmDvb4iJ7gG7Emvq6PckO0f9TjYb3zc1/HZhYzhCTu/vnvxNF9lXAP/apwH44by79L436dIVo4nFymoqQ==',key_name='tempest-TestNetworkBasicOps-441082338',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-e07wfkf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:23Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6bdc91c7-e39d-4d01-9496-49ceb58c3389,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.971 182717 DEBUG nova.network.os_vif_util [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.972 182717 DEBUG nova.network.os_vif_util [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:2b:de,bridge_name='br-int',has_traffic_filtering=True,id=65b75da3-429b-440e-aab6-df04e1794c70,network=Network(d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65b75da3-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:08:27 compute-1 nova_compute[182713]: 2026-01-22 00:08:27.973 182717 DEBUG nova.objects.instance [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 6bdc91c7-e39d-4d01-9496-49ceb58c3389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.001 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:08:28 compute-1 nova_compute[182713]:   <uuid>6bdc91c7-e39d-4d01-9496-49ceb58c3389</uuid>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   <name>instance-0000006c</name>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <nova:name>tempest-TestNetworkBasicOps-server-188093305</nova:name>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:08:27</nova:creationTime>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:08:28 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:08:28 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:08:28 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:08:28 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:08:28 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:08:28 compute-1 nova_compute[182713]:         <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:08:28 compute-1 nova_compute[182713]:         <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:08:28 compute-1 nova_compute[182713]:         <nova:port uuid="65b75da3-429b-440e-aab6-df04e1794c70">
Jan 22 00:08:28 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <system>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <entry name="serial">6bdc91c7-e39d-4d01-9496-49ceb58c3389</entry>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <entry name="uuid">6bdc91c7-e39d-4d01-9496-49ceb58c3389</entry>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     </system>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   <os>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   </os>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   <features>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   </features>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk.config"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:e0:2b:de"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <target dev="tap65b75da3-42"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/console.log" append="off"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <video>
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     </video>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:08:28 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:08:28 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:08:28 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:08:28 compute-1 nova_compute[182713]: </domain>
Jan 22 00:08:28 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.001 182717 DEBUG nova.compute.manager [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Preparing to wait for external event network-vif-plugged-65b75da3-429b-440e-aab6-df04e1794c70 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.001 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.001 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.002 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.002 182717 DEBUG nova.virt.libvirt.vif [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:08:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-188093305',display_name='tempest-TestNetworkBasicOps-server-188093305',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-188093305',id=108,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHkFZnEq7NNmxdySRV5/1EgpWeAUvw5OFcMLn6aXM7GCY8P0GmDvb4iJ7gG7Emvq6PckO0f9TjYb3zc1/HZhYzhCTu/vnvxNF9lXAP/apwH44by79L436dIVo4nFymoqQ==',key_name='tempest-TestNetworkBasicOps-441082338',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-e07wfkf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:23Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6bdc91c7-e39d-4d01-9496-49ceb58c3389,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.002 182717 DEBUG nova.network.os_vif_util [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.003 182717 DEBUG nova.network.os_vif_util [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:2b:de,bridge_name='br-int',has_traffic_filtering=True,id=65b75da3-429b-440e-aab6-df04e1794c70,network=Network(d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65b75da3-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.004 182717 DEBUG os_vif [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:2b:de,bridge_name='br-int',has_traffic_filtering=True,id=65b75da3-429b-440e-aab6-df04e1794c70,network=Network(d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65b75da3-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.004 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.004 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.004 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.006 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.006 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65b75da3-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.007 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap65b75da3-42, col_values=(('external_ids', {'iface-id': '65b75da3-429b-440e-aab6-df04e1794c70', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:2b:de', 'vm-uuid': '6bdc91c7-e39d-4d01-9496-49ceb58c3389'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.008 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.010 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:08:28 compute-1 NetworkManager[54952]: <info>  [1769040508.0115] manager: (tap65b75da3-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/201)
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.016 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.017 182717 INFO os_vif [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:2b:de,bridge_name='br-int',has_traffic_filtering=True,id=65b75da3-429b-440e-aab6-df04e1794c70,network=Network(d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65b75da3-42')
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.130 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.130 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.131 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:e0:2b:de, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:08:28 compute-1 nova_compute[182713]: 2026-01-22 00:08:28.131 182717 INFO nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Using config drive
Jan 22 00:08:29 compute-1 podman[227578]: 2026-01-22 00:08:29.592665601 +0000 UTC m=+0.088195818 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:08:29 compute-1 podman[227577]: 2026-01-22 00:08:29.599267821 +0000 UTC m=+0.089643841 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:08:30 compute-1 nova_compute[182713]: 2026-01-22 00:08:30.078 182717 INFO nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Creating config drive at /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk.config
Jan 22 00:08:30 compute-1 nova_compute[182713]: 2026-01-22 00:08:30.083 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7mbql87b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:30 compute-1 nova_compute[182713]: 2026-01-22 00:08:30.261 182717 DEBUG oslo_concurrency.processutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7mbql87b" returned: 0 in 0.178s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:30 compute-1 kernel: tap65b75da3-42: entered promiscuous mode
Jan 22 00:08:30 compute-1 NetworkManager[54952]: <info>  [1769040510.6620] manager: (tap65b75da3-42): new Tun device (/org/freedesktop/NetworkManager/Devices/202)
Jan 22 00:08:30 compute-1 ovn_controller[94841]: 2026-01-22T00:08:30Z|00418|binding|INFO|Claiming lport 65b75da3-429b-440e-aab6-df04e1794c70 for this chassis.
Jan 22 00:08:30 compute-1 ovn_controller[94841]: 2026-01-22T00:08:30Z|00419|binding|INFO|65b75da3-429b-440e-aab6-df04e1794c70: Claiming fa:16:3e:e0:2b:de 10.100.0.14
Jan 22 00:08:30 compute-1 nova_compute[182713]: 2026-01-22 00:08:30.664 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.676 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:2b:de 10.100.0.14'], port_security=['fa:16:3e:e0:2b:de 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6bdc91c7-e39d-4d01-9496-49ceb58c3389', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'eb851343-7311-438f-b11b-0c377f259981', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b46e1711-352b-4fc9-8a7d-de660270a1f2, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=65b75da3-429b-440e-aab6-df04e1794c70) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.677 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 65b75da3-429b-440e-aab6-df04e1794c70 in datapath d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb bound to our chassis
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.679 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb
Jan 22 00:08:30 compute-1 ovn_controller[94841]: 2026-01-22T00:08:30Z|00420|binding|INFO|Setting lport 65b75da3-429b-440e-aab6-df04e1794c70 ovn-installed in OVS
Jan 22 00:08:30 compute-1 ovn_controller[94841]: 2026-01-22T00:08:30Z|00421|binding|INFO|Setting lport 65b75da3-429b-440e-aab6-df04e1794c70 up in Southbound
Jan 22 00:08:30 compute-1 nova_compute[182713]: 2026-01-22 00:08:30.684 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:30 compute-1 nova_compute[182713]: 2026-01-22 00:08:30.686 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.694 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f2fc7ab7-1c67-4bc1-b3fe-38b972c2d2db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.696 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd9f4afa6-31 in ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:08:30 compute-1 systemd-udevd[227636]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.700 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd9f4afa6-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.700 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7ad1c0ef-110e-4a11-9508-7cde1b971799]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.702 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[52516c53-d370-4b66-80f6-30e50499a7d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 NetworkManager[54952]: <info>  [1769040510.7143] device (tap65b75da3-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.714 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[17bddbd4-55be-42cd-a51a-83e9547b69b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 NetworkManager[54952]: <info>  [1769040510.7158] device (tap65b75da3-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:08:30 compute-1 systemd-machined[153970]: New machine qemu-47-instance-0000006c.
Jan 22 00:08:30 compute-1 systemd[1]: Started Virtual Machine qemu-47-instance-0000006c.
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.739 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9e30603f-9bc3-4e7b-b708-63cdcd73dac0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.770 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[680bfd86-9efb-4006-b0c1-aab10112cb21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.779 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1a7628e7-2824-42b8-aab6-64e0fc7a01de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 systemd-udevd[227641]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:08:30 compute-1 NetworkManager[54952]: <info>  [1769040510.7805] manager: (tapd9f4afa6-30): new Veth device (/org/freedesktop/NetworkManager/Devices/203)
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.818 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ff43ab57-da96-4272-ab82-e39d2fd9ab1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.823 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[314fb02a-486e-48a3-b3e3-4aa2881f0b74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 NetworkManager[54952]: <info>  [1769040510.8452] device (tapd9f4afa6-30): carrier: link connected
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.850 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a0677431-eb92-419a-85ae-c7648cb84415]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.870 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bafb7895-16bc-472e-be31-9915874130e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f4afa6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:bb:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508349, 'reachable_time': 31229, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227670, 'error': None, 'target': 'ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.887 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[739201cf-5034-4ef3-954b-2f75d4686781]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:bbaa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 508349, 'tstamp': 508349}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227671, 'error': None, 'target': 'ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.913 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e9bf75ce-b2a6-452d-a5a8-f6ffa5265dff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f4afa6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:bb:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508349, 'reachable_time': 31229, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227678, 'error': None, 'target': 'ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:30.955 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e930f88c-df92-4670-9162-cd3efbfe23ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:30 compute-1 nova_compute[182713]: 2026-01-22 00:08:30.987 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040510.9860759, 6bdc91c7-e39d-4d01-9496-49ceb58c3389 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:30 compute-1 nova_compute[182713]: 2026-01-22 00:08:30.987 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] VM Started (Lifecycle Event)
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.007 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.011 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040510.9862416, 6bdc91c7-e39d-4d01-9496-49ceb58c3389 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.011 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] VM Paused (Lifecycle Event)
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:31.020 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e93956f8-424f-4b65-be35-1ea32622749a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:31.021 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f4afa6-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:31.021 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:31.022 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9f4afa6-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.024 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:31 compute-1 NetworkManager[54952]: <info>  [1769040511.0246] manager: (tapd9f4afa6-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/204)
Jan 22 00:08:31 compute-1 kernel: tapd9f4afa6-30: entered promiscuous mode
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.026 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:31.027 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9f4afa6-30, col_values=(('external_ids', {'iface-id': 'c8e399d9-6b2c-4418-8d48-d49b8502db31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.028 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:31 compute-1 ovn_controller[94841]: 2026-01-22T00:08:31Z|00422|binding|INFO|Releasing lport c8e399d9-6b2c-4418-8d48-d49b8502db31 from this chassis (sb_readonly=0)
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.030 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:31.032 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:31.033 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[81ec3d10-3b6d-4abd-8041-e71aaee8f893]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:31.034 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb.pid.haproxy
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:08:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:08:31.035 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb', 'env', 'PROCESS_TAG=haproxy-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.041 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.041 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.044 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.087 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.238 182717 DEBUG nova.compute.manager [req-663a20aa-ba08-40d1-9b1d-fa9283a6d38d req-1342f16f-d53e-446e-b6f3-8cedc656baba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-vif-plugged-65b75da3-429b-440e-aab6-df04e1794c70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.239 182717 DEBUG oslo_concurrency.lockutils [req-663a20aa-ba08-40d1-9b1d-fa9283a6d38d req-1342f16f-d53e-446e-b6f3-8cedc656baba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.239 182717 DEBUG oslo_concurrency.lockutils [req-663a20aa-ba08-40d1-9b1d-fa9283a6d38d req-1342f16f-d53e-446e-b6f3-8cedc656baba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.239 182717 DEBUG oslo_concurrency.lockutils [req-663a20aa-ba08-40d1-9b1d-fa9283a6d38d req-1342f16f-d53e-446e-b6f3-8cedc656baba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.239 182717 DEBUG nova.compute.manager [req-663a20aa-ba08-40d1-9b1d-fa9283a6d38d req-1342f16f-d53e-446e-b6f3-8cedc656baba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Processing event network-vif-plugged-65b75da3-429b-440e-aab6-df04e1794c70 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.240 182717 DEBUG nova.compute.manager [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.243 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040511.2434695, 6bdc91c7-e39d-4d01-9496-49ceb58c3389 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.244 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] VM Resumed (Lifecycle Event)
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.245 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.251 182717 INFO nova.virt.libvirt.driver [-] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Instance spawned successfully.
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.252 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.274 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.279 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.285 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.285 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.285 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.286 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.286 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.286 182717 DEBUG nova.virt.libvirt.driver [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.323 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.387 182717 INFO nova.compute.manager [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Took 7.64 seconds to spawn the instance on the hypervisor.
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.387 182717 DEBUG nova.compute.manager [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.483 182717 INFO nova.compute.manager [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Took 8.25 seconds to build instance.
Jan 22 00:08:31 compute-1 podman[227711]: 2026-01-22 00:08:31.395519701 +0000 UTC m=+0.025547046 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:08:31 compute-1 nova_compute[182713]: 2026-01-22 00:08:31.499 182717 DEBUG oslo_concurrency.lockutils [None req-7a848b90-310c-42b8-9dab-da530c2939d1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:31 compute-1 podman[227711]: 2026-01-22 00:08:31.767270854 +0000 UTC m=+0.397298149 container create 222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:08:31 compute-1 systemd[1]: Started libpod-conmon-222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82.scope.
Jan 22 00:08:31 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:08:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0f6e88de36eeb111d6a1bb7d522f7d8107c6ed4d538b479e3f091b1c6dff8a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:08:31 compute-1 podman[227711]: 2026-01-22 00:08:31.96777706 +0000 UTC m=+0.597804395 container init 222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 00:08:31 compute-1 podman[227711]: 2026-01-22 00:08:31.974090492 +0000 UTC m=+0.604117757 container start 222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:08:32 compute-1 neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb[227726]: [NOTICE]   (227730) : New worker (227732) forked
Jan 22 00:08:32 compute-1 neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb[227726]: [NOTICE]   (227730) : Loading success.
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.211 182717 DEBUG nova.network.neutron [req-e7c9a580-18fd-4c51-8a4c-6a1905dcd8e3 req-da15f95b-2437-4987-a99a-43003396e3a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updated VIF entry in instance network info cache for port 65b75da3-429b-440e-aab6-df04e1794c70. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.211 182717 DEBUG nova.network.neutron [req-e7c9a580-18fd-4c51-8a4c-6a1905dcd8e3 req-da15f95b-2437-4987-a99a-43003396e3a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updating instance_info_cache with network_info: [{"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.248 182717 DEBUG oslo_concurrency.lockutils [req-e7c9a580-18fd-4c51-8a4c-6a1905dcd8e3 req-da15f95b-2437-4987-a99a-43003396e3a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.271 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.309 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Triggering sync for uuid 21970bbd-36b4-495d-8819-49ef2276a912 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.310 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Triggering sync for uuid 6bdc91c7-e39d-4d01-9496-49ceb58c3389 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.310 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "21970bbd-36b4-495d-8819-49ef2276a912" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.311 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "21970bbd-36b4-495d-8819-49ef2276a912" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.311 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.312 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.343 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.345 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "21970bbd-36b4-495d-8819-49ef2276a912" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:32 compute-1 nova_compute[182713]: 2026-01-22 00:08:32.384 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:33 compute-1 nova_compute[182713]: 2026-01-22 00:08:33.009 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:33 compute-1 nova_compute[182713]: 2026-01-22 00:08:33.407 182717 DEBUG nova.compute.manager [req-55b3e16e-dfce-4358-b3ef-bcf74474ce5e req-35bb8396-e122-4cf2-adfc-d59e4d0608b2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-vif-plugged-65b75da3-429b-440e-aab6-df04e1794c70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:33 compute-1 nova_compute[182713]: 2026-01-22 00:08:33.408 182717 DEBUG oslo_concurrency.lockutils [req-55b3e16e-dfce-4358-b3ef-bcf74474ce5e req-35bb8396-e122-4cf2-adfc-d59e4d0608b2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:33 compute-1 nova_compute[182713]: 2026-01-22 00:08:33.408 182717 DEBUG oslo_concurrency.lockutils [req-55b3e16e-dfce-4358-b3ef-bcf74474ce5e req-35bb8396-e122-4cf2-adfc-d59e4d0608b2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:33 compute-1 nova_compute[182713]: 2026-01-22 00:08:33.408 182717 DEBUG oslo_concurrency.lockutils [req-55b3e16e-dfce-4358-b3ef-bcf74474ce5e req-35bb8396-e122-4cf2-adfc-d59e4d0608b2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:33 compute-1 nova_compute[182713]: 2026-01-22 00:08:33.408 182717 DEBUG nova.compute.manager [req-55b3e16e-dfce-4358-b3ef-bcf74474ce5e req-35bb8396-e122-4cf2-adfc-d59e4d0608b2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] No waiting events found dispatching network-vif-plugged-65b75da3-429b-440e-aab6-df04e1794c70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:08:33 compute-1 nova_compute[182713]: 2026-01-22 00:08:33.409 182717 WARNING nova.compute.manager [req-55b3e16e-dfce-4358-b3ef-bcf74474ce5e req-35bb8396-e122-4cf2-adfc-d59e4d0608b2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received unexpected event network-vif-plugged-65b75da3-429b-440e-aab6-df04e1794c70 for instance with vm_state active and task_state None.
Jan 22 00:08:33 compute-1 nova_compute[182713]: 2026-01-22 00:08:33.832 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040498.8313184, c8d10cc4-7d81-41cd-8ccd-393ca4932d92 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:33 compute-1 nova_compute[182713]: 2026-01-22 00:08:33.832 182717 INFO nova.compute.manager [-] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] VM Stopped (Lifecycle Event)
Jan 22 00:08:33 compute-1 nova_compute[182713]: 2026-01-22 00:08:33.857 182717 DEBUG nova.compute.manager [None req-ee4356ea-74a8-4d09-b6a3-17f2ec1fd4b2 - - - - - -] [instance: c8d10cc4-7d81-41cd-8ccd-393ca4932d92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:37 compute-1 nova_compute[182713]: 2026-01-22 00:08:37.387 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:37 compute-1 nova_compute[182713]: 2026-01-22 00:08:37.897 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:37 compute-1 nova_compute[182713]: 2026-01-22 00:08:37.897 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:37 compute-1 nova_compute[182713]: 2026-01-22 00:08:37.898 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:38 compute-1 nova_compute[182713]: 2026-01-22 00:08:38.010 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:38 compute-1 nova_compute[182713]: 2026-01-22 00:08:38.853 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:38 compute-1 nova_compute[182713]: 2026-01-22 00:08:38.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:38 compute-1 nova_compute[182713]: 2026-01-22 00:08:38.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:08:39 compute-1 nova_compute[182713]: 2026-01-22 00:08:39.532 182717 DEBUG nova.compute.manager [req-188489da-2f25-4b85-8512-4d1752bc0f5c req-a43d8def-dae5-4f8f-8ea2-049e59ae9fbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-changed-65b75da3-429b-440e-aab6-df04e1794c70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:39 compute-1 nova_compute[182713]: 2026-01-22 00:08:39.532 182717 DEBUG nova.compute.manager [req-188489da-2f25-4b85-8512-4d1752bc0f5c req-a43d8def-dae5-4f8f-8ea2-049e59ae9fbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Refreshing instance network info cache due to event network-changed-65b75da3-429b-440e-aab6-df04e1794c70. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:08:39 compute-1 nova_compute[182713]: 2026-01-22 00:08:39.533 182717 DEBUG oslo_concurrency.lockutils [req-188489da-2f25-4b85-8512-4d1752bc0f5c req-a43d8def-dae5-4f8f-8ea2-049e59ae9fbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:08:39 compute-1 nova_compute[182713]: 2026-01-22 00:08:39.533 182717 DEBUG oslo_concurrency.lockutils [req-188489da-2f25-4b85-8512-4d1752bc0f5c req-a43d8def-dae5-4f8f-8ea2-049e59ae9fbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:08:39 compute-1 nova_compute[182713]: 2026-01-22 00:08:39.533 182717 DEBUG nova.network.neutron [req-188489da-2f25-4b85-8512-4d1752bc0f5c req-a43d8def-dae5-4f8f-8ea2-049e59ae9fbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Refreshing network info cache for port 65b75da3-429b-440e-aab6-df04e1794c70 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:08:39 compute-1 nova_compute[182713]: 2026-01-22 00:08:39.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:40 compute-1 podman[227741]: 2026-01-22 00:08:40.621951512 +0000 UTC m=+0.104045109 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 00:08:41 compute-1 nova_compute[182713]: 2026-01-22 00:08:41.205 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:41 compute-1 nova_compute[182713]: 2026-01-22 00:08:41.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:41 compute-1 nova_compute[182713]: 2026-01-22 00:08:41.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:41 compute-1 nova_compute[182713]: 2026-01-22 00:08:41.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:41 compute-1 nova_compute[182713]: 2026-01-22 00:08:41.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:41 compute-1 nova_compute[182713]: 2026-01-22 00:08:41.883 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:08:41 compute-1 nova_compute[182713]: 2026-01-22 00:08:41.960 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.058 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.059 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.100 182717 DEBUG nova.network.neutron [req-188489da-2f25-4b85-8512-4d1752bc0f5c req-a43d8def-dae5-4f8f-8ea2-049e59ae9fbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updated VIF entry in instance network info cache for port 65b75da3-429b-440e-aab6-df04e1794c70. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.101 182717 DEBUG nova.network.neutron [req-188489da-2f25-4b85-8512-4d1752bc0f5c req-a43d8def-dae5-4f8f-8ea2-049e59ae9fbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updating instance_info_cache with network_info: [{"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.137 182717 DEBUG oslo_concurrency.lockutils [req-188489da-2f25-4b85-8512-4d1752bc0f5c req-a43d8def-dae5-4f8f-8ea2-049e59ae9fbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.150 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.157 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.234 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.237 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.318 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.438 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.533 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.534 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5310MB free_disk=73.27399063110352GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.535 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.535 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.632 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 21970bbd-36b4-495d-8819-49ef2276a912 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.632 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 6bdc91c7-e39d-4d01-9496-49ceb58c3389 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.632 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.633 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.687 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.722 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.761 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:08:42 compute-1 nova_compute[182713]: 2026-01-22 00:08:42.761 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:43 compute-1 nova_compute[182713]: 2026-01-22 00:08:43.011 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:43 compute-1 nova_compute[182713]: 2026-01-22 00:08:43.762 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:43 compute-1 nova_compute[182713]: 2026-01-22 00:08:43.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:43 compute-1 nova_compute[182713]: 2026-01-22 00:08:43.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:08:43 compute-1 nova_compute[182713]: 2026-01-22 00:08:43.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:08:44 compute-1 nova_compute[182713]: 2026-01-22 00:08:44.450 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:08:44 compute-1 nova_compute[182713]: 2026-01-22 00:08:44.451 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:08:44 compute-1 nova_compute[182713]: 2026-01-22 00:08:44.451 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:08:44 compute-1 nova_compute[182713]: 2026-01-22 00:08:44.451 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 21970bbd-36b4-495d-8819-49ef2276a912 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:44 compute-1 podman[227792]: 2026-01-22 00:08:44.607303516 +0000 UTC m=+0.100904094 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Jan 22 00:08:44 compute-1 ovn_controller[94841]: 2026-01-22T00:08:44Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e0:2b:de 10.100.0.14
Jan 22 00:08:44 compute-1 ovn_controller[94841]: 2026-01-22T00:08:44Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:2b:de 10.100.0.14
Jan 22 00:08:46 compute-1 nova_compute[182713]: 2026-01-22 00:08:46.570 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:46 compute-1 nova_compute[182713]: 2026-01-22 00:08:46.684 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Updating instance_info_cache with network_info: [{"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:46 compute-1 nova_compute[182713]: 2026-01-22 00:08:46.712 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-21970bbd-36b4-495d-8819-49ef2276a912" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:08:46 compute-1 nova_compute[182713]: 2026-01-22 00:08:46.713 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:08:47 compute-1 nova_compute[182713]: 2026-01-22 00:08:47.444 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:48 compute-1 nova_compute[182713]: 2026-01-22 00:08:48.012 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:50 compute-1 nova_compute[182713]: 2026-01-22 00:08:50.101 182717 INFO nova.compute.manager [None req-3e4bc1be-ac86-4ec4-a491-74ba1531cd20 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Get console output
Jan 22 00:08:50 compute-1 nova_compute[182713]: 2026-01-22 00:08:50.107 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:08:52 compute-1 nova_compute[182713]: 2026-01-22 00:08:52.494 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:53 compute-1 nova_compute[182713]: 2026-01-22 00:08:53.014 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:55 compute-1 podman[227814]: 2026-01-22 00:08:55.58560947 +0000 UTC m=+0.060315922 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:08:55 compute-1 podman[227813]: 2026-01-22 00:08:55.612658951 +0000 UTC m=+0.102876294 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Jan 22 00:08:56 compute-1 nova_compute[182713]: 2026-01-22 00:08:56.721 182717 DEBUG oslo_concurrency.lockutils [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "interface-6bdc91c7-e39d-4d01-9496-49ceb58c3389-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:56 compute-1 nova_compute[182713]: 2026-01-22 00:08:56.722 182717 DEBUG oslo_concurrency.lockutils [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "interface-6bdc91c7-e39d-4d01-9496-49ceb58c3389-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:56 compute-1 nova_compute[182713]: 2026-01-22 00:08:56.722 182717 DEBUG nova.objects.instance [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'flavor' on Instance uuid 6bdc91c7-e39d-4d01-9496-49ceb58c3389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:57 compute-1 nova_compute[182713]: 2026-01-22 00:08:57.237 182717 DEBUG nova.objects.instance [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_requests' on Instance uuid 6bdc91c7-e39d-4d01-9496-49ceb58c3389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:57 compute-1 nova_compute[182713]: 2026-01-22 00:08:57.272 182717 DEBUG nova.network.neutron [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:08:57 compute-1 nova_compute[182713]: 2026-01-22 00:08:57.497 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:57 compute-1 nova_compute[182713]: 2026-01-22 00:08:57.549 182717 DEBUG nova.policy [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:08:58 compute-1 nova_compute[182713]: 2026-01-22 00:08:58.015 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:58 compute-1 nova_compute[182713]: 2026-01-22 00:08:58.789 182717 DEBUG nova.network.neutron [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Successfully created port: 424a06a4-5a78-4f84-b441-8b6fcf11bee9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:09:00 compute-1 podman[227863]: 2026-01-22 00:09:00.570068388 +0000 UTC m=+0.057245989 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:09:00 compute-1 podman[227864]: 2026-01-22 00:09:00.57113871 +0000 UTC m=+0.054527576 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:09:00 compute-1 nova_compute[182713]: 2026-01-22 00:09:00.598 182717 DEBUG nova.network.neutron [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Successfully updated port: 424a06a4-5a78-4f84-b441-8b6fcf11bee9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:09:00 compute-1 nova_compute[182713]: 2026-01-22 00:09:00.764 182717 DEBUG oslo_concurrency.lockutils [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:00 compute-1 nova_compute[182713]: 2026-01-22 00:09:00.764 182717 DEBUG oslo_concurrency.lockutils [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:00 compute-1 nova_compute[182713]: 2026-01-22 00:09:00.764 182717 DEBUG nova.network.neutron [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:09:00 compute-1 nova_compute[182713]: 2026-01-22 00:09:00.912 182717 DEBUG nova.compute.manager [req-ca0a358d-78ed-40f9-8f41-24f4d17a7c5b req-18dbee98-dea7-4f81-976b-bad68410e373 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-changed-424a06a4-5a78-4f84-b441-8b6fcf11bee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:00 compute-1 nova_compute[182713]: 2026-01-22 00:09:00.912 182717 DEBUG nova.compute.manager [req-ca0a358d-78ed-40f9-8f41-24f4d17a7c5b req-18dbee98-dea7-4f81-976b-bad68410e373 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Refreshing instance network info cache due to event network-changed-424a06a4-5a78-4f84-b441-8b6fcf11bee9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:09:00 compute-1 nova_compute[182713]: 2026-01-22 00:09:00.913 182717 DEBUG oslo_concurrency.lockutils [req-ca0a358d-78ed-40f9-8f41-24f4d17a7c5b req-18dbee98-dea7-4f81-976b-bad68410e373 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:02 compute-1 nova_compute[182713]: 2026-01-22 00:09:02.522 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:03.016 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:03.017 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:03 compute-1 nova_compute[182713]: 2026-01-22 00:09:03.017 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:03.018 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:05 compute-1 nova_compute[182713]: 2026-01-22 00:09:05.782 182717 DEBUG nova.network.neutron [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updating instance_info_cache with network_info: [{"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "address": "fa:16:3e:a8:ff:89", "network": {"id": "71c61b6b-419e-42c2-91c0-28feb86f02ed", "bridge": "br-int", "label": "tempest-network-smoke--2123686400", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap424a06a4-5a", "ovs_interfaceid": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.484 182717 DEBUG oslo_concurrency.lockutils [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.487 182717 DEBUG oslo_concurrency.lockutils [req-ca0a358d-78ed-40f9-8f41-24f4d17a7c5b req-18dbee98-dea7-4f81-976b-bad68410e373 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.487 182717 DEBUG nova.network.neutron [req-ca0a358d-78ed-40f9-8f41-24f4d17a7c5b req-18dbee98-dea7-4f81-976b-bad68410e373 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Refreshing network info cache for port 424a06a4-5a78-4f84-b441-8b6fcf11bee9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.492 182717 DEBUG nova.virt.libvirt.vif [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:08:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-188093305',display_name='tempest-TestNetworkBasicOps-server-188093305',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-188093305',id=108,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHkFZnEq7NNmxdySRV5/1EgpWeAUvw5OFcMLn6aXM7GCY8P0GmDvb4iJ7gG7Emvq6PckO0f9TjYb3zc1/HZhYzhCTu/vnvxNF9lXAP/apwH44by79L436dIVo4nFymoqQ==',key_name='tempest-TestNetworkBasicOps-441082338',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:08:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-e07wfkf0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:08:31Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6bdc91c7-e39d-4d01-9496-49ceb58c3389,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "address": "fa:16:3e:a8:ff:89", "network": {"id": "71c61b6b-419e-42c2-91c0-28feb86f02ed", "bridge": "br-int", "label": "tempest-network-smoke--2123686400", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap424a06a4-5a", "ovs_interfaceid": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.493 182717 DEBUG nova.network.os_vif_util [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "address": "fa:16:3e:a8:ff:89", "network": {"id": "71c61b6b-419e-42c2-91c0-28feb86f02ed", "bridge": "br-int", "label": "tempest-network-smoke--2123686400", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap424a06a4-5a", "ovs_interfaceid": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.494 182717 DEBUG nova.network.os_vif_util [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:89,bridge_name='br-int',has_traffic_filtering=True,id=424a06a4-5a78-4f84-b441-8b6fcf11bee9,network=Network(71c61b6b-419e-42c2-91c0-28feb86f02ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap424a06a4-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.495 182717 DEBUG os_vif [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:89,bridge_name='br-int',has_traffic_filtering=True,id=424a06a4-5a78-4f84-b441-8b6fcf11bee9,network=Network(71c61b6b-419e-42c2-91c0-28feb86f02ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap424a06a4-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.496 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.497 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.498 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.512 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.513 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap424a06a4-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.513 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap424a06a4-5a, col_values=(('external_ids', {'iface-id': '424a06a4-5a78-4f84-b441-8b6fcf11bee9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:ff:89', 'vm-uuid': '6bdc91c7-e39d-4d01-9496-49ceb58c3389'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.516 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:06 compute-1 NetworkManager[54952]: <info>  [1769040546.5171] manager: (tap424a06a4-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.521 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.526 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.528 182717 INFO os_vif [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:89,bridge_name='br-int',has_traffic_filtering=True,id=424a06a4-5a78-4f84-b441-8b6fcf11bee9,network=Network(71c61b6b-419e-42c2-91c0-28feb86f02ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap424a06a4-5a')
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.529 182717 DEBUG nova.virt.libvirt.vif [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:08:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-188093305',display_name='tempest-TestNetworkBasicOps-server-188093305',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-188093305',id=108,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHkFZnEq7NNmxdySRV5/1EgpWeAUvw5OFcMLn6aXM7GCY8P0GmDvb4iJ7gG7Emvq6PckO0f9TjYb3zc1/HZhYzhCTu/vnvxNF9lXAP/apwH44by79L436dIVo4nFymoqQ==',key_name='tempest-TestNetworkBasicOps-441082338',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:08:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-e07wfkf0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:08:31Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6bdc91c7-e39d-4d01-9496-49ceb58c3389,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "address": "fa:16:3e:a8:ff:89", "network": {"id": "71c61b6b-419e-42c2-91c0-28feb86f02ed", "bridge": "br-int", "label": "tempest-network-smoke--2123686400", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap424a06a4-5a", "ovs_interfaceid": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.529 182717 DEBUG nova.network.os_vif_util [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "address": "fa:16:3e:a8:ff:89", "network": {"id": "71c61b6b-419e-42c2-91c0-28feb86f02ed", "bridge": "br-int", "label": "tempest-network-smoke--2123686400", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap424a06a4-5a", "ovs_interfaceid": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.530 182717 DEBUG nova.network.os_vif_util [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:89,bridge_name='br-int',has_traffic_filtering=True,id=424a06a4-5a78-4f84-b441-8b6fcf11bee9,network=Network(71c61b6b-419e-42c2-91c0-28feb86f02ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap424a06a4-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.533 182717 DEBUG nova.virt.libvirt.guest [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] attach device xml: <interface type="ethernet">
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:a8:ff:89"/>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <target dev="tap424a06a4-5a"/>
Jan 22 00:09:06 compute-1 nova_compute[182713]: </interface>
Jan 22 00:09:06 compute-1 nova_compute[182713]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 22 00:09:06 compute-1 kernel: tap424a06a4-5a: entered promiscuous mode
Jan 22 00:09:06 compute-1 NetworkManager[54952]: <info>  [1769040546.5521] manager: (tap424a06a4-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Jan 22 00:09:06 compute-1 ovn_controller[94841]: 2026-01-22T00:09:06Z|00423|binding|INFO|Claiming lport 424a06a4-5a78-4f84-b441-8b6fcf11bee9 for this chassis.
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.553 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:06 compute-1 ovn_controller[94841]: 2026-01-22T00:09:06Z|00424|binding|INFO|424a06a4-5a78-4f84-b441-8b6fcf11bee9: Claiming fa:16:3e:a8:ff:89 10.100.0.27
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.557 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.573 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:ff:89 10.100.0.27'], port_security=['fa:16:3e:a8:ff:89 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '6bdc91c7-e39d-4d01-9496-49ceb58c3389', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71c61b6b-419e-42c2-91c0-28feb86f02ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '86c286e4-25eb-4f2e-b5ff-30677cfd8882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5ebddd3-9956-4acf-af04-d95f407579f1, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=424a06a4-5a78-4f84-b441-8b6fcf11bee9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.575 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 424a06a4-5a78-4f84-b441-8b6fcf11bee9 in datapath 71c61b6b-419e-42c2-91c0-28feb86f02ed bound to our chassis
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.577 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 71c61b6b-419e-42c2-91c0-28feb86f02ed
Jan 22 00:09:06 compute-1 systemd-udevd[227911]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.592 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.594 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[44f57e4f-c1ad-48bb-87f1-5267d8c8210e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.595 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap71c61b6b-41 in ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:09:06 compute-1 ovn_controller[94841]: 2026-01-22T00:09:06Z|00425|binding|INFO|Setting lport 424a06a4-5a78-4f84-b441-8b6fcf11bee9 ovn-installed in OVS
Jan 22 00:09:06 compute-1 ovn_controller[94841]: 2026-01-22T00:09:06Z|00426|binding|INFO|Setting lport 424a06a4-5a78-4f84-b441-8b6fcf11bee9 up in Southbound
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.597 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap71c61b6b-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.598 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9497ebc6-c07a-421e-8a7c-8ecb50ad6a20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.599 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.598 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0a3d9ddf-2e98-49fd-ae11-143aa4e43f72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 NetworkManager[54952]: <info>  [1769040546.6117] device (tap424a06a4-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:09:06 compute-1 NetworkManager[54952]: <info>  [1769040546.6134] device (tap424a06a4-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.616 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c00deb-bd9d-4b2a-88cb-3475d4f8e20e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.648 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[54880589-52f1-4398-912a-ecca01045da6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.756 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[dd46d3e5-9ed9-49c3-93f1-67dbf80fa2a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 systemd-udevd[227914]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.763 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a3a74f22-c9a4-43b9-959e-a8ad7cd034d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 NetworkManager[54952]: <info>  [1769040546.7649] manager: (tap71c61b6b-40): new Veth device (/org/freedesktop/NetworkManager/Devices/207)
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.811 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c203f4-fe9c-48e1-bc31-c86c09c990a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.814 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[44eb4fe4-6597-4e52-8a7f-a116b37a8d51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 NetworkManager[54952]: <info>  [1769040546.8415] device (tap71c61b6b-40): carrier: link connected
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.842 182717 DEBUG nova.virt.libvirt.driver [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.842 182717 DEBUG nova.virt.libvirt.driver [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.843 182717 DEBUG nova.virt.libvirt.driver [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:e0:2b:de, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.843 182717 DEBUG nova.virt.libvirt.driver [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:a8:ff:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.848 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[bffefb5c-bddd-4638-b71e-cfa6044926e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.872 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[64d1295a-e102-4683-90bd-def6b381999b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap71c61b6b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:e4:58'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511949, 'reachable_time': 32369, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227937, 'error': None, 'target': 'ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.890 182717 DEBUG nova.virt.libvirt.guest [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <nova:name>tempest-TestNetworkBasicOps-server-188093305</nova:name>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-22 00:09:06</nova:creationTime>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 22 00:09:06 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 22 00:09:06 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 22 00:09:06 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 22 00:09:06 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:09:06 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <nova:owner>
Jan 22 00:09:06 compute-1 nova_compute[182713]:     <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:09:06 compute-1 nova_compute[182713]:     <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   </nova:owner>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   <nova:ports>
Jan 22 00:09:06 compute-1 nova_compute[182713]:     <nova:port uuid="65b75da3-429b-440e-aab6-df04e1794c70">
Jan 22 00:09:06 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 00:09:06 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:09:06 compute-1 nova_compute[182713]:     <nova:port uuid="424a06a4-5a78-4f84-b441-8b6fcf11bee9">
Jan 22 00:09:06 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Jan 22 00:09:06 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:09:06 compute-1 nova_compute[182713]:   </nova:ports>
Jan 22 00:09:06 compute-1 nova_compute[182713]: </nova:instance>
Jan 22 00:09:06 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.897 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4d241774-3e5e-42eb-8da6-9cd657848020]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1e:e458'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511949, 'tstamp': 511949}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227938, 'error': None, 'target': 'ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.917 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9dacb5c0-74ee-457c-9794-9a53af400387]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap71c61b6b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:e4:58'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511949, 'reachable_time': 32369, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227939, 'error': None, 'target': 'ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:06.961 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1530d26d-0c4e-437d-a11a-9eb0a6d0077b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:06 compute-1 nova_compute[182713]: 2026-01-22 00:09:06.979 182717 DEBUG oslo_concurrency.lockutils [None req-4a946ddd-efe8-44bd-9aed-4b681f4f3fa2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "interface-6bdc91c7-e39d-4d01-9496-49ceb58c3389-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 10.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:07.033 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[da37a124-d259-48af-b2a2-1a7c13493e95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:07.035 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71c61b6b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:07.036 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:07.036 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71c61b6b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.085 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:07 compute-1 kernel: tap71c61b6b-40: entered promiscuous mode
Jan 22 00:09:07 compute-1 NetworkManager[54952]: <info>  [1769040547.0876] manager: (tap71c61b6b-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/208)
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.089 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:07.089 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap71c61b6b-40, col_values=(('external_ids', {'iface-id': '2e604a38-9751-45b6-8274-b17fdf1906df'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.091 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:07 compute-1 ovn_controller[94841]: 2026-01-22T00:09:07Z|00427|binding|INFO|Releasing lport 2e604a38-9751-45b6-8274-b17fdf1906df from this chassis (sb_readonly=0)
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.114 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.118 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:07.118 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/71c61b6b-419e-42c2-91c0-28feb86f02ed.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/71c61b6b-419e-42c2-91c0-28feb86f02ed.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:07.120 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4ece2ad8-be82-4c5c-b1cc-9a92fe3114ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:07.121 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-71c61b6b-419e-42c2-91c0-28feb86f02ed
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/71c61b6b-419e-42c2-91c0-28feb86f02ed.pid.haproxy
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 71c61b6b-419e-42c2-91c0-28feb86f02ed
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:09:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:07.121 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed', 'env', 'PROCESS_TAG=haproxy-71c61b6b-419e-42c2-91c0-28feb86f02ed', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/71c61b6b-419e-42c2-91c0-28feb86f02ed.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.234 182717 DEBUG nova.compute.manager [req-fd59ec17-192f-46bd-b6e5-ece69ed732b2 req-a1fd13ff-d54d-4082-9c1f-8931a47a5cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-vif-plugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.234 182717 DEBUG oslo_concurrency.lockutils [req-fd59ec17-192f-46bd-b6e5-ece69ed732b2 req-a1fd13ff-d54d-4082-9c1f-8931a47a5cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.235 182717 DEBUG oslo_concurrency.lockutils [req-fd59ec17-192f-46bd-b6e5-ece69ed732b2 req-a1fd13ff-d54d-4082-9c1f-8931a47a5cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.235 182717 DEBUG oslo_concurrency.lockutils [req-fd59ec17-192f-46bd-b6e5-ece69ed732b2 req-a1fd13ff-d54d-4082-9c1f-8931a47a5cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.235 182717 DEBUG nova.compute.manager [req-fd59ec17-192f-46bd-b6e5-ece69ed732b2 req-a1fd13ff-d54d-4082-9c1f-8931a47a5cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] No waiting events found dispatching network-vif-plugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.235 182717 WARNING nova.compute.manager [req-fd59ec17-192f-46bd-b6e5-ece69ed732b2 req-a1fd13ff-d54d-4082-9c1f-8931a47a5cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received unexpected event network-vif-plugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 for instance with vm_state active and task_state None.
Jan 22 00:09:07 compute-1 nova_compute[182713]: 2026-01-22 00:09:07.525 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:07 compute-1 podman[227971]: 2026-01-22 00:09:07.565673297 +0000 UTC m=+0.057887608 container create 4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 00:09:07 compute-1 systemd[1]: Started libpod-conmon-4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f.scope.
Jan 22 00:09:07 compute-1 podman[227971]: 2026-01-22 00:09:07.541916765 +0000 UTC m=+0.034131096 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:09:07 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:09:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eba9b2fea5c6e34a2a86d2bfe4edaeb0c5489823c4e5c4304b966caf7b478452/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:09:07 compute-1 podman[227971]: 2026-01-22 00:09:07.664918869 +0000 UTC m=+0.157133210 container init 4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:09:07 compute-1 podman[227971]: 2026-01-22 00:09:07.672700785 +0000 UTC m=+0.164915096 container start 4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:09:07 compute-1 neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed[227986]: [NOTICE]   (227990) : New worker (227992) forked
Jan 22 00:09:07 compute-1 neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed[227986]: [NOTICE]   (227990) : Loading success.
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.492 182717 DEBUG nova.network.neutron [req-ca0a358d-78ed-40f9-8f41-24f4d17a7c5b req-18dbee98-dea7-4f81-976b-bad68410e373 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updated VIF entry in instance network info cache for port 424a06a4-5a78-4f84-b441-8b6fcf11bee9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.493 182717 DEBUG nova.network.neutron [req-ca0a358d-78ed-40f9-8f41-24f4d17a7c5b req-18dbee98-dea7-4f81-976b-bad68410e373 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updating instance_info_cache with network_info: [{"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "address": "fa:16:3e:a8:ff:89", "network": {"id": "71c61b6b-419e-42c2-91c0-28feb86f02ed", "bridge": "br-int", "label": "tempest-network-smoke--2123686400", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap424a06a4-5a", "ovs_interfaceid": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.516 182717 DEBUG oslo_concurrency.lockutils [req-ca0a358d-78ed-40f9-8f41-24f4d17a7c5b req-18dbee98-dea7-4f81-976b-bad68410e373 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:08 compute-1 ovn_controller[94841]: 2026-01-22T00:09:08Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a8:ff:89 10.100.0.27
Jan 22 00:09:08 compute-1 ovn_controller[94841]: 2026-01-22T00:09:08Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a8:ff:89 10.100.0.27
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.813 182717 DEBUG oslo_concurrency.lockutils [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "interface-6bdc91c7-e39d-4d01-9496-49ceb58c3389-424a06a4-5a78-4f84-b441-8b6fcf11bee9" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.814 182717 DEBUG oslo_concurrency.lockutils [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "interface-6bdc91c7-e39d-4d01-9496-49ceb58c3389-424a06a4-5a78-4f84-b441-8b6fcf11bee9" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.827 182717 DEBUG nova.objects.instance [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'flavor' on Instance uuid 6bdc91c7-e39d-4d01-9496-49ceb58c3389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.864 182717 DEBUG nova.virt.libvirt.vif [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:08:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-188093305',display_name='tempest-TestNetworkBasicOps-server-188093305',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-188093305',id=108,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHkFZnEq7NNmxdySRV5/1EgpWeAUvw5OFcMLn6aXM7GCY8P0GmDvb4iJ7gG7Emvq6PckO0f9TjYb3zc1/HZhYzhCTu/vnvxNF9lXAP/apwH44by79L436dIVo4nFymoqQ==',key_name='tempest-TestNetworkBasicOps-441082338',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:08:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-e07wfkf0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:08:31Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6bdc91c7-e39d-4d01-9496-49ceb58c3389,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "address": "fa:16:3e:a8:ff:89", "network": {"id": "71c61b6b-419e-42c2-91c0-28feb86f02ed", "bridge": "br-int", "label": "tempest-network-smoke--2123686400", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap424a06a4-5a", "ovs_interfaceid": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.865 182717 DEBUG nova.network.os_vif_util [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "address": "fa:16:3e:a8:ff:89", "network": {"id": "71c61b6b-419e-42c2-91c0-28feb86f02ed", "bridge": "br-int", "label": "tempest-network-smoke--2123686400", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap424a06a4-5a", "ovs_interfaceid": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.866 182717 DEBUG nova.network.os_vif_util [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:89,bridge_name='br-int',has_traffic_filtering=True,id=424a06a4-5a78-4f84-b441-8b6fcf11bee9,network=Network(71c61b6b-419e-42c2-91c0-28feb86f02ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap424a06a4-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.871 182717 DEBUG nova.virt.libvirt.guest [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a8:ff:89"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap424a06a4-5a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.874 182717 DEBUG nova.virt.libvirt.guest [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a8:ff:89"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap424a06a4-5a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.878 182717 DEBUG nova.virt.libvirt.driver [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Attempting to detach device tap424a06a4-5a from instance 6bdc91c7-e39d-4d01-9496-49ceb58c3389 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.878 182717 DEBUG nova.virt.libvirt.guest [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] detach device xml: <interface type="ethernet">
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:a8:ff:89"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <target dev="tap424a06a4-5a"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]: </interface>
Jan 22 00:09:08 compute-1 nova_compute[182713]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.886 182717 DEBUG nova.virt.libvirt.guest [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a8:ff:89"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap424a06a4-5a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.890 182717 DEBUG nova.virt.libvirt.guest [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a8:ff:89"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap424a06a4-5a"/></interface>not found in domain: <domain type='kvm' id='47'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <name>instance-0000006c</name>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <uuid>6bdc91c7-e39d-4d01-9496-49ceb58c3389</uuid>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <nova:name>tempest-TestNetworkBasicOps-server-188093305</nova:name>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-22 00:09:06</nova:creationTime>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <nova:owner>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </nova:owner>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <nova:ports>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <nova:port uuid="65b75da3-429b-440e-aab6-df04e1794c70">
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <nova:port uuid="424a06a4-5a78-4f84-b441-8b6fcf11bee9">
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </nova:ports>
Jan 22 00:09:08 compute-1 nova_compute[182713]: </nova:instance>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <resource>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </resource>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <system>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <entry name='serial'>6bdc91c7-e39d-4d01-9496-49ceb58c3389</entry>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <entry name='uuid'>6bdc91c7-e39d-4d01-9496-49ceb58c3389</entry>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </system>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <os>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </os>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <features>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </features>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk' index='2'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:         <backingStore/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       </backingStore>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk.config' index='1'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <backingStore/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <readonly/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:e0:2b:de'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target dev='tap65b75da3-42'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:a8:ff:89'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target dev='tap424a06a4-5a'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='net1'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/console.log' append='off'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       </target>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/console.log' append='off'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </console>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </input>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </input>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </input>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </graphics>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <video>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </video>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </watchdog>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c335,c433</label>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c335,c433</imagelabel>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 22 00:09:08 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:09:08 compute-1 nova_compute[182713]: </domain>
Jan 22 00:09:08 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.891 182717 INFO nova.virt.libvirt.driver [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully detached device tap424a06a4-5a from instance 6bdc91c7-e39d-4d01-9496-49ceb58c3389 from the persistent domain config.
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.891 182717 DEBUG nova.virt.libvirt.driver [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] (1/8): Attempting to detach device tap424a06a4-5a with device alias net1 from instance 6bdc91c7-e39d-4d01-9496-49ceb58c3389 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 22 00:09:08 compute-1 nova_compute[182713]: 2026-01-22 00:09:08.892 182717 DEBUG nova.virt.libvirt.guest [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] detach device xml: <interface type="ethernet">
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:a8:ff:89"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]:   <target dev="tap424a06a4-5a"/>
Jan 22 00:09:08 compute-1 nova_compute[182713]: </interface>
Jan 22 00:09:08 compute-1 nova_compute[182713]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 00:09:09 compute-1 kernel: tap424a06a4-5a (unregistering): left promiscuous mode
Jan 22 00:09:09 compute-1 NetworkManager[54952]: <info>  [1769040549.0134] device (tap424a06a4-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:09:09 compute-1 ovn_controller[94841]: 2026-01-22T00:09:09Z|00428|binding|INFO|Releasing lport 424a06a4-5a78-4f84-b441-8b6fcf11bee9 from this chassis (sb_readonly=0)
Jan 22 00:09:09 compute-1 ovn_controller[94841]: 2026-01-22T00:09:09Z|00429|binding|INFO|Setting lport 424a06a4-5a78-4f84-b441-8b6fcf11bee9 down in Southbound
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.019 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:09 compute-1 ovn_controller[94841]: 2026-01-22T00:09:09Z|00430|binding|INFO|Removing iface tap424a06a4-5a ovn-installed in OVS
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.022 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.026 182717 DEBUG nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Received event <DeviceRemovedEvent: 1769040549.0252502, 6bdc91c7-e39d-4d01-9496-49ceb58c3389 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.028 182717 DEBUG nova.virt.libvirt.driver [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Start waiting for the detach event from libvirt for device tap424a06a4-5a with device alias net1 for instance 6bdc91c7-e39d-4d01-9496-49ceb58c3389 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.029 182717 DEBUG nova.virt.libvirt.guest [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a8:ff:89"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap424a06a4-5a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.033 182717 DEBUG nova.virt.libvirt.guest [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a8:ff:89"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap424a06a4-5a"/></interface>not found in domain: <domain type='kvm' id='47'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <name>instance-0000006c</name>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <uuid>6bdc91c7-e39d-4d01-9496-49ceb58c3389</uuid>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:name>tempest-TestNetworkBasicOps-server-188093305</nova:name>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-22 00:09:06</nova:creationTime>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:owner>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </nova:owner>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:ports>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:port uuid="65b75da3-429b-440e-aab6-df04e1794c70">
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:port uuid="424a06a4-5a78-4f84-b441-8b6fcf11bee9">
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </nova:ports>
Jan 22 00:09:09 compute-1 nova_compute[182713]: </nova:instance>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <resource>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </resource>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <system>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <entry name='serial'>6bdc91c7-e39d-4d01-9496-49ceb58c3389</entry>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <entry name='uuid'>6bdc91c7-e39d-4d01-9496-49ceb58c3389</entry>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </system>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <os>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </os>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <features>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </features>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk' index='2'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:         <backingStore/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       </backingStore>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/disk.config' index='1'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <backingStore/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <readonly/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:e0:2b:de'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target dev='tap65b75da3-42'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/console.log' append='off'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       </target>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389/console.log' append='off'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </console>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </input>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </input>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </input>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </graphics>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <video>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </video>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </watchdog>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c335,c433</label>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c335,c433</imagelabel>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:09:09 compute-1 nova_compute[182713]: </domain>
Jan 22 00:09:09 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.033 182717 INFO nova.virt.libvirt.driver [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully detached device tap424a06a4-5a from instance 6bdc91c7-e39d-4d01-9496-49ceb58c3389 from the live domain config.
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.034 182717 DEBUG nova.virt.libvirt.vif [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:08:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-188093305',display_name='tempest-TestNetworkBasicOps-server-188093305',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-188093305',id=108,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHkFZnEq7NNmxdySRV5/1EgpWeAUvw5OFcMLn6aXM7GCY8P0GmDvb4iJ7gG7Emvq6PckO0f9TjYb3zc1/HZhYzhCTu/vnvxNF9lXAP/apwH44by79L436dIVo4nFymoqQ==',key_name='tempest-TestNetworkBasicOps-441082338',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:08:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-e07wfkf0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:08:31Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6bdc91c7-e39d-4d01-9496-49ceb58c3389,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "address": "fa:16:3e:a8:ff:89", "network": {"id": "71c61b6b-419e-42c2-91c0-28feb86f02ed", "bridge": "br-int", "label": "tempest-network-smoke--2123686400", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap424a06a4-5a", "ovs_interfaceid": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.034 182717 DEBUG nova.network.os_vif_util [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "address": "fa:16:3e:a8:ff:89", "network": {"id": "71c61b6b-419e-42c2-91c0-28feb86f02ed", "bridge": "br-int", "label": "tempest-network-smoke--2123686400", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap424a06a4-5a", "ovs_interfaceid": "424a06a4-5a78-4f84-b441-8b6fcf11bee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.035 182717 DEBUG nova.network.os_vif_util [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:89,bridge_name='br-int',has_traffic_filtering=True,id=424a06a4-5a78-4f84-b441-8b6fcf11bee9,network=Network(71c61b6b-419e-42c2-91c0-28feb86f02ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap424a06a4-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.036 182717 DEBUG os_vif [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:89,bridge_name='br-int',has_traffic_filtering=True,id=424a06a4-5a78-4f84-b441-8b6fcf11bee9,network=Network(71c61b6b-419e-42c2-91c0-28feb86f02ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap424a06a4-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.040 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.040 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap424a06a4-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.043 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.040 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:ff:89 10.100.0.27'], port_security=['fa:16:3e:a8:ff:89 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '6bdc91c7-e39d-4d01-9496-49ceb58c3389', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71c61b6b-419e-42c2-91c0-28feb86f02ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '86c286e4-25eb-4f2e-b5ff-30677cfd8882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5ebddd3-9956-4acf-af04-d95f407579f1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=424a06a4-5a78-4f84-b441-8b6fcf11bee9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.042 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 424a06a4-5a78-4f84-b441-8b6fcf11bee9 in datapath 71c61b6b-419e-42c2-91c0-28feb86f02ed unbound from our chassis
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.044 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.044 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 71c61b6b-419e-42c2-91c0-28feb86f02ed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.046 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f0351f0e-7986-496d-b9ea-9712891fdce7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.047 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed namespace which is not needed anymore
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.047 182717 INFO os_vif [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:89,bridge_name='br-int',has_traffic_filtering=True,id=424a06a4-5a78-4f84-b441-8b6fcf11bee9,network=Network(71c61b6b-419e-42c2-91c0-28feb86f02ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap424a06a4-5a')
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.048 182717 DEBUG nova.virt.libvirt.guest [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:name>tempest-TestNetworkBasicOps-server-188093305</nova:name>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-22 00:09:09</nova:creationTime>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:owner>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </nova:owner>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   <nova:ports>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     <nova:port uuid="65b75da3-429b-440e-aab6-df04e1794c70">
Jan 22 00:09:09 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 00:09:09 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:09:09 compute-1 nova_compute[182713]:   </nova:ports>
Jan 22 00:09:09 compute-1 nova_compute[182713]: </nova:instance>
Jan 22 00:09:09 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 00:09:09 compute-1 neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed[227986]: [NOTICE]   (227990) : haproxy version is 2.8.14-c23fe91
Jan 22 00:09:09 compute-1 neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed[227986]: [NOTICE]   (227990) : path to executable is /usr/sbin/haproxy
Jan 22 00:09:09 compute-1 neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed[227986]: [WARNING]  (227990) : Exiting Master process...
Jan 22 00:09:09 compute-1 neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed[227986]: [ALERT]    (227990) : Current worker (227992) exited with code 143 (Terminated)
Jan 22 00:09:09 compute-1 neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed[227986]: [WARNING]  (227990) : All workers exited. Exiting... (0)
Jan 22 00:09:09 compute-1 systemd[1]: libpod-4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f.scope: Deactivated successfully.
Jan 22 00:09:09 compute-1 podman[228023]: 2026-01-22 00:09:09.212748859 +0000 UTC m=+0.051327659 container died 4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 00:09:09 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f-userdata-shm.mount: Deactivated successfully.
Jan 22 00:09:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-eba9b2fea5c6e34a2a86d2bfe4edaeb0c5489823c4e5c4304b966caf7b478452-merged.mount: Deactivated successfully.
Jan 22 00:09:09 compute-1 podman[228023]: 2026-01-22 00:09:09.255119115 +0000 UTC m=+0.093697935 container cleanup 4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 00:09:09 compute-1 systemd[1]: libpod-conmon-4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f.scope: Deactivated successfully.
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.356 182717 DEBUG nova.compute.manager [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-vif-plugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.357 182717 DEBUG oslo_concurrency.lockutils [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.357 182717 DEBUG oslo_concurrency.lockutils [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.357 182717 DEBUG oslo_concurrency.lockutils [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.357 182717 DEBUG nova.compute.manager [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] No waiting events found dispatching network-vif-plugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.358 182717 WARNING nova.compute.manager [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received unexpected event network-vif-plugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 for instance with vm_state active and task_state None.
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.358 182717 DEBUG nova.compute.manager [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-vif-unplugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.358 182717 DEBUG oslo_concurrency.lockutils [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.358 182717 DEBUG oslo_concurrency.lockutils [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.359 182717 DEBUG oslo_concurrency.lockutils [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.359 182717 DEBUG nova.compute.manager [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] No waiting events found dispatching network-vif-unplugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.359 182717 WARNING nova.compute.manager [req-31a7cecb-a582-47ca-9cc4-8099ed55aadb req-8cb31358-9d05-4d95-9358-b1d8a1b5b89a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received unexpected event network-vif-unplugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 for instance with vm_state active and task_state None.
Jan 22 00:09:09 compute-1 podman[228053]: 2026-01-22 00:09:09.364078642 +0000 UTC m=+0.074817002 container remove 4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.372 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5954a005-7968-48b2-b6a0-3f48bd3aa323]: (4, ('Thu Jan 22 12:09:09 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed (4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f)\n4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f\nThu Jan 22 12:09:09 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed (4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f)\n4bf098fc564782399f6987ae201ea32802a3ca16c7bbdb6b65ffd056334f100f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.374 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fd0ea332-ecf4-4c99-8c67-81adca7e801c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.376 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71c61b6b-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.378 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:09 compute-1 kernel: tap71c61b6b-40: left promiscuous mode
Jan 22 00:09:09 compute-1 nova_compute[182713]: 2026-01-22 00:09:09.392 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.395 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[58cc0290-5a1c-4555-b144-19875ac9321e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.411 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa483b7-792d-497e-87c7-a312da4955b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.413 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d6070b0d-e54a-48c9-a2d9-d61e0d337b8c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.433 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[761826cd-7241-415d-bf7e-54bf928df5e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511940, 'reachable_time': 17236, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228070, 'error': None, 'target': 'ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.436 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-71c61b6b-419e-42c2-91c0-28feb86f02ed deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:09:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:09.436 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[da2eccec-143b-4e96-9c4d-2ef9f4f4294f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:09 compute-1 systemd[1]: run-netns-ovnmeta\x2d71c61b6b\x2d419e\x2d42c2\x2d91c0\x2d28feb86f02ed.mount: Deactivated successfully.
Jan 22 00:09:10 compute-1 nova_compute[182713]: 2026-01-22 00:09:10.168 182717 DEBUG oslo_concurrency.lockutils [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:10 compute-1 nova_compute[182713]: 2026-01-22 00:09:10.168 182717 DEBUG oslo_concurrency.lockutils [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:10 compute-1 nova_compute[182713]: 2026-01-22 00:09:10.169 182717 DEBUG nova.network.neutron [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:09:11 compute-1 podman[228072]: 2026-01-22 00:09:11.587056774 +0000 UTC m=+0.076873815 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:09:11 compute-1 nova_compute[182713]: 2026-01-22 00:09:11.660 182717 DEBUG nova.compute.manager [req-14a30dfc-bc7e-42ef-ad5c-e4d8a4e416b1 req-fed0aaac-0d93-44b0-ac20-87b8ee194ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-vif-plugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:11 compute-1 nova_compute[182713]: 2026-01-22 00:09:11.660 182717 DEBUG oslo_concurrency.lockutils [req-14a30dfc-bc7e-42ef-ad5c-e4d8a4e416b1 req-fed0aaac-0d93-44b0-ac20-87b8ee194ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:11 compute-1 nova_compute[182713]: 2026-01-22 00:09:11.660 182717 DEBUG oslo_concurrency.lockutils [req-14a30dfc-bc7e-42ef-ad5c-e4d8a4e416b1 req-fed0aaac-0d93-44b0-ac20-87b8ee194ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:11 compute-1 nova_compute[182713]: 2026-01-22 00:09:11.661 182717 DEBUG oslo_concurrency.lockutils [req-14a30dfc-bc7e-42ef-ad5c-e4d8a4e416b1 req-fed0aaac-0d93-44b0-ac20-87b8ee194ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:11 compute-1 nova_compute[182713]: 2026-01-22 00:09:11.661 182717 DEBUG nova.compute.manager [req-14a30dfc-bc7e-42ef-ad5c-e4d8a4e416b1 req-fed0aaac-0d93-44b0-ac20-87b8ee194ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] No waiting events found dispatching network-vif-plugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:11 compute-1 nova_compute[182713]: 2026-01-22 00:09:11.661 182717 WARNING nova.compute.manager [req-14a30dfc-bc7e-42ef-ad5c-e4d8a4e416b1 req-fed0aaac-0d93-44b0-ac20-87b8ee194ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received unexpected event network-vif-plugged-424a06a4-5a78-4f84-b441-8b6fcf11bee9 for instance with vm_state active and task_state None.
Jan 22 00:09:11 compute-1 nova_compute[182713]: 2026-01-22 00:09:11.910 182717 INFO nova.network.neutron [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Port 424a06a4-5a78-4f84-b441-8b6fcf11bee9 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 00:09:11 compute-1 nova_compute[182713]: 2026-01-22 00:09:11.911 182717 DEBUG nova.network.neutron [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updating instance_info_cache with network_info: [{"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:11 compute-1 nova_compute[182713]: 2026-01-22 00:09:11.944 182717 DEBUG oslo_concurrency.lockutils [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:11 compute-1 nova_compute[182713]: 2026-01-22 00:09:11.984 182717 DEBUG oslo_concurrency.lockutils [None req-64f9b310-95ff-4976-9e62-bb371ce1db80 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "interface-6bdc91c7-e39d-4d01-9496-49ceb58c3389-424a06a4-5a78-4f84-b441-8b6fcf11bee9" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:12 compute-1 nova_compute[182713]: 2026-01-22 00:09:12.375 182717 DEBUG nova.compute.manager [req-87d666b9-0091-4a7a-9b63-219fa89ab49a req-caff93f2-c3bf-4217-b4a1-98a8eb8fb87e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-vif-deleted-424a06a4-5a78-4f84-b441-8b6fcf11bee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:12 compute-1 ovn_controller[94841]: 2026-01-22T00:09:12Z|00431|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 22 00:09:12 compute-1 ovn_controller[94841]: 2026-01-22T00:09:12Z|00432|binding|INFO|Releasing lport c8e399d9-6b2c-4418-8d48-d49b8502db31 from this chassis (sb_readonly=0)
Jan 22 00:09:12 compute-1 nova_compute[182713]: 2026-01-22 00:09:12.457 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:12 compute-1 nova_compute[182713]: 2026-01-22 00:09:12.528 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.461 182717 DEBUG oslo_concurrency.lockutils [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.461 182717 DEBUG oslo_concurrency.lockutils [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.462 182717 DEBUG oslo_concurrency.lockutils [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.462 182717 DEBUG oslo_concurrency.lockutils [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.462 182717 DEBUG oslo_concurrency.lockutils [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.486 182717 INFO nova.compute.manager [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Terminating instance
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.500 182717 DEBUG nova.compute.manager [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.500 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.501 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.503 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:09:13 compute-1 kernel: tap65b75da3-42 (unregistering): left promiscuous mode
Jan 22 00:09:13 compute-1 NetworkManager[54952]: <info>  [1769040553.5271] device (tap65b75da3-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:09:13 compute-1 ovn_controller[94841]: 2026-01-22T00:09:13Z|00433|binding|INFO|Releasing lport 65b75da3-429b-440e-aab6-df04e1794c70 from this chassis (sb_readonly=0)
Jan 22 00:09:13 compute-1 ovn_controller[94841]: 2026-01-22T00:09:13Z|00434|binding|INFO|Setting lport 65b75da3-429b-440e-aab6-df04e1794c70 down in Southbound
Jan 22 00:09:13 compute-1 ovn_controller[94841]: 2026-01-22T00:09:13Z|00435|binding|INFO|Removing iface tap65b75da3-42 ovn-installed in OVS
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.539 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.542 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.549 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:2b:de 10.100.0.14'], port_security=['fa:16:3e:e0:2b:de 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6bdc91c7-e39d-4d01-9496-49ceb58c3389', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'eb851343-7311-438f-b11b-0c377f259981', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b46e1711-352b-4fc9-8a7d-de660270a1f2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=65b75da3-429b-440e-aab6-df04e1794c70) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.551 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 65b75da3-429b-440e-aab6-df04e1794c70 in datapath d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb unbound from our chassis
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.554 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.555 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ee2a16-645e-4069-a97d-99cb9b6347c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.556 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb namespace which is not needed anymore
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.569 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-1 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Jan 22 00:09:13 compute-1 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006c.scope: Consumed 14.314s CPU time.
Jan 22 00:09:13 compute-1 systemd-machined[153970]: Machine qemu-47-instance-0000006c terminated.
Jan 22 00:09:13 compute-1 neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb[227726]: [NOTICE]   (227730) : haproxy version is 2.8.14-c23fe91
Jan 22 00:09:13 compute-1 neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb[227726]: [NOTICE]   (227730) : path to executable is /usr/sbin/haproxy
Jan 22 00:09:13 compute-1 neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb[227726]: [WARNING]  (227730) : Exiting Master process...
Jan 22 00:09:13 compute-1 neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb[227726]: [ALERT]    (227730) : Current worker (227732) exited with code 143 (Terminated)
Jan 22 00:09:13 compute-1 neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb[227726]: [WARNING]  (227730) : All workers exited. Exiting... (0)
Jan 22 00:09:13 compute-1 systemd[1]: libpod-222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82.scope: Deactivated successfully.
Jan 22 00:09:13 compute-1 podman[228118]: 2026-01-22 00:09:13.784953954 +0000 UTC m=+0.093379675 container stop 222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:09:13 compute-1 conmon[227726]: conmon 222520786f9eb6674bd5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82.scope/container/memory.events
Jan 22 00:09:13 compute-1 podman[228118]: 2026-01-22 00:09:13.791705439 +0000 UTC m=+0.100131160 container died 222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.793 182717 DEBUG nova.compute.manager [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-changed-65b75da3-429b-440e-aab6-df04e1794c70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.794 182717 DEBUG nova.compute.manager [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Refreshing instance network info cache due to event network-changed-65b75da3-429b-440e-aab6-df04e1794c70. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.794 182717 DEBUG oslo_concurrency.lockutils [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.795 182717 DEBUG oslo_concurrency.lockutils [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.795 182717 DEBUG nova.network.neutron [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Refreshing network info cache for port 65b75da3-429b-440e-aab6-df04e1794c70 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.824 182717 INFO nova.virt.libvirt.driver [-] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Instance destroyed successfully.
Jan 22 00:09:13 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82-userdata-shm.mount: Deactivated successfully.
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.825 182717 DEBUG nova.objects.instance [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 6bdc91c7-e39d-4d01-9496-49ceb58c3389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:13 compute-1 systemd[1]: var-lib-containers-storage-overlay-f0f6e88de36eeb111d6a1bb7d522f7d8107c6ed4d538b479e3f091b1c6dff8a8-merged.mount: Deactivated successfully.
Jan 22 00:09:13 compute-1 podman[228118]: 2026-01-22 00:09:13.832723854 +0000 UTC m=+0.141149535 container cleanup 222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.839 182717 DEBUG nova.virt.libvirt.vif [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:08:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-188093305',display_name='tempest-TestNetworkBasicOps-server-188093305',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-188093305',id=108,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHkFZnEq7NNmxdySRV5/1EgpWeAUvw5OFcMLn6aXM7GCY8P0GmDvb4iJ7gG7Emvq6PckO0f9TjYb3zc1/HZhYzhCTu/vnvxNF9lXAP/apwH44by79L436dIVo4nFymoqQ==',key_name='tempest-TestNetworkBasicOps-441082338',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:08:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-e07wfkf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:08:31Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6bdc91c7-e39d-4d01-9496-49ceb58c3389,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.839 182717 DEBUG nova.network.os_vif_util [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.841 182717 DEBUG nova.network.os_vif_util [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e0:2b:de,bridge_name='br-int',has_traffic_filtering=True,id=65b75da3-429b-440e-aab6-df04e1794c70,network=Network(d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65b75da3-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.841 182717 DEBUG os_vif [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:2b:de,bridge_name='br-int',has_traffic_filtering=True,id=65b75da3-429b-440e-aab6-df04e1794c70,network=Network(d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65b75da3-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.843 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.844 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65b75da3-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.846 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.848 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.849 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-1 systemd[1]: libpod-conmon-222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82.scope: Deactivated successfully.
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.853 182717 INFO os_vif [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:2b:de,bridge_name='br-int',has_traffic_filtering=True,id=65b75da3-429b-440e-aab6-df04e1794c70,network=Network(d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65b75da3-42')
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.854 182717 INFO nova.virt.libvirt.driver [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Deleting instance files /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389_del
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.855 182717 INFO nova.virt.libvirt.driver [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Deletion of /var/lib/nova/instances/6bdc91c7-e39d-4d01-9496-49ceb58c3389_del complete
Jan 22 00:09:13 compute-1 podman[228160]: 2026-01-22 00:09:13.898942435 +0000 UTC m=+0.042126870 container remove 222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.905 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4338f010-f3d8-4ca2-8c8f-f7a99047434d]: (4, ('Thu Jan 22 12:09:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb (222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82)\n222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82\nThu Jan 22 12:09:13 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb (222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82)\n222520786f9eb6674bd59b6cfc7efb7414a062816069efe6f65c349730b89a82\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.907 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1460ac22-f4f5-4c70-9487-0e4a9d6468fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.908 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f4afa6-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:13 compute-1 kernel: tapd9f4afa6-30: left promiscuous mode
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.911 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.922 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.926 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0f18b651-83b6-4551-94fe-884dd63f8709]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.944 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0bedddac-7f42-468d-ac43-2d8b1ff4d1cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.945 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[738a1417-6d74-4958-a4ee-194ec15bbcc8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.948 182717 INFO nova.compute.manager [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.949 182717 DEBUG oslo.service.loopingcall [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.949 182717 DEBUG nova.compute.manager [-] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:09:13 compute-1 nova_compute[182713]: 2026-01-22 00:09:13.949 182717 DEBUG nova.network.neutron [-] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.972 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[693e2970-a575-45ad-b5ab-3556656591f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508341, 'reachable_time': 30518, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228175, 'error': None, 'target': 'ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.977 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:09:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:13.977 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[76676ad8-0b82-4170-9195-04dd317d934f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:13 compute-1 systemd[1]: run-netns-ovnmeta\x2dd9f4afa6\x2d3fd6\x2d432d\x2da9b5\x2dd1fb06cca4eb.mount: Deactivated successfully.
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.803 182717 DEBUG nova.compute.manager [req-50e483e0-b5aa-4573-8cb3-b827958872d8 req-793e22f4-2c62-4aa2-bfff-6b1c3e8d39c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-vif-unplugged-65b75da3-429b-440e-aab6-df04e1794c70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.804 182717 DEBUG oslo_concurrency.lockutils [req-50e483e0-b5aa-4573-8cb3-b827958872d8 req-793e22f4-2c62-4aa2-bfff-6b1c3e8d39c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.804 182717 DEBUG oslo_concurrency.lockutils [req-50e483e0-b5aa-4573-8cb3-b827958872d8 req-793e22f4-2c62-4aa2-bfff-6b1c3e8d39c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.804 182717 DEBUG oslo_concurrency.lockutils [req-50e483e0-b5aa-4573-8cb3-b827958872d8 req-793e22f4-2c62-4aa2-bfff-6b1c3e8d39c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.804 182717 DEBUG nova.compute.manager [req-50e483e0-b5aa-4573-8cb3-b827958872d8 req-793e22f4-2c62-4aa2-bfff-6b1c3e8d39c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] No waiting events found dispatching network-vif-unplugged-65b75da3-429b-440e-aab6-df04e1794c70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.804 182717 DEBUG nova.compute.manager [req-50e483e0-b5aa-4573-8cb3-b827958872d8 req-793e22f4-2c62-4aa2-bfff-6b1c3e8d39c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-vif-unplugged-65b75da3-429b-440e-aab6-df04e1794c70 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.886 182717 DEBUG oslo_concurrency.lockutils [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "21970bbd-36b4-495d-8819-49ef2276a912" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.887 182717 DEBUG oslo_concurrency.lockutils [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.887 182717 DEBUG oslo_concurrency.lockutils [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "21970bbd-36b4-495d-8819-49ef2276a912-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.887 182717 DEBUG oslo_concurrency.lockutils [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.888 182717 DEBUG oslo_concurrency.lockutils [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.906 182717 INFO nova.compute.manager [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Terminating instance
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.920 182717 DEBUG nova.network.neutron [-] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.922 182717 DEBUG nova.compute.manager [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:09:14 compute-1 kernel: tap06125da7-7a (unregistering): left promiscuous mode
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.953 182717 INFO nova.compute.manager [-] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Took 1.00 seconds to deallocate network for instance.
Jan 22 00:09:14 compute-1 NetworkManager[54952]: <info>  [1769040554.9579] device (tap06125da7-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:09:14 compute-1 ovn_controller[94841]: 2026-01-22T00:09:14Z|00436|binding|INFO|Releasing lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 from this chassis (sb_readonly=0)
Jan 22 00:09:14 compute-1 ovn_controller[94841]: 2026-01-22T00:09:14Z|00437|binding|INFO|Setting lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 down in Southbound
Jan 22 00:09:14 compute-1 ovn_controller[94841]: 2026-01-22T00:09:14Z|00438|binding|INFO|Removing iface tap06125da7-7a ovn-installed in OVS
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.967 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:14.978 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:99:4e 10.100.0.13'], port_security=['fa:16:3e:c4:99:4e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '51b050e2-1158-4da5-a294-6c6d2a400a60', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=06125da7-7adf-4bbe-b033-0045ab83a9f2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:14.979 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 06125da7-7adf-4bbe-b033-0045ab83a9f2 in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb unbound from our chassis
Jan 22 00:09:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:14.981 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:09:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:14.983 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[017b7178-b971-4c22-be53-b744f9b34bcf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:14.984 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb namespace which is not needed anymore
Jan 22 00:09:14 compute-1 nova_compute[182713]: 2026-01-22 00:09:14.998 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:15 compute-1 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000060.scope: Deactivated successfully.
Jan 22 00:09:15 compute-1 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000060.scope: Consumed 19.620s CPU time.
Jan 22 00:09:15 compute-1 systemd-machined[153970]: Machine qemu-44-instance-00000060 terminated.
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.076 182717 DEBUG oslo_concurrency.lockutils [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.076 182717 DEBUG oslo_concurrency.lockutils [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.088 182717 DEBUG nova.compute.manager [req-eaceadea-daa1-4f94-95e2-9b61f1dd2deb req-14df8635-3c50-46b1-a97a-b7cdc827401e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-vif-deleted-65b75da3-429b-440e-aab6-df04e1794c70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:15 compute-1 podman[228178]: 2026-01-22 00:09:15.10037185 +0000 UTC m=+0.097272034 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red 
Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git)
Jan 22 00:09:15 compute-1 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226066]: [NOTICE]   (226070) : haproxy version is 2.8.14-c23fe91
Jan 22 00:09:15 compute-1 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226066]: [NOTICE]   (226070) : path to executable is /usr/sbin/haproxy
Jan 22 00:09:15 compute-1 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226066]: [WARNING]  (226070) : Exiting Master process...
Jan 22 00:09:15 compute-1 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226066]: [ALERT]    (226070) : Current worker (226072) exited with code 143 (Terminated)
Jan 22 00:09:15 compute-1 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226066]: [WARNING]  (226070) : All workers exited. Exiting... (0)
Jan 22 00:09:15 compute-1 systemd[1]: libpod-3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9.scope: Deactivated successfully.
Jan 22 00:09:15 compute-1 podman[228213]: 2026-01-22 00:09:15.141204959 +0000 UTC m=+0.055127574 container died 3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:09:15 compute-1 NetworkManager[54952]: <info>  [1769040555.1474] manager: (tap06125da7-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/209)
Jan 22 00:09:15 compute-1 kernel: tap06125da7-7a: entered promiscuous mode
Jan 22 00:09:15 compute-1 kernel: tap06125da7-7a (unregistering): left promiscuous mode
Jan 22 00:09:15 compute-1 ovn_controller[94841]: 2026-01-22T00:09:15Z|00439|binding|INFO|Claiming lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 for this chassis.
Jan 22 00:09:15 compute-1 ovn_controller[94841]: 2026-01-22T00:09:15Z|00440|binding|INFO|06125da7-7adf-4bbe-b033-0045ab83a9f2: Claiming fa:16:3e:c4:99:4e 10.100.0.13
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.157 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.167 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:99:4e 10.100.0.13'], port_security=['fa:16:3e:c4:99:4e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '51b050e2-1158-4da5-a294-6c6d2a400a60', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=06125da7-7adf-4bbe-b033-0045ab83a9f2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:15 compute-1 ovn_controller[94841]: 2026-01-22T00:09:15Z|00441|binding|INFO|Setting lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 ovn-installed in OVS
Jan 22 00:09:15 compute-1 ovn_controller[94841]: 2026-01-22T00:09:15Z|00442|binding|INFO|Setting lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 up in Southbound
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.181 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:15 compute-1 ovn_controller[94841]: 2026-01-22T00:09:15Z|00443|binding|INFO|Releasing lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 from this chassis (sb_readonly=1)
Jan 22 00:09:15 compute-1 ovn_controller[94841]: 2026-01-22T00:09:15Z|00444|if_status|INFO|Dropped 2 log messages in last 786 seconds (most recently, 786 seconds ago) due to excessive rate
Jan 22 00:09:15 compute-1 ovn_controller[94841]: 2026-01-22T00:09:15Z|00445|if_status|INFO|Not setting lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 down as sb is readonly
Jan 22 00:09:15 compute-1 ovn_controller[94841]: 2026-01-22T00:09:15Z|00446|binding|INFO|Removing iface tap06125da7-7a ovn-installed in OVS
Jan 22 00:09:15 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9-userdata-shm.mount: Deactivated successfully.
Jan 22 00:09:15 compute-1 ovn_controller[94841]: 2026-01-22T00:09:15Z|00447|binding|INFO|Releasing lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 from this chassis (sb_readonly=0)
Jan 22 00:09:15 compute-1 ovn_controller[94841]: 2026-01-22T00:09:15Z|00448|binding|INFO|Setting lport 06125da7-7adf-4bbe-b033-0045ab83a9f2 down in Southbound
Jan 22 00:09:15 compute-1 systemd[1]: var-lib-containers-storage-overlay-e14a03f644312b385aef31efd2e28794ea27658cb2ae9a3a3e562fce889456c7-merged.mount: Deactivated successfully.
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.201 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.203 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:99:4e 10.100.0.13'], port_security=['fa:16:3e:c4:99:4e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '21970bbd-36b4-495d-8819-49ef2276a912', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '51b050e2-1158-4da5-a294-6c6d2a400a60', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=06125da7-7adf-4bbe-b033-0045ab83a9f2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:15 compute-1 podman[228213]: 2026-01-22 00:09:15.207200292 +0000 UTC m=+0.121122887 container cleanup 3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:09:15 compute-1 systemd[1]: libpod-conmon-3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9.scope: Deactivated successfully.
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.222 182717 INFO nova.virt.libvirt.driver [-] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Instance destroyed successfully.
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.222 182717 DEBUG nova.objects.instance [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'resources' on Instance uuid 21970bbd-36b4-495d-8819-49ef2276a912 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.236 182717 DEBUG nova.virt.libvirt.vif [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1249936959',display_name='tempest-ServerActionsTestOtherB-server-1249936959',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1249936959',id=96,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:06:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-xm80zy3k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:06:45Z,user_data=None,user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=21970bbd-36b4-495d-8819-49ef2276a912,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.237 182717 DEBUG nova.network.os_vif_util [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "address": "fa:16:3e:c4:99:4e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06125da7-7a", "ovs_interfaceid": "06125da7-7adf-4bbe-b033-0045ab83a9f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.237 182717 DEBUG nova.network.os_vif_util [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c4:99:4e,bridge_name='br-int',has_traffic_filtering=True,id=06125da7-7adf-4bbe-b033-0045ab83a9f2,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06125da7-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.238 182717 DEBUG os_vif [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:99:4e,bridge_name='br-int',has_traffic_filtering=True,id=06125da7-7adf-4bbe-b033-0045ab83a9f2,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06125da7-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.239 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.240 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06125da7-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.243 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.244 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.247 182717 INFO os_vif [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:99:4e,bridge_name='br-int',has_traffic_filtering=True,id=06125da7-7adf-4bbe-b033-0045ab83a9f2,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06125da7-7a')
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.247 182717 INFO nova.virt.libvirt.driver [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Deleting instance files /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912_del
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.248 182717 INFO nova.virt.libvirt.driver [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Deletion of /var/lib/nova/instances/21970bbd-36b4-495d-8819-49ef2276a912_del complete
Jan 22 00:09:15 compute-1 podman[228261]: 2026-01-22 00:09:15.280131635 +0000 UTC m=+0.046873143 container remove 3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.286 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[193c5f3e-4966-4b89-afea-5a7c9e8c6410]: (4, ('Thu Jan 22 12:09:15 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb (3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9)\n3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9\nThu Jan 22 12:09:15 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb (3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9)\n3e6ed091c097c07c60d69ad849698beb4c96b1e5b11d5c56221db997cf17f0a9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.288 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3da726-e40e-47c1-97ff-79197dce5726]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.289 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:15 compute-1 kernel: tap1a4bd631-60: left promiscuous mode
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.291 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.301 182717 DEBUG nova.compute.provider_tree [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.315 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.318 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e0c0b60d-1ac8-4e7c-95d9-c88d243a1bc8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.340 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef0a338-5594-4571-9cb4-2790a2d5099b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.342 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[86411011-79d3-4512-8287-ef51927842e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.349 182717 DEBUG nova.scheduler.client.report [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.365 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5fa484f3-ac19-40e5-9c63-91cbefd85da9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489651, 'reachable_time': 32386, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228276, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.367 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.367 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[d77469fc-565f-4e83-98e4-33563f514343]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.368 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 06125da7-7adf-4bbe-b033-0045ab83a9f2 in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb unbound from our chassis
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.369 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:09:15 compute-1 systemd[1]: run-netns-ovnmeta\x2d1a4bd631\x2d64c5\x2d4e00\x2d9341\x2d0e44fd0833fb.mount: Deactivated successfully.
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.370 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a1699d92-d78c-4bae-a280-15f4dfb8c639]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.372 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 06125da7-7adf-4bbe-b033-0045ab83a9f2 in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb unbound from our chassis
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.374 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:09:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:15.374 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[45ab41e4-7775-4b7e-a758-5a7d86561e65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-1 nova_compute[182713]: 2026-01-22 00:09:15.997 182717 DEBUG oslo_concurrency.lockutils [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.921s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.039 182717 INFO nova.compute.manager [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Took 1.12 seconds to destroy the instance on the hypervisor.
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.040 182717 DEBUG oslo.service.loopingcall [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.041 182717 DEBUG nova.compute.manager [-] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.041 182717 DEBUG nova.network.neutron [-] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.064 182717 INFO nova.scheduler.client.report [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 6bdc91c7-e39d-4d01-9496-49ceb58c3389
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.106 182717 DEBUG nova.network.neutron [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updated VIF entry in instance network info cache for port 65b75da3-429b-440e-aab6-df04e1794c70. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.107 182717 DEBUG nova.network.neutron [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Updating instance_info_cache with network_info: [{"id": "65b75da3-429b-440e-aab6-df04e1794c70", "address": "fa:16:3e:e0:2b:de", "network": {"id": "d9f4afa6-3fd6-432d-a9b5-d1fb06cca4eb", "bridge": "br-int", "label": "tempest-network-smoke--277799587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65b75da3-42", "ovs_interfaceid": "65b75da3-429b-440e-aab6-df04e1794c70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.165 182717 DEBUG oslo_concurrency.lockutils [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6bdc91c7-e39d-4d01-9496-49ceb58c3389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.233 182717 DEBUG oslo_concurrency.lockutils [None req-f31cb511-4378-4776-8f67-979b02853613 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.772s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.890 182717 DEBUG nova.compute.manager [req-13a19e34-d51a-493f-8ac6-fcc812f39aac req-ed3fa997-582d-4ddc-a13a-63da517f0c17 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received event network-vif-plugged-65b75da3-429b-440e-aab6-df04e1794c70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.891 182717 DEBUG oslo_concurrency.lockutils [req-13a19e34-d51a-493f-8ac6-fcc812f39aac req-ed3fa997-582d-4ddc-a13a-63da517f0c17 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.891 182717 DEBUG oslo_concurrency.lockutils [req-13a19e34-d51a-493f-8ac6-fcc812f39aac req-ed3fa997-582d-4ddc-a13a-63da517f0c17 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.891 182717 DEBUG oslo_concurrency.lockutils [req-13a19e34-d51a-493f-8ac6-fcc812f39aac req-ed3fa997-582d-4ddc-a13a-63da517f0c17 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6bdc91c7-e39d-4d01-9496-49ceb58c3389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.891 182717 DEBUG nova.compute.manager [req-13a19e34-d51a-493f-8ac6-fcc812f39aac req-ed3fa997-582d-4ddc-a13a-63da517f0c17 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] No waiting events found dispatching network-vif-plugged-65b75da3-429b-440e-aab6-df04e1794c70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:16 compute-1 nova_compute[182713]: 2026-01-22 00:09:16.891 182717 WARNING nova.compute.manager [req-13a19e34-d51a-493f-8ac6-fcc812f39aac req-ed3fa997-582d-4ddc-a13a-63da517f0c17 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Received unexpected event network-vif-plugged-65b75da3-429b-440e-aab6-df04e1794c70 for instance with vm_state deleted and task_state None.
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.235 182717 DEBUG nova.network.neutron [-] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.258 182717 INFO nova.compute.manager [-] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Took 1.22 seconds to deallocate network for instance.
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.330 182717 DEBUG nova.compute.manager [req-58b877e6-5435-4560-a99d-f3db2ac0f6ab req-d18d97ae-6c38-463f-bdc2-b9d7335b8d52 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Received event network-vif-deleted-06125da7-7adf-4bbe-b033-0045ab83a9f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.367 182717 DEBUG oslo_concurrency.lockutils [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.368 182717 DEBUG oslo_concurrency.lockutils [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.428 182717 DEBUG nova.compute.provider_tree [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.444 182717 DEBUG nova.scheduler.client.report [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.469 182717 DEBUG oslo_concurrency.lockutils [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.502 182717 INFO nova.scheduler.client.report [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Deleted allocations for instance 21970bbd-36b4-495d-8819-49ef2276a912
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.522 182717 DEBUG nova.compute.manager [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Received event network-vif-unplugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.523 182717 DEBUG oslo_concurrency.lockutils [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "21970bbd-36b4-495d-8819-49ef2276a912-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.523 182717 DEBUG oslo_concurrency.lockutils [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.524 182717 DEBUG oslo_concurrency.lockutils [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.524 182717 DEBUG nova.compute.manager [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] No waiting events found dispatching network-vif-unplugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.524 182717 WARNING nova.compute.manager [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Received unexpected event network-vif-unplugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 for instance with vm_state deleted and task_state None.
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.525 182717 DEBUG nova.compute.manager [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Received event network-vif-plugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.525 182717 DEBUG oslo_concurrency.lockutils [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "21970bbd-36b4-495d-8819-49ef2276a912-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.526 182717 DEBUG oslo_concurrency.lockutils [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.526 182717 DEBUG oslo_concurrency.lockutils [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.527 182717 DEBUG nova.compute.manager [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] No waiting events found dispatching network-vif-plugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.527 182717 WARNING nova.compute.manager [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Received unexpected event network-vif-plugged-06125da7-7adf-4bbe-b033-0045ab83a9f2 for instance with vm_state deleted and task_state None.
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.533 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:17 compute-1 nova_compute[182713]: 2026-01-22 00:09:17.630 182717 DEBUG oslo_concurrency.lockutils [None req-b40c2da9-c46a-4e8f-992a-8e34f26bccab 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "21970bbd-36b4-495d-8819-49ef2276a912" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.744s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:20 compute-1 nova_compute[182713]: 2026-01-22 00:09:20.245 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:21 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:21.507 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:21 compute-1 nova_compute[182713]: 2026-01-22 00:09:21.635 182717 DEBUG nova.compute.manager [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 22 00:09:21 compute-1 nova_compute[182713]: 2026-01-22 00:09:21.775 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:21 compute-1 nova_compute[182713]: 2026-01-22 00:09:21.776 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:21 compute-1 nova_compute[182713]: 2026-01-22 00:09:21.799 182717 DEBUG nova.objects.instance [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'pci_requests' on Instance uuid 07bda903-2298-433c-aa7d-9a50380e24f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:21 compute-1 nova_compute[182713]: 2026-01-22 00:09:21.825 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:09:21 compute-1 nova_compute[182713]: 2026-01-22 00:09:21.826 182717 INFO nova.compute.claims [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:09:21 compute-1 nova_compute[182713]: 2026-01-22 00:09:21.827 182717 DEBUG nova.objects.instance [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'resources' on Instance uuid 07bda903-2298-433c-aa7d-9a50380e24f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:21 compute-1 nova_compute[182713]: 2026-01-22 00:09:21.840 182717 DEBUG nova.objects.instance [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 07bda903-2298-433c-aa7d-9a50380e24f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:21 compute-1 nova_compute[182713]: 2026-01-22 00:09:21.899 182717 INFO nova.compute.resource_tracker [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating resource usage from migration 3d4b06f6-bb79-4378-827a-10ce205dc76f
Jan 22 00:09:21 compute-1 nova_compute[182713]: 2026-01-22 00:09:21.899 182717 DEBUG nova.compute.resource_tracker [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Starting to track incoming migration 3d4b06f6-bb79-4378-827a-10ce205dc76f with flavor ff01ccba-ad51-439f-9037-926190d6dc0f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 22 00:09:22 compute-1 nova_compute[182713]: 2026-01-22 00:09:22.006 182717 DEBUG nova.compute.provider_tree [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:09:22 compute-1 nova_compute[182713]: 2026-01-22 00:09:22.035 182717 DEBUG nova.scheduler.client.report [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:09:22 compute-1 nova_compute[182713]: 2026-01-22 00:09:22.089 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:22 compute-1 nova_compute[182713]: 2026-01-22 00:09:22.090 182717 INFO nova.compute.manager [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Migrating
Jan 22 00:09:22 compute-1 nova_compute[182713]: 2026-01-22 00:09:22.531 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:23 compute-1 nova_compute[182713]: 2026-01-22 00:09:23.153 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:23 compute-1 nova_compute[182713]: 2026-01-22 00:09:23.365 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:24 compute-1 sshd-session[228278]: Accepted publickey for nova from 192.168.122.100 port 48996 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:09:24 compute-1 systemd-logind[796]: New session 46 of user nova.
Jan 22 00:09:24 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 00:09:24 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 00:09:24 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 00:09:24 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 22 00:09:24 compute-1 systemd[228282]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:09:24 compute-1 systemd[228282]: Queued start job for default target Main User Target.
Jan 22 00:09:24 compute-1 systemd[228282]: Created slice User Application Slice.
Jan 22 00:09:24 compute-1 systemd[228282]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:09:24 compute-1 systemd[228282]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 00:09:24 compute-1 systemd[228282]: Reached target Paths.
Jan 22 00:09:24 compute-1 systemd[228282]: Reached target Timers.
Jan 22 00:09:24 compute-1 systemd[228282]: Starting D-Bus User Message Bus Socket...
Jan 22 00:09:24 compute-1 systemd[228282]: Starting Create User's Volatile Files and Directories...
Jan 22 00:09:24 compute-1 systemd[228282]: Finished Create User's Volatile Files and Directories.
Jan 22 00:09:24 compute-1 systemd[228282]: Listening on D-Bus User Message Bus Socket.
Jan 22 00:09:24 compute-1 systemd[228282]: Reached target Sockets.
Jan 22 00:09:24 compute-1 systemd[228282]: Reached target Basic System.
Jan 22 00:09:24 compute-1 systemd[228282]: Reached target Main User Target.
Jan 22 00:09:24 compute-1 systemd[228282]: Startup finished in 128ms.
Jan 22 00:09:24 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 22 00:09:24 compute-1 systemd[1]: Started Session 46 of User nova.
Jan 22 00:09:24 compute-1 sshd-session[228278]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:09:24 compute-1 sshd-session[228298]: Received disconnect from 192.168.122.100 port 48996:11: disconnected by user
Jan 22 00:09:24 compute-1 sshd-session[228298]: Disconnected from user nova 192.168.122.100 port 48996
Jan 22 00:09:24 compute-1 sshd-session[228278]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:09:24 compute-1 systemd[1]: session-46.scope: Deactivated successfully.
Jan 22 00:09:24 compute-1 systemd-logind[796]: Session 46 logged out. Waiting for processes to exit.
Jan 22 00:09:24 compute-1 systemd-logind[796]: Removed session 46.
Jan 22 00:09:25 compute-1 sshd-session[228300]: Accepted publickey for nova from 192.168.122.100 port 49002 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:09:25 compute-1 systemd-logind[796]: New session 48 of user nova.
Jan 22 00:09:25 compute-1 systemd[1]: Started Session 48 of User nova.
Jan 22 00:09:25 compute-1 sshd-session[228300]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:09:25 compute-1 sshd-session[228303]: Received disconnect from 192.168.122.100 port 49002:11: disconnected by user
Jan 22 00:09:25 compute-1 sshd-session[228303]: Disconnected from user nova 192.168.122.100 port 49002
Jan 22 00:09:25 compute-1 sshd-session[228300]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:09:25 compute-1 systemd[1]: session-48.scope: Deactivated successfully.
Jan 22 00:09:25 compute-1 systemd-logind[796]: Session 48 logged out. Waiting for processes to exit.
Jan 22 00:09:25 compute-1 systemd-logind[796]: Removed session 48.
Jan 22 00:09:25 compute-1 nova_compute[182713]: 2026-01-22 00:09:25.247 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:26 compute-1 podman[228306]: 2026-01-22 00:09:26.576084091 +0000 UTC m=+0.066382586 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:09:26 compute-1 podman[228305]: 2026-01-22 00:09:26.602655048 +0000 UTC m=+0.089776626 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:09:27 compute-1 nova_compute[182713]: 2026-01-22 00:09:27.534 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:27 compute-1 nova_compute[182713]: 2026-01-22 00:09:27.885 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:27 compute-1 nova_compute[182713]: 2026-01-22 00:09:27.886 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:27 compute-1 nova_compute[182713]: 2026-01-22 00:09:27.922 182717 DEBUG nova.compute.manager [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.045 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.045 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.057 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.058 182717 INFO nova.compute.claims [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.261 182717 DEBUG nova.compute.provider_tree [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.303 182717 DEBUG nova.scheduler.client.report [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.340 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.341 182717 DEBUG nova.compute.manager [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.411 182717 DEBUG nova.compute.manager [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.412 182717 DEBUG nova.network.neutron [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.442 182717 INFO nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.503 182717 DEBUG nova.compute.manager [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.671 182717 DEBUG nova.compute.manager [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.674 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.675 182717 INFO nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Creating image(s)
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.676 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.677 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.678 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.703 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.771 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.773 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.774 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.790 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.823 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040553.8215652, 6bdc91c7-e39d-4d01-9496-49ceb58c3389 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.824 182717 INFO nova.compute.manager [-] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] VM Stopped (Lifecycle Event)
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.849 182717 DEBUG nova.compute.manager [None req-401d7045-1e66-4ed6-88fb-368af3b85141 - - - - - -] [instance: 6bdc91c7-e39d-4d01-9496-49ceb58c3389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.857 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.857 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.881 182717 DEBUG nova.policy [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8324d8ba232c476e925d31b7d5645a7a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b9315c6168049d79f20d630e51ffff3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.894 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.895 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.896 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.952 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.953 182717 DEBUG nova.virt.disk.api [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Checking if we can resize image /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:09:28 compute-1 nova_compute[182713]: 2026-01-22 00:09:28.953 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:29 compute-1 nova_compute[182713]: 2026-01-22 00:09:29.012 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:29 compute-1 nova_compute[182713]: 2026-01-22 00:09:29.013 182717 DEBUG nova.virt.disk.api [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Cannot resize image /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:09:29 compute-1 nova_compute[182713]: 2026-01-22 00:09:29.014 182717 DEBUG nova.objects.instance [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'migration_context' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:29 compute-1 nova_compute[182713]: 2026-01-22 00:09:29.030 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:09:29 compute-1 nova_compute[182713]: 2026-01-22 00:09:29.030 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Ensure instance console log exists: /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:09:29 compute-1 nova_compute[182713]: 2026-01-22 00:09:29.031 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:29 compute-1 nova_compute[182713]: 2026-01-22 00:09:29.031 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:29 compute-1 nova_compute[182713]: 2026-01-22 00:09:29.031 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:30 compute-1 nova_compute[182713]: 2026-01-22 00:09:30.214 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040555.2132044, 21970bbd-36b4-495d-8819-49ef2276a912 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:30 compute-1 nova_compute[182713]: 2026-01-22 00:09:30.214 182717 INFO nova.compute.manager [-] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] VM Stopped (Lifecycle Event)
Jan 22 00:09:30 compute-1 nova_compute[182713]: 2026-01-22 00:09:30.252 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:30 compute-1 nova_compute[182713]: 2026-01-22 00:09:30.430 182717 DEBUG nova.compute.manager [None req-9bb4504a-a93a-4691-a290-a69b4f86dc1c - - - - - -] [instance: 21970bbd-36b4-495d-8819-49ef2276a912] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:30 compute-1 nova_compute[182713]: 2026-01-22 00:09:30.899 182717 DEBUG nova.network.neutron [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Successfully created port: 3e67543d-6311-420b-878e-c3112fb771c2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:09:31 compute-1 podman[228371]: 2026-01-22 00:09:31.568700576 +0000 UTC m=+0.061364333 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:09:31 compute-1 podman[228372]: 2026-01-22 00:09:31.577611187 +0000 UTC m=+0.063604751 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:09:32 compute-1 nova_compute[182713]: 2026-01-22 00:09:32.536 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:32 compute-1 nova_compute[182713]: 2026-01-22 00:09:32.701 182717 DEBUG nova.network.neutron [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Successfully updated port: 3e67543d-6311-420b-878e-c3112fb771c2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:09:32 compute-1 nova_compute[182713]: 2026-01-22 00:09:32.727 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:32 compute-1 nova_compute[182713]: 2026-01-22 00:09:32.727 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquired lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:32 compute-1 nova_compute[182713]: 2026-01-22 00:09:32.727 182717 DEBUG nova.network.neutron [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:09:32 compute-1 nova_compute[182713]: 2026-01-22 00:09:32.867 182717 DEBUG nova.compute.manager [req-33f0a701-ad1b-4f29-9ecb-9d57d6a724a0 req-4f313d71-4728-41d5-b0fb-efa9a016fb9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-changed-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:32 compute-1 nova_compute[182713]: 2026-01-22 00:09:32.868 182717 DEBUG nova.compute.manager [req-33f0a701-ad1b-4f29-9ecb-9d57d6a724a0 req-4f313d71-4728-41d5-b0fb-efa9a016fb9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Refreshing instance network info cache due to event network-changed-3e67543d-6311-420b-878e-c3112fb771c2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:09:32 compute-1 nova_compute[182713]: 2026-01-22 00:09:32.869 182717 DEBUG oslo_concurrency.lockutils [req-33f0a701-ad1b-4f29-9ecb-9d57d6a724a0 req-4f313d71-4728-41d5-b0fb-efa9a016fb9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:33 compute-1 nova_compute[182713]: 2026-01-22 00:09:33.005 182717 DEBUG nova.network.neutron [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.427 182717 DEBUG nova.network.neutron [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Updating instance_info_cache with network_info: [{"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.453 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Releasing lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.454 182717 DEBUG nova.compute.manager [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Instance network_info: |[{"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.455 182717 DEBUG oslo_concurrency.lockutils [req-33f0a701-ad1b-4f29-9ecb-9d57d6a724a0 req-4f313d71-4728-41d5-b0fb-efa9a016fb9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.455 182717 DEBUG nova.network.neutron [req-33f0a701-ad1b-4f29-9ecb-9d57d6a724a0 req-4f313d71-4728-41d5-b0fb-efa9a016fb9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Refreshing network info cache for port 3e67543d-6311-420b-878e-c3112fb771c2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.461 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Start _get_guest_xml network_info=[{"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.468 182717 WARNING nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.474 182717 DEBUG nova.virt.libvirt.host [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.475 182717 DEBUG nova.virt.libvirt.host [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.484 182717 DEBUG nova.virt.libvirt.host [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.485 182717 DEBUG nova.virt.libvirt.host [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.486 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.487 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.487 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.487 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.487 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.488 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.488 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.488 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.488 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.489 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.489 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.489 182717 DEBUG nova.virt.hardware [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.493 182717 DEBUG nova.virt.libvirt.vif [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:09:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-635293581',display_name='tempest-ServerRescueTestJSON-server-635293581',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-635293581',id=112,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b9315c6168049d79f20d630e51ffff3',ramdisk_id='',reservation_id='r-egqa46yk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-401787473',owner_user_name='tempest-ServerRescueTestJSON-40178747
3-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:09:28Z,user_data=None,user_id='8324d8ba232c476e925d31b7d5645a7a',uuid=9c30e0e8-a030-4b0c-84ed-324f08bd5f1b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.494 182717 DEBUG nova.network.os_vif_util [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converting VIF {"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.495 182717 DEBUG nova.network.os_vif_util [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:75:3e:f2,bridge_name='br-int',has_traffic_filtering=True,id=3e67543d-6311-420b-878e-c3112fb771c2,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e67543d-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.496 182717 DEBUG nova.objects.instance [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.519 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:09:34 compute-1 nova_compute[182713]:   <uuid>9c30e0e8-a030-4b0c-84ed-324f08bd5f1b</uuid>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   <name>instance-00000070</name>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerRescueTestJSON-server-635293581</nova:name>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:09:34</nova:creationTime>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:09:34 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:09:34 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:09:34 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:09:34 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:09:34 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:09:34 compute-1 nova_compute[182713]:         <nova:user uuid="8324d8ba232c476e925d31b7d5645a7a">tempest-ServerRescueTestJSON-401787473-project-member</nova:user>
Jan 22 00:09:34 compute-1 nova_compute[182713]:         <nova:project uuid="3b9315c6168049d79f20d630e51ffff3">tempest-ServerRescueTestJSON-401787473</nova:project>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:09:34 compute-1 nova_compute[182713]:         <nova:port uuid="3e67543d-6311-420b-878e-c3112fb771c2">
Jan 22 00:09:34 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <system>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <entry name="serial">9c30e0e8-a030-4b0c-84ed-324f08bd5f1b</entry>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <entry name="uuid">9c30e0e8-a030-4b0c-84ed-324f08bd5f1b</entry>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     </system>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   <os>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   </os>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   <features>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   </features>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.config"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:75:3e:f2"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <target dev="tap3e67543d-63"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/console.log" append="off"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <video>
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     </video>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:09:34 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:09:34 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:09:34 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:09:34 compute-1 nova_compute[182713]: </domain>
Jan 22 00:09:34 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.521 182717 DEBUG nova.compute.manager [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Preparing to wait for external event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.521 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.521 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.522 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.522 182717 DEBUG nova.virt.libvirt.vif [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:09:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-635293581',display_name='tempest-ServerRescueTestJSON-server-635293581',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-635293581',id=112,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b9315c6168049d79f20d630e51ffff3',ramdisk_id='',reservation_id='r-egqa46yk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-401787473',owner_user_name='tempest-ServerRescueTestJSON-401787473-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:09:28Z,user_data=None,user_id='8324d8ba232c476e925d31b7d5645a7a',uuid=9c30e0e8-a030-4b0c-84ed-324f08bd5f1b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.523 182717 DEBUG nova.network.os_vif_util [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converting VIF {"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.524 182717 DEBUG nova.network.os_vif_util [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:75:3e:f2,bridge_name='br-int',has_traffic_filtering=True,id=3e67543d-6311-420b-878e-c3112fb771c2,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e67543d-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.524 182717 DEBUG os_vif [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:75:3e:f2,bridge_name='br-int',has_traffic_filtering=True,id=3e67543d-6311-420b-878e-c3112fb771c2,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e67543d-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.525 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.525 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.526 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.530 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.530 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e67543d-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.531 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e67543d-63, col_values=(('external_ids', {'iface-id': '3e67543d-6311-420b-878e-c3112fb771c2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:75:3e:f2', 'vm-uuid': '9c30e0e8-a030-4b0c-84ed-324f08bd5f1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.533 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:34 compute-1 NetworkManager[54952]: <info>  [1769040574.5343] manager: (tap3e67543d-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.535 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.541 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.544 182717 INFO os_vif [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:75:3e:f2,bridge_name='br-int',has_traffic_filtering=True,id=3e67543d-6311-420b-878e-c3112fb771c2,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e67543d-63')
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.629 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.629 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.630 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No VIF found with MAC fa:16:3e:75:3e:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:09:34 compute-1 nova_compute[182713]: 2026-01-22 00:09:34.631 182717 INFO nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Using config drive
Jan 22 00:09:35 compute-1 nova_compute[182713]: 2026-01-22 00:09:35.223 182717 INFO nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Creating config drive at /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.config
Jan 22 00:09:35 compute-1 nova_compute[182713]: 2026-01-22 00:09:35.229 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf3q4kglk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:35 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 00:09:35 compute-1 systemd[228282]: Activating special unit Exit the Session...
Jan 22 00:09:35 compute-1 systemd[228282]: Stopped target Main User Target.
Jan 22 00:09:35 compute-1 systemd[228282]: Stopped target Basic System.
Jan 22 00:09:35 compute-1 systemd[228282]: Stopped target Paths.
Jan 22 00:09:35 compute-1 systemd[228282]: Stopped target Sockets.
Jan 22 00:09:35 compute-1 systemd[228282]: Stopped target Timers.
Jan 22 00:09:35 compute-1 systemd[228282]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:09:35 compute-1 systemd[228282]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 00:09:35 compute-1 systemd[228282]: Closed D-Bus User Message Bus Socket.
Jan 22 00:09:35 compute-1 systemd[228282]: Stopped Create User's Volatile Files and Directories.
Jan 22 00:09:35 compute-1 systemd[228282]: Removed slice User Application Slice.
Jan 22 00:09:35 compute-1 systemd[228282]: Reached target Shutdown.
Jan 22 00:09:35 compute-1 systemd[228282]: Finished Exit the Session.
Jan 22 00:09:35 compute-1 systemd[228282]: Reached target Exit the Session.
Jan 22 00:09:35 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 00:09:35 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 00:09:35 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 00:09:35 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 00:09:35 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 00:09:35 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 00:09:35 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 00:09:35 compute-1 nova_compute[182713]: 2026-01-22 00:09:35.374 182717 DEBUG oslo_concurrency.processutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf3q4kglk" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:35 compute-1 kernel: tap3e67543d-63: entered promiscuous mode
Jan 22 00:09:35 compute-1 NetworkManager[54952]: <info>  [1769040575.4336] manager: (tap3e67543d-63): new Tun device (/org/freedesktop/NetworkManager/Devices/211)
Jan 22 00:09:35 compute-1 ovn_controller[94841]: 2026-01-22T00:09:35Z|00449|binding|INFO|Claiming lport 3e67543d-6311-420b-878e-c3112fb771c2 for this chassis.
Jan 22 00:09:35 compute-1 nova_compute[182713]: 2026-01-22 00:09:35.437 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:35 compute-1 ovn_controller[94841]: 2026-01-22T00:09:35Z|00450|binding|INFO|3e67543d-6311-420b-878e-c3112fb771c2: Claiming fa:16:3e:75:3e:f2 10.100.0.8
Jan 22 00:09:35 compute-1 nova_compute[182713]: 2026-01-22 00:09:35.440 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:35 compute-1 nova_compute[182713]: 2026-01-22 00:09:35.444 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:35 compute-1 systemd-udevd[228433]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:09:35 compute-1 NetworkManager[54952]: <info>  [1769040575.4945] device (tap3e67543d-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:09:35 compute-1 NetworkManager[54952]: <info>  [1769040575.4957] device (tap3e67543d-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:09:35 compute-1 systemd-machined[153970]: New machine qemu-48-instance-00000070.
Jan 22 00:09:35 compute-1 nova_compute[182713]: 2026-01-22 00:09:35.527 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:35 compute-1 ovn_controller[94841]: 2026-01-22T00:09:35Z|00451|binding|INFO|Setting lport 3e67543d-6311-420b-878e-c3112fb771c2 ovn-installed in OVS
Jan 22 00:09:35 compute-1 nova_compute[182713]: 2026-01-22 00:09:35.530 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:35 compute-1 systemd[1]: Started Virtual Machine qemu-48-instance-00000070.
Jan 22 00:09:35 compute-1 ovn_controller[94841]: 2026-01-22T00:09:35Z|00452|binding|INFO|Setting lport 3e67543d-6311-420b-878e-c3112fb771c2 up in Southbound
Jan 22 00:09:35 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:35.539 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:3e:f2 10.100.0.8'], port_security=['fa:16:3e:75:3e:f2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9c30e0e8-a030-4b0c-84ed-324f08bd5f1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b9315c6168049d79f20d630e51ffff3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '88dd83ff-b733-44b2-9065-8f39dcf83d23', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ada6e58f-6492-44c0-abaa-a00698af112f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=3e67543d-6311-420b-878e-c3112fb771c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:35 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:35.540 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 3e67543d-6311-420b-878e-c3112fb771c2 in datapath eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 bound to our chassis
Jan 22 00:09:35 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:35.541 104184 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:09:35 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:35.543 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a083f689-d568-4ade-861c-a313ebe1b21b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.397 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040576.3972833, 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.398 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] VM Started (Lifecycle Event)
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.484 182717 DEBUG nova.network.neutron [req-33f0a701-ad1b-4f29-9ecb-9d57d6a724a0 req-4f313d71-4728-41d5-b0fb-efa9a016fb9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Updated VIF entry in instance network info cache for port 3e67543d-6311-420b-878e-c3112fb771c2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.484 182717 DEBUG nova.network.neutron [req-33f0a701-ad1b-4f29-9ecb-9d57d6a724a0 req-4f313d71-4728-41d5-b0fb-efa9a016fb9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Updating instance_info_cache with network_info: [{"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.619 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.622 182717 DEBUG oslo_concurrency.lockutils [req-33f0a701-ad1b-4f29-9ecb-9d57d6a724a0 req-4f313d71-4728-41d5-b0fb-efa9a016fb9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.627 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040576.4013283, 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.627 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] VM Paused (Lifecycle Event)
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.652 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.656 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.679 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.891 182717 DEBUG nova.compute.manager [req-020aa01a-85fb-488f-b67f-40e44867ef59 req-a97f2dc4-756e-4b76-941a-6a8b6c367c45 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.892 182717 DEBUG oslo_concurrency.lockutils [req-020aa01a-85fb-488f-b67f-40e44867ef59 req-a97f2dc4-756e-4b76-941a-6a8b6c367c45 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.892 182717 DEBUG oslo_concurrency.lockutils [req-020aa01a-85fb-488f-b67f-40e44867ef59 req-a97f2dc4-756e-4b76-941a-6a8b6c367c45 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.893 182717 DEBUG oslo_concurrency.lockutils [req-020aa01a-85fb-488f-b67f-40e44867ef59 req-a97f2dc4-756e-4b76-941a-6a8b6c367c45 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.893 182717 DEBUG nova.compute.manager [req-020aa01a-85fb-488f-b67f-40e44867ef59 req-a97f2dc4-756e-4b76-941a-6a8b6c367c45 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Processing event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.893 182717 DEBUG nova.compute.manager [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.898 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040576.8981538, 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.898 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] VM Resumed (Lifecycle Event)
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.901 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.905 182717 INFO nova.virt.libvirt.driver [-] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Instance spawned successfully.
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.905 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.922 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.930 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.933 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.933 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.934 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.934 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.935 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.935 182717 DEBUG nova.virt.libvirt.driver [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:36 compute-1 nova_compute[182713]: 2026-01-22 00:09:36.965 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:09:37 compute-1 nova_compute[182713]: 2026-01-22 00:09:37.035 182717 INFO nova.compute.manager [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Took 8.36 seconds to spawn the instance on the hypervisor.
Jan 22 00:09:37 compute-1 nova_compute[182713]: 2026-01-22 00:09:37.035 182717 DEBUG nova.compute.manager [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:37 compute-1 nova_compute[182713]: 2026-01-22 00:09:37.159 182717 INFO nova.compute.manager [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Took 9.16 seconds to build instance.
Jan 22 00:09:37 compute-1 nova_compute[182713]: 2026-01-22 00:09:37.184 182717 DEBUG oslo_concurrency.lockutils [None req-f2f806fb-c0d1-4e88-bec3-bbb3576a6a04 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:37 compute-1 nova_compute[182713]: 2026-01-22 00:09:37.538 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:37 compute-1 nova_compute[182713]: 2026-01-22 00:09:37.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:38 compute-1 sshd-session[228451]: Accepted publickey for nova from 192.168.122.100 port 52238 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:09:38 compute-1 systemd-logind[796]: New session 49 of user nova.
Jan 22 00:09:38 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 00:09:38 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 00:09:38 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 00:09:38 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 22 00:09:38 compute-1 systemd[228455]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:09:38 compute-1 systemd[228455]: Queued start job for default target Main User Target.
Jan 22 00:09:38 compute-1 systemd[228455]: Created slice User Application Slice.
Jan 22 00:09:38 compute-1 systemd[228455]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:09:38 compute-1 systemd[228455]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 00:09:38 compute-1 systemd[228455]: Reached target Paths.
Jan 22 00:09:38 compute-1 systemd[228455]: Reached target Timers.
Jan 22 00:09:38 compute-1 systemd[228455]: Starting D-Bus User Message Bus Socket...
Jan 22 00:09:38 compute-1 systemd[228455]: Starting Create User's Volatile Files and Directories...
Jan 22 00:09:38 compute-1 systemd[228455]: Listening on D-Bus User Message Bus Socket.
Jan 22 00:09:38 compute-1 systemd[228455]: Reached target Sockets.
Jan 22 00:09:38 compute-1 systemd[228455]: Finished Create User's Volatile Files and Directories.
Jan 22 00:09:38 compute-1 systemd[228455]: Reached target Basic System.
Jan 22 00:09:38 compute-1 systemd[228455]: Reached target Main User Target.
Jan 22 00:09:38 compute-1 systemd[228455]: Startup finished in 155ms.
Jan 22 00:09:38 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 22 00:09:38 compute-1 systemd[1]: Started Session 49 of User nova.
Jan 22 00:09:38 compute-1 sshd-session[228451]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:09:38 compute-1 nova_compute[182713]: 2026-01-22 00:09:38.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.114 182717 DEBUG nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.115 182717 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.116 182717 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.116 182717 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.116 182717 DEBUG nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] No waiting events found dispatching network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.117 182717 WARNING nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received unexpected event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 for instance with vm_state active and task_state None.
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.117 182717 DEBUG nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-unplugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.118 182717 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.118 182717 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.118 182717 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.119 182717 DEBUG nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] No waiting events found dispatching network-vif-unplugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.119 182717 WARNING nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received unexpected event network-vif-unplugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 for instance with vm_state active and task_state resize_migrating.
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.120 182717 DEBUG nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.120 182717 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.121 182717 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.121 182717 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.122 182717 DEBUG nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] No waiting events found dispatching network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.122 182717 WARNING nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received unexpected event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 for instance with vm_state active and task_state resize_migrating.
Jan 22 00:09:39 compute-1 sshd-session[228471]: Received disconnect from 192.168.122.100 port 52238:11: disconnected by user
Jan 22 00:09:39 compute-1 sshd-session[228471]: Disconnected from user nova 192.168.122.100 port 52238
Jan 22 00:09:39 compute-1 sshd-session[228451]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:09:39 compute-1 systemd-logind[796]: Session 49 logged out. Waiting for processes to exit.
Jan 22 00:09:39 compute-1 systemd[1]: session-49.scope: Deactivated successfully.
Jan 22 00:09:39 compute-1 systemd-logind[796]: Removed session 49.
Jan 22 00:09:39 compute-1 sshd-session[228473]: Accepted publickey for nova from 192.168.122.100 port 52252 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:09:39 compute-1 systemd-logind[796]: New session 51 of user nova.
Jan 22 00:09:39 compute-1 systemd[1]: Started Session 51 of User nova.
Jan 22 00:09:39 compute-1 sshd-session[228473]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:09:39 compute-1 sshd-session[228476]: Received disconnect from 192.168.122.100 port 52252:11: disconnected by user
Jan 22 00:09:39 compute-1 sshd-session[228476]: Disconnected from user nova 192.168.122.100 port 52252
Jan 22 00:09:39 compute-1 sshd-session[228473]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:09:39 compute-1 systemd[1]: session-51.scope: Deactivated successfully.
Jan 22 00:09:39 compute-1 systemd-logind[796]: Session 51 logged out. Waiting for processes to exit.
Jan 22 00:09:39 compute-1 systemd-logind[796]: Removed session 51.
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.534 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:39 compute-1 sshd-session[228478]: Accepted publickey for nova from 192.168.122.100 port 52266 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:09:39 compute-1 systemd-logind[796]: New session 52 of user nova.
Jan 22 00:09:39 compute-1 systemd[1]: Started Session 52 of User nova.
Jan 22 00:09:39 compute-1 sshd-session[228478]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.665 182717 INFO nova.compute.manager [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Rescuing
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.666 182717 DEBUG oslo_concurrency.lockutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.667 182717 DEBUG oslo_concurrency.lockutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquired lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.667 182717 DEBUG nova.network.neutron [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:09:39 compute-1 sshd-session[228481]: Received disconnect from 192.168.122.100 port 52266:11: disconnected by user
Jan 22 00:09:39 compute-1 sshd-session[228481]: Disconnected from user nova 192.168.122.100 port 52266
Jan 22 00:09:39 compute-1 sshd-session[228478]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:09:39 compute-1 systemd[1]: session-52.scope: Deactivated successfully.
Jan 22 00:09:39 compute-1 systemd-logind[796]: Session 52 logged out. Waiting for processes to exit.
Jan 22 00:09:39 compute-1 systemd-logind[796]: Removed session 52.
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:39 compute-1 nova_compute[182713]: 2026-01-22 00:09:39.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:09:40 compute-1 nova_compute[182713]: 2026-01-22 00:09:40.547 182717 INFO nova.network.neutron [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating port bc1908f8-b8ef-40b4-9e46-8e8664065a89 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 22 00:09:40 compute-1 nova_compute[182713]: 2026-01-22 00:09:40.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:41 compute-1 nova_compute[182713]: 2026-01-22 00:09:41.326 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:41 compute-1 nova_compute[182713]: 2026-01-22 00:09:41.327 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquired lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:41 compute-1 nova_compute[182713]: 2026-01-22 00:09:41.327 182717 DEBUG nova.network.neutron [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:09:41 compute-1 nova_compute[182713]: 2026-01-22 00:09:41.441 182717 DEBUG nova.compute.manager [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-changed-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:41 compute-1 nova_compute[182713]: 2026-01-22 00:09:41.442 182717 DEBUG nova.compute.manager [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Refreshing instance network info cache due to event network-changed-bc1908f8-b8ef-40b4-9e46-8e8664065a89. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:09:41 compute-1 nova_compute[182713]: 2026-01-22 00:09:41.442 182717 DEBUG oslo_concurrency.lockutils [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:41 compute-1 nova_compute[182713]: 2026-01-22 00:09:41.512 182717 DEBUG nova.network.neutron [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Updating instance_info_cache with network_info: [{"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:41 compute-1 nova_compute[182713]: 2026-01-22 00:09:41.534 182717 DEBUG oslo_concurrency.lockutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Releasing lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:41 compute-1 nova_compute[182713]: 2026-01-22 00:09:41.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:41 compute-1 nova_compute[182713]: 2026-01-22 00:09:41.935 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:09:42 compute-1 nova_compute[182713]: 2026-01-22 00:09:42.587 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:42 compute-1 podman[228483]: 2026-01-22 00:09:42.618750778 +0000 UTC m=+0.109258128 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 00:09:42 compute-1 nova_compute[182713]: 2026-01-22 00:09:42.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:42 compute-1 nova_compute[182713]: 2026-01-22 00:09:42.885 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:42 compute-1 nova_compute[182713]: 2026-01-22 00:09:42.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:42 compute-1 nova_compute[182713]: 2026-01-22 00:09:42.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:42 compute-1 nova_compute[182713]: 2026-01-22 00:09:42.887 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:09:42 compute-1 nova_compute[182713]: 2026-01-22 00:09:42.970 182717 DEBUG nova.network.neutron [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating instance_info_cache with network_info: [{"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:42 compute-1 nova_compute[182713]: 2026-01-22 00:09:42.994 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.025 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Releasing lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.031 182717 DEBUG oslo_concurrency.lockutils [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.032 182717 DEBUG nova.network.neutron [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Refreshing network info cache for port bc1908f8-b8ef-40b4-9e46-8e8664065a89 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.075 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.076 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.151 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.175 182717 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.178 182717 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.179 182717 INFO nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Creating image(s)
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.180 182717 DEBUG nova.objects.instance [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 07bda903-2298-433c-aa7d-9a50380e24f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.196 182717 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.294 182717 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.296 182717 DEBUG nova.virt.disk.api [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Checking if we can resize image /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.297 182717 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.378 182717 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.380 182717 DEBUG nova.virt.disk.api [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Cannot resize image /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.400 182717 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.401 182717 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Ensure instance console log exists: /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.402 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.403 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.403 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.406 182717 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Start _get_guest_xml network_info=[{"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-173623444-network", "vif_mac": "fa:16:3e:45:56:97"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.412 182717 WARNING nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.419 182717 DEBUG nova.virt.libvirt.host [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.420 182717 DEBUG nova.virt.libvirt.host [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.427 182717 DEBUG nova.virt.libvirt.host [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.428 182717 DEBUG nova.virt.libvirt.host [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.430 182717 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.431 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ff01ccba-ad51-439f-9037-926190d6dc0f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.431 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.432 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.432 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.432 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.433 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.433 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.434 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.434 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.434 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.435 182717 DEBUG nova.virt.hardware [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.435 182717 DEBUG nova.objects.instance [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 07bda903-2298-433c-aa7d-9a50380e24f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.468 182717 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.543 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.544 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5567MB free_disk=73.27457809448242GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.545 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.546 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.549 182717 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.config --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.550 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.550 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.552 182717 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.553 182717 DEBUG nova.virt.libvirt.vif [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:09:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-554893725',display_name='tempest-DeleteServersTestJSON-server-554893725',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-554893725',id=111,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:09:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-4j2wmp3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:09:40Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=07bda903-2298-433c-aa7d-9a50380e24f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-173623444-network", "vif_mac": "fa:16:3e:45:56:97"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.554 182717 DEBUG nova.network.os_vif_util [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-173623444-network", "vif_mac": "fa:16:3e:45:56:97"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.555 182717 DEBUG nova.network.os_vif_util [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.558 182717 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:09:43 compute-1 nova_compute[182713]:   <uuid>07bda903-2298-433c-aa7d-9a50380e24f1</uuid>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   <name>instance-0000006f</name>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   <memory>196608</memory>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <nova:name>tempest-DeleteServersTestJSON-server-554893725</nova:name>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:09:43</nova:creationTime>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <nova:flavor name="m1.micro">
Jan 22 00:09:43 compute-1 nova_compute[182713]:         <nova:memory>192</nova:memory>
Jan 22 00:09:43 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:09:43 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:09:43 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:09:43 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:09:43 compute-1 nova_compute[182713]:         <nova:user uuid="74ad1bf274924c52af96aa4c6d431410">tempest-DeleteServersTestJSON-2033458913-project-member</nova:user>
Jan 22 00:09:43 compute-1 nova_compute[182713]:         <nova:project uuid="3822e32efd5647aebf2d79a3dd038bd4">tempest-DeleteServersTestJSON-2033458913</nova:project>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:09:43 compute-1 nova_compute[182713]:         <nova:port uuid="bc1908f8-b8ef-40b4-9e46-8e8664065a89">
Jan 22 00:09:43 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <system>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <entry name="serial">07bda903-2298-433c-aa7d-9a50380e24f1</entry>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <entry name="uuid">07bda903-2298-433c-aa7d-9a50380e24f1</entry>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     </system>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   <os>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   </os>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   <features>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   </features>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.config"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:45:56:97"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <target dev="tapbc1908f8-b8"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/console.log" append="off"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <video>
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     </video>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:09:43 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:09:43 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:09:43 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:09:43 compute-1 nova_compute[182713]: </domain>
Jan 22 00:09:43 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.565 182717 DEBUG nova.virt.libvirt.vif [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:09:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-554893725',display_name='tempest-DeleteServersTestJSON-server-554893725',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-554893725',id=111,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:09:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-4j2wmp3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:09:40Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=07bda903-2298-433c-aa7d-9a50380e24f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-173623444-network", "vif_mac": "fa:16:3e:45:56:97"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.566 182717 DEBUG nova.network.os_vif_util [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-173623444-network", "vif_mac": "fa:16:3e:45:56:97"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.566 182717 DEBUG nova.network.os_vif_util [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.567 182717 DEBUG os_vif [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.568 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.569 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.570 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.575 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.576 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc1908f8-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.576 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbc1908f8-b8, col_values=(('external_ids', {'iface-id': 'bc1908f8-b8ef-40b4-9e46-8e8664065a89', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:56:97', 'vm-uuid': '07bda903-2298-433c-aa7d-9a50380e24f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.579 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:43 compute-1 NetworkManager[54952]: <info>  [1769040583.5814] manager: (tapbc1908f8-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.581 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.589 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.593 182717 INFO os_vif [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8')
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.640 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Applying migration context for instance 07bda903-2298-433c-aa7d-9a50380e24f1 as it has an incoming, in-progress migration 3d4b06f6-bb79-4378-827a-10ce205dc76f. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.641 182717 INFO nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating resource usage from migration 3d4b06f6-bb79-4378-827a-10ce205dc76f
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.670 182717 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.672 182717 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.673 182717 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No VIF found with MAC fa:16:3e:45:56:97, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.674 182717 INFO nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Using config drive
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.679 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 07bda903-2298-433c-aa7d-9a50380e24f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.679 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.680 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.680 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.700 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.720 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.721 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:09:43 compute-1 NetworkManager[54952]: <info>  [1769040583.7440] manager: (tapbc1908f8-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/213)
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.747 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:09:43 compute-1 kernel: tapbc1908f8-b8: entered promiscuous mode
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.755 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.758 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:43 compute-1 ovn_controller[94841]: 2026-01-22T00:09:43Z|00453|binding|INFO|Claiming lport bc1908f8-b8ef-40b4-9e46-8e8664065a89 for this chassis.
Jan 22 00:09:43 compute-1 ovn_controller[94841]: 2026-01-22T00:09:43Z|00454|binding|INFO|bc1908f8-b8ef-40b4-9e46-8e8664065a89: Claiming fa:16:3e:45:56:97 10.100.0.12
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.775 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:56:97 10.100.0.12'], port_security=['fa:16:3e:45:56:97 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '07bda903-2298-433c-aa7d-9a50380e24f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=bc1908f8-b8ef-40b4-9e46-8e8664065a89) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.778 104184 INFO neutron.agent.ovn.metadata.agent [-] Port bc1908f8-b8ef-40b4-9e46-8e8664065a89 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e bound to our chassis
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.782 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.790 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:09:43 compute-1 systemd-machined[153970]: New machine qemu-49-instance-0000006f.
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.807 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[315efb59-66a4-4c66-b98d-c2b293cc029d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.808 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd94993bc-71 in ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.811 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd94993bc-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.812 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.812 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[132e4672-2127-40aa-98ae-1cc6fc4d0e3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.813 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[15ae0940-d0a7-486a-9685-9f19da6c2b5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:43 compute-1 ovn_controller[94841]: 2026-01-22T00:09:43Z|00455|binding|INFO|Setting lport bc1908f8-b8ef-40b4-9e46-8e8664065a89 ovn-installed in OVS
Jan 22 00:09:43 compute-1 ovn_controller[94841]: 2026-01-22T00:09:43Z|00456|binding|INFO|Setting lport bc1908f8-b8ef-40b4-9e46-8e8664065a89 up in Southbound
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.818 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:43 compute-1 systemd[1]: Started Virtual Machine qemu-49-instance-0000006f.
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.839 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[dbb4b0b7-aab7-4226-9004-6f229f2cd567]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:43 compute-1 systemd-udevd[228536]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.865 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e00ee383-f3be-4838-b099-4c69b58f9930]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:43 compute-1 NetworkManager[54952]: <info>  [1769040583.8819] device (tapbc1908f8-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:09:43 compute-1 NetworkManager[54952]: <info>  [1769040583.8827] device (tapbc1908f8-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.911 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed3eb43-5042-478b-81e1-23d68aae9a18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.912 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:09:43 compute-1 NetworkManager[54952]: <info>  [1769040583.9225] manager: (tapd94993bc-70): new Veth device (/org/freedesktop/NetworkManager/Devices/214)
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.923 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[60ee0f9e-f7b8-4554-badd-797ef5742992]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.936 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.963 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.963 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[bf69ca82-074c-45ae-a875-11dc00094da2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:43 compute-1 nova_compute[182713]: 2026-01-22 00:09:43.964 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.970 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[36fa54bd-f81b-4c82-a3cb-fa3134d7b4d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:43 compute-1 NetworkManager[54952]: <info>  [1769040583.9944] device (tapd94993bc-70): carrier: link connected
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:43.999 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[def3836e-b56a-4032-b705-a2ba26416b12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.016 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f6811efc-37a5-44e0-ad06-2c220e98b7bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515664, 'reachable_time': 23384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228567, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.035 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[327ca82c-6bb3-4b70-9325-f790ab51e1cf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:eecd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 515664, 'tstamp': 515664}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228568, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.051 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[09ae8e61-b309-4981-9bcb-0f23fa164629]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515664, 'reachable_time': 23384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228569, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.089 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[810609ae-c897-457a-98d9-ca5c3064268b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.169 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[22ecff9f-0b90-4eb1-b7a3-604c1acf0519]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.171 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.172 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.174 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd94993bc-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:44 compute-1 kernel: tapd94993bc-70: entered promiscuous mode
Jan 22 00:09:44 compute-1 NetworkManager[54952]: <info>  [1769040584.1766] manager: (tapd94993bc-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.177 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.181 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd94993bc-70, col_values=(('external_ids', {'iface-id': 'd921ee25-8f8a-4375-9839-6c54ab328e88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:44 compute-1 ovn_controller[94841]: 2026-01-22T00:09:44Z|00457|binding|INFO|Releasing lport d921ee25-8f8a-4375-9839-6c54ab328e88 from this chassis (sb_readonly=0)
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.184 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.185 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.185 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e0c3356e-567d-46b8-b52d-14a8cb660785]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.186 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:09:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:44.187 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'env', 'PROCESS_TAG=haproxy-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d94993bc-77ac-42d2-88cb-3b0110dff29e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.198 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.427 182717 DEBUG nova.network.neutron [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updated VIF entry in instance network info cache for port bc1908f8-b8ef-40b4-9e46-8e8664065a89. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.427 182717 DEBUG nova.network.neutron [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating instance_info_cache with network_info: [{"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.448 182717 DEBUG oslo_concurrency.lockutils [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:44 compute-1 podman[228601]: 2026-01-22 00:09:44.627113626 +0000 UTC m=+0.059048943 container create d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:09:44 compute-1 systemd[1]: Started libpod-conmon-d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663.scope.
Jan 22 00:09:44 compute-1 podman[228601]: 2026-01-22 00:09:44.595459355 +0000 UTC m=+0.027394692 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:09:44 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:09:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68fea6c97fdc13796f815f8a29b8880837379d9a1e19b310af76dbc461a7808a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:09:44 compute-1 podman[228601]: 2026-01-22 00:09:44.722639485 +0000 UTC m=+0.154574852 container init d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:09:44 compute-1 podman[228601]: 2026-01-22 00:09:44.729093911 +0000 UTC m=+0.161029228 container start d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 00:09:44 compute-1 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228616]: [NOTICE]   (228620) : New worker (228622) forked
Jan 22 00:09:44 compute-1 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228616]: [NOTICE]   (228620) : Loading success.
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.924 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040584.9236774, 07bda903-2298-433c-aa7d-9a50380e24f1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.926 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] VM Resumed (Lifecycle Event)
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.928 182717 DEBUG nova.compute.manager [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.932 182717 INFO nova.virt.libvirt.driver [-] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Instance running successfully.
Jan 22 00:09:44 compute-1 virtqemud[182235]: argument unsupported: QEMU guest agent is not configured
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.938 182717 DEBUG nova.virt.libvirt.guest [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.940 182717 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.967 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:44 compute-1 nova_compute[182713]: 2026-01-22 00:09:44.971 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:09:45 compute-1 nova_compute[182713]: 2026-01-22 00:09:45.027 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 00:09:45 compute-1 nova_compute[182713]: 2026-01-22 00:09:45.028 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040584.9250703, 07bda903-2298-433c-aa7d-9a50380e24f1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:45 compute-1 nova_compute[182713]: 2026-01-22 00:09:45.028 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] VM Started (Lifecycle Event)
Jan 22 00:09:45 compute-1 nova_compute[182713]: 2026-01-22 00:09:45.082 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:45 compute-1 nova_compute[182713]: 2026-01-22 00:09:45.087 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:09:45 compute-1 podman[228638]: 2026-01-22 00:09:45.579699968 +0000 UTC m=+0.067541470 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 00:09:45 compute-1 nova_compute[182713]: 2026-01-22 00:09:45.963 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:45 compute-1 nova_compute[182713]: 2026-01-22 00:09:45.965 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:09:45 compute-1 nova_compute[182713]: 2026-01-22 00:09:45.997 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:09:45 compute-1 nova_compute[182713]: 2026-01-22 00:09:45.998 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:46 compute-1 nova_compute[182713]: 2026-01-22 00:09:46.096 182717 DEBUG nova.compute.manager [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:46 compute-1 nova_compute[182713]: 2026-01-22 00:09:46.097 182717 DEBUG oslo_concurrency.lockutils [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:46 compute-1 nova_compute[182713]: 2026-01-22 00:09:46.097 182717 DEBUG oslo_concurrency.lockutils [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:46 compute-1 nova_compute[182713]: 2026-01-22 00:09:46.097 182717 DEBUG oslo_concurrency.lockutils [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:46 compute-1 nova_compute[182713]: 2026-01-22 00:09:46.098 182717 DEBUG nova.compute.manager [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] No waiting events found dispatching network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:46 compute-1 nova_compute[182713]: 2026-01-22 00:09:46.098 182717 WARNING nova.compute.manager [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received unexpected event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 for instance with vm_state resized and task_state None.
Jan 22 00:09:47 compute-1 nova_compute[182713]: 2026-01-22 00:09:47.591 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:48 compute-1 nova_compute[182713]: 2026-01-22 00:09:48.580 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:49 compute-1 nova_compute[182713]: 2026-01-22 00:09:49.230 182717 DEBUG nova.compute.manager [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:49 compute-1 nova_compute[182713]: 2026-01-22 00:09:49.232 182717 DEBUG oslo_concurrency.lockutils [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:49 compute-1 nova_compute[182713]: 2026-01-22 00:09:49.232 182717 DEBUG oslo_concurrency.lockutils [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:49 compute-1 nova_compute[182713]: 2026-01-22 00:09:49.233 182717 DEBUG oslo_concurrency.lockutils [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:49 compute-1 nova_compute[182713]: 2026-01-22 00:09:49.233 182717 DEBUG nova.compute.manager [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] No waiting events found dispatching network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:49 compute-1 nova_compute[182713]: 2026-01-22 00:09:49.233 182717 WARNING nova.compute.manager [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received unexpected event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 for instance with vm_state resized and task_state None.
Jan 22 00:09:49 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 00:09:49 compute-1 systemd[228455]: Activating special unit Exit the Session...
Jan 22 00:09:49 compute-1 systemd[228455]: Stopped target Main User Target.
Jan 22 00:09:49 compute-1 systemd[228455]: Stopped target Basic System.
Jan 22 00:09:49 compute-1 systemd[228455]: Stopped target Paths.
Jan 22 00:09:49 compute-1 systemd[228455]: Stopped target Sockets.
Jan 22 00:09:49 compute-1 systemd[228455]: Stopped target Timers.
Jan 22 00:09:49 compute-1 systemd[228455]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:09:49 compute-1 systemd[228455]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 00:09:49 compute-1 systemd[228455]: Closed D-Bus User Message Bus Socket.
Jan 22 00:09:49 compute-1 systemd[228455]: Stopped Create User's Volatile Files and Directories.
Jan 22 00:09:49 compute-1 systemd[228455]: Removed slice User Application Slice.
Jan 22 00:09:49 compute-1 systemd[228455]: Reached target Shutdown.
Jan 22 00:09:49 compute-1 systemd[228455]: Finished Exit the Session.
Jan 22 00:09:49 compute-1 systemd[228455]: Reached target Exit the Session.
Jan 22 00:09:49 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 00:09:49 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 00:09:49 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 00:09:49 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 00:09:49 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 00:09:49 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 00:09:49 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 00:09:52 compute-1 nova_compute[182713]: 2026-01-22 00:09:51.999 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:09:52 compute-1 nova_compute[182713]: 2026-01-22 00:09:52.596 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:53 compute-1 nova_compute[182713]: 2026-01-22 00:09:53.582 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:54 compute-1 kernel: tap3e67543d-63 (unregistering): left promiscuous mode
Jan 22 00:09:54 compute-1 NetworkManager[54952]: <info>  [1769040594.2310] device (tap3e67543d-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:09:54 compute-1 ovn_controller[94841]: 2026-01-22T00:09:54Z|00458|binding|INFO|Releasing lport 3e67543d-6311-420b-878e-c3112fb771c2 from this chassis (sb_readonly=0)
Jan 22 00:09:54 compute-1 ovn_controller[94841]: 2026-01-22T00:09:54Z|00459|binding|INFO|Setting lport 3e67543d-6311-420b-878e-c3112fb771c2 down in Southbound
Jan 22 00:09:54 compute-1 nova_compute[182713]: 2026-01-22 00:09:54.251 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:54 compute-1 ovn_controller[94841]: 2026-01-22T00:09:54Z|00460|binding|INFO|Removing iface tap3e67543d-63 ovn-installed in OVS
Jan 22 00:09:54 compute-1 nova_compute[182713]: 2026-01-22 00:09:54.256 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:54.267 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:3e:f2 10.100.0.8'], port_security=['fa:16:3e:75:3e:f2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9c30e0e8-a030-4b0c-84ed-324f08bd5f1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b9315c6168049d79f20d630e51ffff3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '88dd83ff-b733-44b2-9065-8f39dcf83d23', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ada6e58f-6492-44c0-abaa-a00698af112f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=3e67543d-6311-420b-878e-c3112fb771c2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:54.269 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 3e67543d-6311-420b-878e-c3112fb771c2 in datapath eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 unbound from our chassis
Jan 22 00:09:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:54.271 104184 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:09:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:54.273 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2004c50f-3740-4168-b890-b7fcd7265976]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:54 compute-1 nova_compute[182713]: 2026-01-22 00:09:54.286 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:54 compute-1 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000070.scope: Deactivated successfully.
Jan 22 00:09:54 compute-1 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000070.scope: Consumed 13.685s CPU time.
Jan 22 00:09:54 compute-1 systemd-machined[153970]: Machine qemu-48-instance-00000070 terminated.
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.017 182717 INFO nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Instance shutdown successfully after 13 seconds.
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.024 182717 INFO nova.virt.libvirt.driver [-] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Instance destroyed successfully.
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.024 182717 DEBUG nova.objects.instance [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.181 182717 INFO nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Attempting rescue
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.182 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.186 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.187 182717 INFO nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Creating image(s)
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.188 182717 DEBUG oslo_concurrency.lockutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.189 182717 DEBUG oslo_concurrency.lockutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.190 182717 DEBUG oslo_concurrency.lockutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.190 182717 DEBUG nova.objects.instance [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.231 182717 DEBUG oslo_concurrency.lockutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.231 182717 DEBUG oslo_concurrency.lockutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.242 182717 DEBUG oslo_concurrency.processutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.318 182717 DEBUG oslo_concurrency.processutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.319 182717 DEBUG oslo_concurrency.processutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.360 182717 DEBUG oslo_concurrency.processutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.rescue" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.361 182717 DEBUG oslo_concurrency.lockutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.362 182717 DEBUG nova.objects.instance [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'migration_context' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.394 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.396 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Start _get_guest_xml network_info=[{"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1280377146-network", "vif_mac": "fa:16:3e:75:3e:f2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.396 182717 DEBUG nova.objects.instance [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'resources' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.472 182717 WARNING nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.482 182717 DEBUG nova.virt.libvirt.host [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.483 182717 DEBUG nova.virt.libvirt.host [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.499 182717 DEBUG nova.virt.libvirt.host [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.500 182717 DEBUG nova.virt.libvirt.host [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.502 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.502 182717 DEBUG nova.virt.hardware [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.503 182717 DEBUG nova.virt.hardware [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.503 182717 DEBUG nova.virt.hardware [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.503 182717 DEBUG nova.virt.hardware [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.504 182717 DEBUG nova.virt.hardware [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.504 182717 DEBUG nova.virt.hardware [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.504 182717 DEBUG nova.virt.hardware [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.504 182717 DEBUG nova.virt.hardware [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.505 182717 DEBUG nova.virt.hardware [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.505 182717 DEBUG nova.virt.hardware [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.505 182717 DEBUG nova.virt.hardware [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.505 182717 DEBUG nova.objects.instance [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.576 182717 DEBUG nova.virt.libvirt.vif [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:09:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-635293581',display_name='tempest-ServerRescueTestJSON-server-635293581',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-635293581',id=112,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:09:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3b9315c6168049d79f20d630e51ffff3',ramdisk_id='',reservation_id='r-egqa46yk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_dis
k='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-401787473',owner_user_name='tempest-ServerRescueTestJSON-401787473-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:09:37Z,user_data=None,user_id='8324d8ba232c476e925d31b7d5645a7a',uuid=9c30e0e8-a030-4b0c-84ed-324f08bd5f1b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1280377146-network", "vif_mac": "fa:16:3e:75:3e:f2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.578 182717 DEBUG nova.network.os_vif_util [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converting VIF {"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1280377146-network", "vif_mac": "fa:16:3e:75:3e:f2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.579 182717 DEBUG nova.network.os_vif_util [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:75:3e:f2,bridge_name='br-int',has_traffic_filtering=True,id=3e67543d-6311-420b-878e-c3112fb771c2,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e67543d-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.580 182717 DEBUG nova.objects.instance [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.618 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:09:55 compute-1 nova_compute[182713]:   <uuid>9c30e0e8-a030-4b0c-84ed-324f08bd5f1b</uuid>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   <name>instance-00000070</name>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerRescueTestJSON-server-635293581</nova:name>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:09:55</nova:creationTime>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:09:55 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:09:55 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:09:55 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:09:55 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:09:55 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:09:55 compute-1 nova_compute[182713]:         <nova:user uuid="8324d8ba232c476e925d31b7d5645a7a">tempest-ServerRescueTestJSON-401787473-project-member</nova:user>
Jan 22 00:09:55 compute-1 nova_compute[182713]:         <nova:project uuid="3b9315c6168049d79f20d630e51ffff3">tempest-ServerRescueTestJSON-401787473</nova:project>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:09:55 compute-1 nova_compute[182713]:         <nova:port uuid="3e67543d-6311-420b-878e-c3112fb771c2">
Jan 22 00:09:55 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <system>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <entry name="serial">9c30e0e8-a030-4b0c-84ed-324f08bd5f1b</entry>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <entry name="uuid">9c30e0e8-a030-4b0c-84ed-324f08bd5f1b</entry>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     </system>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   <os>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   </os>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   <features>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   </features>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.rescue"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <target dev="vdb" bus="virtio"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.config.rescue"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:75:3e:f2"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <target dev="tap3e67543d-63"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/console.log" append="off"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <video>
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     </video>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:09:55 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:09:55 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:09:55 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:09:55 compute-1 nova_compute[182713]: </domain>
Jan 22 00:09:55 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.632 182717 INFO nova.virt.libvirt.driver [-] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Instance destroyed successfully.
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.796 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.797 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.797 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.797 182717 DEBUG nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No VIF found with MAC fa:16:3e:75:3e:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.798 182717 INFO nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Using config drive
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.860 182717 DEBUG nova.objects.instance [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:55 compute-1 nova_compute[182713]: 2026-01-22 00:09:55.949 182717 DEBUG nova.objects.instance [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'keypairs' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:56 compute-1 nova_compute[182713]: 2026-01-22 00:09:56.144 182717 DEBUG nova.compute.manager [req-b02648f6-9e87-4cb2-9995-e99af0a6f3c0 req-9a1e4e96-5828-4b30-9808-50417a5b26d3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-unplugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:56 compute-1 nova_compute[182713]: 2026-01-22 00:09:56.146 182717 DEBUG oslo_concurrency.lockutils [req-b02648f6-9e87-4cb2-9995-e99af0a6f3c0 req-9a1e4e96-5828-4b30-9808-50417a5b26d3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:56 compute-1 nova_compute[182713]: 2026-01-22 00:09:56.147 182717 DEBUG oslo_concurrency.lockutils [req-b02648f6-9e87-4cb2-9995-e99af0a6f3c0 req-9a1e4e96-5828-4b30-9808-50417a5b26d3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:56 compute-1 nova_compute[182713]: 2026-01-22 00:09:56.147 182717 DEBUG oslo_concurrency.lockutils [req-b02648f6-9e87-4cb2-9995-e99af0a6f3c0 req-9a1e4e96-5828-4b30-9808-50417a5b26d3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:56 compute-1 nova_compute[182713]: 2026-01-22 00:09:56.148 182717 DEBUG nova.compute.manager [req-b02648f6-9e87-4cb2-9995-e99af0a6f3c0 req-9a1e4e96-5828-4b30-9808-50417a5b26d3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] No waiting events found dispatching network-vif-unplugged-3e67543d-6311-420b-878e-c3112fb771c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:56 compute-1 nova_compute[182713]: 2026-01-22 00:09:56.149 182717 WARNING nova.compute.manager [req-b02648f6-9e87-4cb2-9995-e99af0a6f3c0 req-9a1e4e96-5828-4b30-9808-50417a5b26d3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received unexpected event network-vif-unplugged-3e67543d-6311-420b-878e-c3112fb771c2 for instance with vm_state active and task_state rescuing.
Jan 22 00:09:56 compute-1 ovn_controller[94841]: 2026-01-22T00:09:56Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:56:97 10.100.0.12
Jan 22 00:09:57 compute-1 podman[228718]: 2026-01-22 00:09:57.573635318 +0000 UTC m=+0.055572157 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:09:57 compute-1 nova_compute[182713]: 2026-01-22 00:09:57.596 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:57 compute-1 podman[228717]: 2026-01-22 00:09:57.630939488 +0000 UTC m=+0.110266168 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 22 00:09:58 compute-1 nova_compute[182713]: 2026-01-22 00:09:58.586 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.330 182717 INFO nova.virt.libvirt.driver [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Creating config drive at /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.config.rescue
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.336 182717 DEBUG oslo_concurrency.processutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqyp0ie6s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.462 182717 DEBUG oslo_concurrency.processutils [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqyp0ie6s" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.494 182717 DEBUG nova.compute.manager [req-61a5d074-d01e-4cc2-a09c-7a81f2003df3 req-252221d4-ccfb-4cee-9313-c469de622019 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.495 182717 DEBUG oslo_concurrency.lockutils [req-61a5d074-d01e-4cc2-a09c-7a81f2003df3 req-252221d4-ccfb-4cee-9313-c469de622019 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.495 182717 DEBUG oslo_concurrency.lockutils [req-61a5d074-d01e-4cc2-a09c-7a81f2003df3 req-252221d4-ccfb-4cee-9313-c469de622019 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.495 182717 DEBUG oslo_concurrency.lockutils [req-61a5d074-d01e-4cc2-a09c-7a81f2003df3 req-252221d4-ccfb-4cee-9313-c469de622019 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.496 182717 DEBUG nova.compute.manager [req-61a5d074-d01e-4cc2-a09c-7a81f2003df3 req-252221d4-ccfb-4cee-9313-c469de622019 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] No waiting events found dispatching network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.496 182717 WARNING nova.compute.manager [req-61a5d074-d01e-4cc2-a09c-7a81f2003df3 req-252221d4-ccfb-4cee-9313-c469de622019 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received unexpected event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 for instance with vm_state active and task_state rescuing.
Jan 22 00:09:59 compute-1 NetworkManager[54952]: <info>  [1769040599.5333] manager: (tap3e67543d-63): new Tun device (/org/freedesktop/NetworkManager/Devices/216)
Jan 22 00:09:59 compute-1 kernel: tap3e67543d-63: entered promiscuous mode
Jan 22 00:09:59 compute-1 ovn_controller[94841]: 2026-01-22T00:09:59Z|00461|binding|INFO|Claiming lport 3e67543d-6311-420b-878e-c3112fb771c2 for this chassis.
Jan 22 00:09:59 compute-1 ovn_controller[94841]: 2026-01-22T00:09:59Z|00462|binding|INFO|3e67543d-6311-420b-878e-c3112fb771c2: Claiming fa:16:3e:75:3e:f2 10.100.0.8
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.537 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:59 compute-1 ovn_controller[94841]: 2026-01-22T00:09:59Z|00463|binding|INFO|Setting lport 3e67543d-6311-420b-878e-c3112fb771c2 ovn-installed in OVS
Jan 22 00:09:59 compute-1 ovn_controller[94841]: 2026-01-22T00:09:59Z|00464|binding|INFO|Setting lport 3e67543d-6311-420b-878e-c3112fb771c2 up in Southbound
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.554 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:59.556 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:3e:f2 10.100.0.8'], port_security=['fa:16:3e:75:3e:f2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9c30e0e8-a030-4b0c-84ed-324f08bd5f1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b9315c6168049d79f20d630e51ffff3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '88dd83ff-b733-44b2-9065-8f39dcf83d23', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ada6e58f-6492-44c0-abaa-a00698af112f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=3e67543d-6311-420b-878e-c3112fb771c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.556 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:59.557 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 3e67543d-6311-420b-878e-c3112fb771c2 in datapath eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 bound to our chassis
Jan 22 00:09:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:59.558 104184 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:09:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:09:59.559 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[34ee5133-5f2b-4415-b6ab-f4a488dddddc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:59 compute-1 nova_compute[182713]: 2026-01-22 00:09:59.561 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:59 compute-1 systemd-udevd[228785]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:09:59 compute-1 NetworkManager[54952]: <info>  [1769040599.5920] device (tap3e67543d-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:09:59 compute-1 NetworkManager[54952]: <info>  [1769040599.5925] device (tap3e67543d-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:09:59 compute-1 systemd-machined[153970]: New machine qemu-50-instance-00000070.
Jan 22 00:09:59 compute-1 systemd[1]: Started Virtual Machine qemu-50-instance-00000070.
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.155 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Removed pending event for 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.156 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040600.1540565, 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.156 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] VM Resumed (Lifecycle Event)
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.174 182717 DEBUG nova.compute.manager [None req-a4430f0b-c83c-4449-afd7-67aadb4578b3 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.217 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.222 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.309 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.310 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040600.1547294, 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.311 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] VM Started (Lifecycle Event)
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.391 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.395 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.430 182717 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.430 182717 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.431 182717 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.431 182717 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.431 182717 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.444 182717 INFO nova.compute.manager [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Terminating instance
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.457 182717 DEBUG nova.compute.manager [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:10:00 compute-1 kernel: tapbc1908f8-b8 (unregistering): left promiscuous mode
Jan 22 00:10:00 compute-1 NetworkManager[54952]: <info>  [1769040600.4899] device (tapbc1908f8-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.497 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:00 compute-1 ovn_controller[94841]: 2026-01-22T00:10:00Z|00465|binding|INFO|Releasing lport bc1908f8-b8ef-40b4-9e46-8e8664065a89 from this chassis (sb_readonly=0)
Jan 22 00:10:00 compute-1 ovn_controller[94841]: 2026-01-22T00:10:00Z|00466|binding|INFO|Setting lport bc1908f8-b8ef-40b4-9e46-8e8664065a89 down in Southbound
Jan 22 00:10:00 compute-1 ovn_controller[94841]: 2026-01-22T00:10:00Z|00467|binding|INFO|Removing iface tapbc1908f8-b8 ovn-installed in OVS
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.504 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:00.523 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:56:97 10.100.0.12'], port_security=['fa:16:3e:45:56:97 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '07bda903-2298-433c-aa7d-9a50380e24f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=bc1908f8-b8ef-40b4-9e46-8e8664065a89) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:10:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:00.526 104184 INFO neutron.agent.ovn.metadata.agent [-] Port bc1908f8-b8ef-40b4-9e46-8e8664065a89 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e unbound from our chassis
Jan 22 00:10:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:00.528 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d94993bc-77ac-42d2-88cb-3b0110dff29e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:10:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:00.529 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ebaf37b4-2ac1-4aea-8f16-8bc00e326c95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:00.530 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace which is not needed anymore
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.532 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:00 compute-1 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 22 00:10:00 compute-1 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006f.scope: Consumed 13.322s CPU time.
Jan 22 00:10:00 compute-1 systemd-machined[153970]: Machine qemu-49-instance-0000006f terminated.
Jan 22 00:10:00 compute-1 NetworkManager[54952]: <info>  [1769040600.6966] manager: (tapbc1908f8-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/217)
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.754 182717 INFO nova.virt.libvirt.driver [-] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Instance destroyed successfully.
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.755 182717 DEBUG nova.objects.instance [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'resources' on Instance uuid 07bda903-2298-433c-aa7d-9a50380e24f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.812 182717 DEBUG nova.virt.libvirt.vif [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:09:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-554893725',display_name='tempest-DeleteServersTestJSON-server-554893725',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-554893725',id=111,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:09:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-4j2wmp3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',i
mage_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:09:45Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=07bda903-2298-433c-aa7d-9a50380e24f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.813 182717 DEBUG nova.network.os_vif_util [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.814 182717 DEBUG nova.network.os_vif_util [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.815 182717 DEBUG os_vif [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.821 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.822 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc1908f8-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.825 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.829 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.837 182717 INFO os_vif [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8')
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.838 182717 INFO nova.virt.libvirt.driver [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Deleting instance files /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1_del
Jan 22 00:10:00 compute-1 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228616]: [NOTICE]   (228620) : haproxy version is 2.8.14-c23fe91
Jan 22 00:10:00 compute-1 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228616]: [NOTICE]   (228620) : path to executable is /usr/sbin/haproxy
Jan 22 00:10:00 compute-1 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228616]: [WARNING]  (228620) : Exiting Master process...
Jan 22 00:10:00 compute-1 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228616]: [ALERT]    (228620) : Current worker (228622) exited with code 143 (Terminated)
Jan 22 00:10:00 compute-1 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228616]: [WARNING]  (228620) : All workers exited. Exiting... (0)
Jan 22 00:10:00 compute-1 systemd[1]: libpod-d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663.scope: Deactivated successfully.
Jan 22 00:10:00 compute-1 nova_compute[182713]: 2026-01-22 00:10:00.850 182717 INFO nova.virt.libvirt.driver [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Deletion of /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1_del complete
Jan 22 00:10:00 compute-1 podman[228824]: 2026-01-22 00:10:00.854562231 +0000 UTC m=+0.200239268 container died d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 00:10:00 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663-userdata-shm.mount: Deactivated successfully.
Jan 22 00:10:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-68fea6c97fdc13796f815f8a29b8880837379d9a1e19b310af76dbc461a7808a-merged.mount: Deactivated successfully.
Jan 22 00:10:00 compute-1 podman[228824]: 2026-01-22 00:10:00.906110736 +0000 UTC m=+0.251787773 container cleanup d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:10:00 compute-1 systemd[1]: libpod-conmon-d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663.scope: Deactivated successfully.
Jan 22 00:10:01 compute-1 podman[228868]: 2026-01-22 00:10:01.050183448 +0000 UTC m=+0.119849868 container remove d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:10:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:01.055 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c2893c8a-46bf-4e8a-a271-c657025bcd25]: (4, ('Thu Jan 22 12:10:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663)\nd83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663\nThu Jan 22 12:10:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (d83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663)\nd83941eb9971bdf8439e6f945d1321035820c608165a17438d22cab39ed4f663\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:01.057 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5e9f4be9-467f-4641-9abe-59f183e95203]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:01.058 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:10:01 compute-1 nova_compute[182713]: 2026-01-22 00:10:01.059 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-1 kernel: tapd94993bc-70: left promiscuous mode
Jan 22 00:10:01 compute-1 nova_compute[182713]: 2026-01-22 00:10:01.072 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:01.075 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0be0c572-7741-4011-b508-9971b4e28277]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:01.096 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[25258ad6-d416-4678-b4d3-aa95be89de76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:01.097 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[445d3a09-de0f-416b-b133-97db2f1b559f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:01.113 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b3118533-6f3f-4747-bc1d-56b6eba7a800]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515655, 'reachable_time': 43350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228883, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:01.116 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:10:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:01.116 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[8d876cb9-e2c1-4f9d-a7a6-bb7c0d2eb081]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-1 systemd[1]: run-netns-ovnmeta\x2dd94993bc\x2d77ac\x2d42d2\x2d88cb\x2d3b0110dff29e.mount: Deactivated successfully.
Jan 22 00:10:01 compute-1 nova_compute[182713]: 2026-01-22 00:10:01.183 182717 INFO nova.compute.manager [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Took 0.73 seconds to destroy the instance on the hypervisor.
Jan 22 00:10:01 compute-1 nova_compute[182713]: 2026-01-22 00:10:01.184 182717 DEBUG oslo.service.loopingcall [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:10:01 compute-1 nova_compute[182713]: 2026-01-22 00:10:01.184 182717 DEBUG nova.compute.manager [-] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:10:01 compute-1 nova_compute[182713]: 2026-01-22 00:10:01.185 182717 DEBUG nova.network.neutron [-] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:10:02 compute-1 podman[228884]: 2026-01-22 00:10:02.576868816 +0000 UTC m=+0.061044363 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 22 00:10:02 compute-1 podman[228885]: 2026-01-22 00:10:02.596396109 +0000 UTC m=+0.081572647 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:10:02 compute-1 nova_compute[182713]: 2026-01-22 00:10:02.638 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:03.018 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:03.018 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:03.018 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:03 compute-1 nova_compute[182713]: 2026-01-22 00:10:03.273 182717 DEBUG nova.compute.manager [req-a10e2e3e-75db-4db4-b240-e48576a99b66 req-23b87b12-9113-4f7c-99a2-45e4598dcd67 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:03 compute-1 nova_compute[182713]: 2026-01-22 00:10:03.273 182717 DEBUG oslo_concurrency.lockutils [req-a10e2e3e-75db-4db4-b240-e48576a99b66 req-23b87b12-9113-4f7c-99a2-45e4598dcd67 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:03 compute-1 nova_compute[182713]: 2026-01-22 00:10:03.273 182717 DEBUG oslo_concurrency.lockutils [req-a10e2e3e-75db-4db4-b240-e48576a99b66 req-23b87b12-9113-4f7c-99a2-45e4598dcd67 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:03 compute-1 nova_compute[182713]: 2026-01-22 00:10:03.274 182717 DEBUG oslo_concurrency.lockutils [req-a10e2e3e-75db-4db4-b240-e48576a99b66 req-23b87b12-9113-4f7c-99a2-45e4598dcd67 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:03 compute-1 nova_compute[182713]: 2026-01-22 00:10:03.274 182717 DEBUG nova.compute.manager [req-a10e2e3e-75db-4db4-b240-e48576a99b66 req-23b87b12-9113-4f7c-99a2-45e4598dcd67 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] No waiting events found dispatching network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:03 compute-1 nova_compute[182713]: 2026-01-22 00:10:03.274 182717 WARNING nova.compute.manager [req-a10e2e3e-75db-4db4-b240-e48576a99b66 req-23b87b12-9113-4f7c-99a2-45e4598dcd67 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received unexpected event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 for instance with vm_state rescued and task_state None.
Jan 22 00:10:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:04.047 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:10:04 compute-1 nova_compute[182713]: 2026-01-22 00:10:04.048 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:04.049 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:10:04 compute-1 nova_compute[182713]: 2026-01-22 00:10:04.392 182717 INFO nova.compute.manager [None req-6262e3ee-b5a1-41dc-86e2-a796ef60cc4b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Unrescuing
Jan 22 00:10:04 compute-1 nova_compute[182713]: 2026-01-22 00:10:04.393 182717 DEBUG oslo_concurrency.lockutils [None req-6262e3ee-b5a1-41dc-86e2-a796ef60cc4b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:10:04 compute-1 nova_compute[182713]: 2026-01-22 00:10:04.393 182717 DEBUG oslo_concurrency.lockutils [None req-6262e3ee-b5a1-41dc-86e2-a796ef60cc4b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquired lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:10:04 compute-1 nova_compute[182713]: 2026-01-22 00:10:04.393 182717 DEBUG nova.network.neutron [None req-6262e3ee-b5a1-41dc-86e2-a796ef60cc4b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:10:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:05.052 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:10:05 compute-1 nova_compute[182713]: 2026-01-22 00:10:05.825 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:05 compute-1 nova_compute[182713]: 2026-01-22 00:10:05.871 182717 DEBUG nova.network.neutron [-] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:10:05 compute-1 nova_compute[182713]: 2026-01-22 00:10:05.944 182717 INFO nova.compute.manager [-] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Took 4.76 seconds to deallocate network for instance.
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.089 182717 DEBUG nova.compute.manager [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.089 182717 DEBUG oslo_concurrency.lockutils [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.090 182717 DEBUG oslo_concurrency.lockutils [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.090 182717 DEBUG oslo_concurrency.lockutils [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.090 182717 DEBUG nova.compute.manager [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] No waiting events found dispatching network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.090 182717 WARNING nova.compute.manager [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received unexpected event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 for instance with vm_state rescued and task_state unrescuing.
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.090 182717 DEBUG nova.compute.manager [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-unplugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.091 182717 DEBUG oslo_concurrency.lockutils [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.091 182717 DEBUG oslo_concurrency.lockutils [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.091 182717 DEBUG oslo_concurrency.lockutils [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.091 182717 DEBUG nova.compute.manager [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] No waiting events found dispatching network-vif-unplugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.091 182717 WARNING nova.compute.manager [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received unexpected event network-vif-unplugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 for instance with vm_state deleted and task_state None.
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.092 182717 DEBUG nova.compute.manager [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.092 182717 DEBUG oslo_concurrency.lockutils [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.092 182717 DEBUG oslo_concurrency.lockutils [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.092 182717 DEBUG oslo_concurrency.lockutils [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.092 182717 DEBUG nova.compute.manager [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] No waiting events found dispatching network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.093 182717 WARNING nova.compute.manager [req-23cafa7a-f08b-4b66-bc79-5c23abbd93d7 req-4b37f4aa-d0a0-44e0-ae44-a32495c89fd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received unexpected event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 for instance with vm_state deleted and task_state None.
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.098 182717 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.099 182717 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.199 182717 DEBUG nova.compute.provider_tree [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.221 182717 DEBUG nova.scheduler.client.report [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.263 182717 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.361 182717 INFO nova.scheduler.client.report [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Deleted allocations for instance 07bda903-2298-433c-aa7d-9a50380e24f1
Jan 22 00:10:06 compute-1 nova_compute[182713]: 2026-01-22 00:10:06.484 182717 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:07 compute-1 nova_compute[182713]: 2026-01-22 00:10:07.641 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:08 compute-1 nova_compute[182713]: 2026-01-22 00:10:08.195 182717 DEBUG nova.compute.manager [req-f6792c25-dac6-4959-b408-24b72d0583ec req-1c57612c-4e9a-452a-80fa-08ed76b4246d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-deleted-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:08 compute-1 nova_compute[182713]: 2026-01-22 00:10:08.351 182717 DEBUG nova.network.neutron [None req-6262e3ee-b5a1-41dc-86e2-a796ef60cc4b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Updating instance_info_cache with network_info: [{"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:10:08 compute-1 nova_compute[182713]: 2026-01-22 00:10:08.539 182717 DEBUG oslo_concurrency.lockutils [None req-6262e3ee-b5a1-41dc-86e2-a796ef60cc4b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Releasing lock "refresh_cache-9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:10:08 compute-1 nova_compute[182713]: 2026-01-22 00:10:08.540 182717 DEBUG nova.objects.instance [None req-6262e3ee-b5a1-41dc-86e2-a796ef60cc4b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'flavor' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:10:08 compute-1 kernel: tap3e67543d-63 (unregistering): left promiscuous mode
Jan 22 00:10:08 compute-1 NetworkManager[54952]: <info>  [1769040608.7594] device (tap3e67543d-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:10:08 compute-1 nova_compute[182713]: 2026-01-22 00:10:08.802 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:08 compute-1 ovn_controller[94841]: 2026-01-22T00:10:08Z|00468|binding|INFO|Releasing lport 3e67543d-6311-420b-878e-c3112fb771c2 from this chassis (sb_readonly=0)
Jan 22 00:10:08 compute-1 ovn_controller[94841]: 2026-01-22T00:10:08Z|00469|binding|INFO|Setting lport 3e67543d-6311-420b-878e-c3112fb771c2 down in Southbound
Jan 22 00:10:08 compute-1 ovn_controller[94841]: 2026-01-22T00:10:08Z|00470|binding|INFO|Removing iface tap3e67543d-63 ovn-installed in OVS
Jan 22 00:10:08 compute-1 nova_compute[182713]: 2026-01-22 00:10:08.806 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:08.811 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:3e:f2 10.100.0.8'], port_security=['fa:16:3e:75:3e:f2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9c30e0e8-a030-4b0c-84ed-324f08bd5f1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b9315c6168049d79f20d630e51ffff3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '88dd83ff-b733-44b2-9065-8f39dcf83d23', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ada6e58f-6492-44c0-abaa-a00698af112f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=3e67543d-6311-420b-878e-c3112fb771c2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:10:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:08.813 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 3e67543d-6311-420b-878e-c3112fb771c2 in datapath eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 unbound from our chassis
Jan 22 00:10:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:08.814 104184 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:10:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:08.815 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[19016bad-48b8-471a-ba12-f2b564df930b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:08 compute-1 nova_compute[182713]: 2026-01-22 00:10:08.817 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:08 compute-1 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000070.scope: Deactivated successfully.
Jan 22 00:10:08 compute-1 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000070.scope: Consumed 9.281s CPU time.
Jan 22 00:10:08 compute-1 systemd-machined[153970]: Machine qemu-50-instance-00000070 terminated.
Jan 22 00:10:08 compute-1 nova_compute[182713]: 2026-01-22 00:10:08.956 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:08 compute-1 nova_compute[182713]: 2026-01-22 00:10:08.963 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:09 compute-1 nova_compute[182713]: 2026-01-22 00:10:09.022 182717 INFO nova.virt.libvirt.driver [-] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Instance destroyed successfully.
Jan 22 00:10:09 compute-1 nova_compute[182713]: 2026-01-22 00:10:09.024 182717 DEBUG nova.objects.instance [None req-6262e3ee-b5a1-41dc-86e2-a796ef60cc4b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:10:09 compute-1 kernel: tap3e67543d-63: entered promiscuous mode
Jan 22 00:10:09 compute-1 systemd-udevd[228929]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:10:09 compute-1 ovn_controller[94841]: 2026-01-22T00:10:09Z|00471|binding|INFO|Claiming lport 3e67543d-6311-420b-878e-c3112fb771c2 for this chassis.
Jan 22 00:10:09 compute-1 ovn_controller[94841]: 2026-01-22T00:10:09Z|00472|binding|INFO|3e67543d-6311-420b-878e-c3112fb771c2: Claiming fa:16:3e:75:3e:f2 10.100.0.8
Jan 22 00:10:09 compute-1 NetworkManager[54952]: <info>  [1769040609.3145] manager: (tap3e67543d-63): new Tun device (/org/freedesktop/NetworkManager/Devices/218)
Jan 22 00:10:09 compute-1 nova_compute[182713]: 2026-01-22 00:10:09.312 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:09 compute-1 ovn_controller[94841]: 2026-01-22T00:10:09Z|00473|binding|INFO|Setting lport 3e67543d-6311-420b-878e-c3112fb771c2 ovn-installed in OVS
Jan 22 00:10:09 compute-1 NetworkManager[54952]: <info>  [1769040609.3262] device (tap3e67543d-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:10:09 compute-1 NetworkManager[54952]: <info>  [1769040609.3280] device (tap3e67543d-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:10:09 compute-1 nova_compute[182713]: 2026-01-22 00:10:09.327 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:09 compute-1 nova_compute[182713]: 2026-01-22 00:10:09.330 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:09 compute-1 ovn_controller[94841]: 2026-01-22T00:10:09Z|00474|binding|INFO|Setting lport 3e67543d-6311-420b-878e-c3112fb771c2 up in Southbound
Jan 22 00:10:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:09.365 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:3e:f2 10.100.0.8'], port_security=['fa:16:3e:75:3e:f2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9c30e0e8-a030-4b0c-84ed-324f08bd5f1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b9315c6168049d79f20d630e51ffff3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '88dd83ff-b733-44b2-9065-8f39dcf83d23', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ada6e58f-6492-44c0-abaa-a00698af112f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=3e67543d-6311-420b-878e-c3112fb771c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:10:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:09.367 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 3e67543d-6311-420b-878e-c3112fb771c2 in datapath eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 bound to our chassis
Jan 22 00:10:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:09.370 104184 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:10:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:09.371 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[233f035e-35c0-41aa-bb72-0c5403117868]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:09 compute-1 systemd-machined[153970]: New machine qemu-51-instance-00000070.
Jan 22 00:10:09 compute-1 systemd[1]: Started Virtual Machine qemu-51-instance-00000070.
Jan 22 00:10:09 compute-1 nova_compute[182713]: 2026-01-22 00:10:09.955 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Removed pending event for 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:10:09 compute-1 nova_compute[182713]: 2026-01-22 00:10:09.956 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040609.954717, 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:10:09 compute-1 nova_compute[182713]: 2026-01-22 00:10:09.957 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] VM Resumed (Lifecycle Event)
Jan 22 00:10:09 compute-1 nova_compute[182713]: 2026-01-22 00:10:09.962 182717 DEBUG nova.compute.manager [None req-6262e3ee-b5a1-41dc-86e2-a796ef60cc4b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.090 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.097 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.139 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] During sync_power_state the instance has a pending task (unrescuing). Skip.
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.140 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040609.9588647, 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.140 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] VM Started (Lifecycle Event)
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.182 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.186 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.655 182717 DEBUG nova.compute.manager [req-e03353d4-6124-4785-b24b-34bbcbf72902 req-398bf9b4-fcd4-43e8-a143-a43a5f30fa74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-unplugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.657 182717 DEBUG oslo_concurrency.lockutils [req-e03353d4-6124-4785-b24b-34bbcbf72902 req-398bf9b4-fcd4-43e8-a143-a43a5f30fa74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.657 182717 DEBUG oslo_concurrency.lockutils [req-e03353d4-6124-4785-b24b-34bbcbf72902 req-398bf9b4-fcd4-43e8-a143-a43a5f30fa74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.657 182717 DEBUG oslo_concurrency.lockutils [req-e03353d4-6124-4785-b24b-34bbcbf72902 req-398bf9b4-fcd4-43e8-a143-a43a5f30fa74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.658 182717 DEBUG nova.compute.manager [req-e03353d4-6124-4785-b24b-34bbcbf72902 req-398bf9b4-fcd4-43e8-a143-a43a5f30fa74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] No waiting events found dispatching network-vif-unplugged-3e67543d-6311-420b-878e-c3112fb771c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.658 182717 WARNING nova.compute.manager [req-e03353d4-6124-4785-b24b-34bbcbf72902 req-398bf9b4-fcd4-43e8-a143-a43a5f30fa74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received unexpected event network-vif-unplugged-3e67543d-6311-420b-878e-c3112fb771c2 for instance with vm_state active and task_state None.
Jan 22 00:10:10 compute-1 nova_compute[182713]: 2026-01-22 00:10:10.828 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:12 compute-1 nova_compute[182713]: 2026-01-22 00:10:12.645 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:12 compute-1 nova_compute[182713]: 2026-01-22 00:10:12.788 182717 DEBUG oslo_concurrency.lockutils [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:12 compute-1 nova_compute[182713]: 2026-01-22 00:10:12.789 182717 DEBUG oslo_concurrency.lockutils [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:12 compute-1 nova_compute[182713]: 2026-01-22 00:10:12.790 182717 DEBUG oslo_concurrency.lockutils [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:12 compute-1 nova_compute[182713]: 2026-01-22 00:10:12.790 182717 DEBUG oslo_concurrency.lockutils [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:12 compute-1 nova_compute[182713]: 2026-01-22 00:10:12.790 182717 DEBUG oslo_concurrency.lockutils [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:12 compute-1 nova_compute[182713]: 2026-01-22 00:10:12.804 182717 INFO nova.compute.manager [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Terminating instance
Jan 22 00:10:12 compute-1 nova_compute[182713]: 2026-01-22 00:10:12.816 182717 DEBUG nova.compute.manager [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:10:12 compute-1 kernel: tap3e67543d-63 (unregistering): left promiscuous mode
Jan 22 00:10:12 compute-1 NetworkManager[54952]: <info>  [1769040612.8342] device (tap3e67543d-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:10:12 compute-1 nova_compute[182713]: 2026-01-22 00:10:12.843 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:12 compute-1 ovn_controller[94841]: 2026-01-22T00:10:12Z|00475|binding|INFO|Releasing lport 3e67543d-6311-420b-878e-c3112fb771c2 from this chassis (sb_readonly=0)
Jan 22 00:10:12 compute-1 ovn_controller[94841]: 2026-01-22T00:10:12Z|00476|binding|INFO|Setting lport 3e67543d-6311-420b-878e-c3112fb771c2 down in Southbound
Jan 22 00:10:12 compute-1 ovn_controller[94841]: 2026-01-22T00:10:12Z|00477|binding|INFO|Removing iface tap3e67543d-63 ovn-installed in OVS
Jan 22 00:10:12 compute-1 nova_compute[182713]: 2026-01-22 00:10:12.848 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:12.854 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:3e:f2 10.100.0.8'], port_security=['fa:16:3e:75:3e:f2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9c30e0e8-a030-4b0c-84ed-324f08bd5f1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b9315c6168049d79f20d630e51ffff3', 'neutron:revision_number': '7', 'neutron:security_group_ids': '88dd83ff-b733-44b2-9065-8f39dcf83d23', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ada6e58f-6492-44c0-abaa-a00698af112f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=3e67543d-6311-420b-878e-c3112fb771c2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:10:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:12.857 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 3e67543d-6311-420b-878e-c3112fb771c2 in datapath eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 unbound from our chassis
Jan 22 00:10:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:12.860 104184 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:10:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:10:12.861 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bd90d037-d816-4841-bf9e-b84dc386f204]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:12 compute-1 nova_compute[182713]: 2026-01-22 00:10:12.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:12 compute-1 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000070.scope: Deactivated successfully.
Jan 22 00:10:12 compute-1 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000070.scope: Consumed 3.438s CPU time.
Jan 22 00:10:12 compute-1 systemd-machined[153970]: Machine qemu-51-instance-00000070 terminated.
Jan 22 00:10:12 compute-1 podman[228983]: 2026-01-22 00:10:12.958603054 +0000 UTC m=+0.082055772 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:10:13 compute-1 NetworkManager[54952]: <info>  [1769040613.0436] manager: (tap3e67543d-63): new Tun device (/org/freedesktop/NetworkManager/Devices/219)
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.044 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.052 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.087 182717 INFO nova.virt.libvirt.driver [-] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Instance destroyed successfully.
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.088 182717 DEBUG nova.objects.instance [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'resources' on Instance uuid 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.128 182717 DEBUG nova.compute.manager [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.128 182717 DEBUG oslo_concurrency.lockutils [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.129 182717 DEBUG oslo_concurrency.lockutils [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.129 182717 DEBUG oslo_concurrency.lockutils [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.129 182717 DEBUG nova.compute.manager [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] No waiting events found dispatching network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.129 182717 WARNING nova.compute.manager [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received unexpected event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 for instance with vm_state active and task_state deleting.
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.129 182717 DEBUG nova.compute.manager [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.130 182717 DEBUG oslo_concurrency.lockutils [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.130 182717 DEBUG oslo_concurrency.lockutils [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.130 182717 DEBUG oslo_concurrency.lockutils [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.130 182717 DEBUG nova.compute.manager [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] No waiting events found dispatching network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.130 182717 WARNING nova.compute.manager [req-046e70c6-7a85-423a-8cff-09319f1c857a req-a7f0a8fa-c7ba-4645-b9de-085d398a7366 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received unexpected event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 for instance with vm_state active and task_state deleting.
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.264 182717 DEBUG nova.virt.libvirt.vif [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:09:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-635293581',display_name='tempest-ServerRescueTestJSON-server-635293581',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-635293581',id=112,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:10:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3b9315c6168049d79f20d630e51ffff3',ramdisk_id='',reservation_id='r-egqa46yk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-401787473',owner_user_name='tempest-ServerRescueTestJSON-401787473-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:10:10Z,user_data=None,user_id='8324d8ba232c476e925d31b7d5645a7a',uuid=9c30e0e8-a030-4b0c-84ed-324f08bd5f1b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.264 182717 DEBUG nova.network.os_vif_util [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converting VIF {"id": "3e67543d-6311-420b-878e-c3112fb771c2", "address": "fa:16:3e:75:3e:f2", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e67543d-63", "ovs_interfaceid": "3e67543d-6311-420b-878e-c3112fb771c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.266 182717 DEBUG nova.network.os_vif_util [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:75:3e:f2,bridge_name='br-int',has_traffic_filtering=True,id=3e67543d-6311-420b-878e-c3112fb771c2,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e67543d-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.267 182717 DEBUG os_vif [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:75:3e:f2,bridge_name='br-int',has_traffic_filtering=True,id=3e67543d-6311-420b-878e-c3112fb771c2,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e67543d-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.270 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.271 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e67543d-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.302 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.306 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.309 182717 INFO os_vif [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:75:3e:f2,bridge_name='br-int',has_traffic_filtering=True,id=3e67543d-6311-420b-878e-c3112fb771c2,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e67543d-63')
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.310 182717 INFO nova.virt.libvirt.driver [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Deleting instance files /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b_del
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.311 182717 INFO nova.virt.libvirt.driver [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Deletion of /var/lib/nova/instances/9c30e0e8-a030-4b0c-84ed-324f08bd5f1b_del complete
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.542 182717 INFO nova.compute.manager [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Took 0.73 seconds to destroy the instance on the hypervisor.
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.544 182717 DEBUG oslo.service.loopingcall [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.545 182717 DEBUG nova.compute.manager [-] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:10:13 compute-1 nova_compute[182713]: 2026-01-22 00:10:13.545 182717 DEBUG nova.network.neutron [-] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.504 182717 DEBUG nova.compute.manager [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.504 182717 DEBUG oslo_concurrency.lockutils [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.505 182717 DEBUG oslo_concurrency.lockutils [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.505 182717 DEBUG oslo_concurrency.lockutils [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.506 182717 DEBUG nova.compute.manager [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] No waiting events found dispatching network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.506 182717 WARNING nova.compute.manager [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received unexpected event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 for instance with vm_state active and task_state deleting.
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.507 182717 DEBUG nova.compute.manager [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-unplugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.507 182717 DEBUG oslo_concurrency.lockutils [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.508 182717 DEBUG oslo_concurrency.lockutils [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.508 182717 DEBUG oslo_concurrency.lockutils [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.509 182717 DEBUG nova.compute.manager [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] No waiting events found dispatching network-vif-unplugged-3e67543d-6311-420b-878e-c3112fb771c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.509 182717 DEBUG nova.compute.manager [req-af548de5-5520-4718-a896-7247be7a7b34 req-79cabac1-c25f-48d3-9709-c6aad0d78825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-unplugged-3e67543d-6311-420b-878e-c3112fb771c2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.753 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040600.7514968, 07bda903-2298-433c-aa7d-9a50380e24f1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.753 182717 INFO nova.compute.manager [-] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] VM Stopped (Lifecycle Event)
Jan 22 00:10:15 compute-1 nova_compute[182713]: 2026-01-22 00:10:15.919 182717 DEBUG nova.compute.manager [None req-d768598f-40d5-4973-8f07-217cc49bedd8 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:10:16 compute-1 nova_compute[182713]: 2026-01-22 00:10:16.372 182717 DEBUG nova.network.neutron [-] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:10:16 compute-1 nova_compute[182713]: 2026-01-22 00:10:16.431 182717 INFO nova.compute.manager [-] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Took 2.89 seconds to deallocate network for instance.
Jan 22 00:10:16 compute-1 nova_compute[182713]: 2026-01-22 00:10:16.592 182717 DEBUG nova.compute.manager [req-2591a8d7-9515-479e-9ead-d0eaf8500ee6 req-6081e190-01a7-4ed5-95e8-109efa46e952 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-deleted-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:16 compute-1 podman[229025]: 2026-01-22 00:10:16.623589903 +0000 UTC m=+0.102691538 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64)
Jan 22 00:10:16 compute-1 nova_compute[182713]: 2026-01-22 00:10:16.633 182717 DEBUG oslo_concurrency.lockutils [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:16 compute-1 nova_compute[182713]: 2026-01-22 00:10:16.634 182717 DEBUG oslo_concurrency.lockutils [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:16 compute-1 nova_compute[182713]: 2026-01-22 00:10:16.747 182717 DEBUG nova.compute.provider_tree [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:10:16 compute-1 nova_compute[182713]: 2026-01-22 00:10:16.767 182717 DEBUG nova.scheduler.client.report [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:10:17 compute-1 nova_compute[182713]: 2026-01-22 00:10:17.135 182717 DEBUG oslo_concurrency.lockutils [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.501s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:17 compute-1 nova_compute[182713]: 2026-01-22 00:10:17.164 182717 INFO nova.scheduler.client.report [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Deleted allocations for instance 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b
Jan 22 00:10:17 compute-1 nova_compute[182713]: 2026-01-22 00:10:17.318 182717 DEBUG oslo_concurrency.lockutils [None req-860d9213-848c-4a36-a866-28e9dd47d96a 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:17 compute-1 nova_compute[182713]: 2026-01-22 00:10:17.646 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:17 compute-1 nova_compute[182713]: 2026-01-22 00:10:17.682 182717 DEBUG nova.compute.manager [req-ec5baf1f-2476-4a5c-adab-8598251e9ab9 req-62ee60ab-51bc-413e-a26e-1be3a7fb8a7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:17 compute-1 nova_compute[182713]: 2026-01-22 00:10:17.683 182717 DEBUG oslo_concurrency.lockutils [req-ec5baf1f-2476-4a5c-adab-8598251e9ab9 req-62ee60ab-51bc-413e-a26e-1be3a7fb8a7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:17 compute-1 nova_compute[182713]: 2026-01-22 00:10:17.683 182717 DEBUG oslo_concurrency.lockutils [req-ec5baf1f-2476-4a5c-adab-8598251e9ab9 req-62ee60ab-51bc-413e-a26e-1be3a7fb8a7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:17 compute-1 nova_compute[182713]: 2026-01-22 00:10:17.683 182717 DEBUG oslo_concurrency.lockutils [req-ec5baf1f-2476-4a5c-adab-8598251e9ab9 req-62ee60ab-51bc-413e-a26e-1be3a7fb8a7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c30e0e8-a030-4b0c-84ed-324f08bd5f1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:17 compute-1 nova_compute[182713]: 2026-01-22 00:10:17.684 182717 DEBUG nova.compute.manager [req-ec5baf1f-2476-4a5c-adab-8598251e9ab9 req-62ee60ab-51bc-413e-a26e-1be3a7fb8a7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] No waiting events found dispatching network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:17 compute-1 nova_compute[182713]: 2026-01-22 00:10:17.684 182717 WARNING nova.compute.manager [req-ec5baf1f-2476-4a5c-adab-8598251e9ab9 req-62ee60ab-51bc-413e-a26e-1be3a7fb8a7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Received unexpected event network-vif-plugged-3e67543d-6311-420b-878e-c3112fb771c2 for instance with vm_state deleted and task_state None.
Jan 22 00:10:18 compute-1 nova_compute[182713]: 2026-01-22 00:10:18.303 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:19 compute-1 nova_compute[182713]: 2026-01-22 00:10:19.594 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:22 compute-1 nova_compute[182713]: 2026-01-22 00:10:22.694 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.878 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.878 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.878 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.878 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.878 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.878 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.878 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.878 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.879 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.879 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.879 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.879 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.879 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.879 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.879 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:10:22.880 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-1 nova_compute[182713]: 2026-01-22 00:10:23.304 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:27 compute-1 nova_compute[182713]: 2026-01-22 00:10:27.779 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:28 compute-1 nova_compute[182713]: 2026-01-22 00:10:28.086 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040613.0850961, 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:10:28 compute-1 nova_compute[182713]: 2026-01-22 00:10:28.086 182717 INFO nova.compute.manager [-] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] VM Stopped (Lifecycle Event)
Jan 22 00:10:28 compute-1 nova_compute[182713]: 2026-01-22 00:10:28.116 182717 DEBUG nova.compute.manager [None req-abe9a392-e129-4474-bee2-3c89f8948c09 - - - - - -] [instance: 9c30e0e8-a030-4b0c-84ed-324f08bd5f1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:10:28 compute-1 nova_compute[182713]: 2026-01-22 00:10:28.306 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:28 compute-1 podman[229047]: 2026-01-22 00:10:28.583690096 +0000 UTC m=+0.065959583 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:10:28 compute-1 podman[229046]: 2026-01-22 00:10:28.611440859 +0000 UTC m=+0.100036568 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 00:10:32 compute-1 nova_compute[182713]: 2026-01-22 00:10:32.783 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:33 compute-1 nova_compute[182713]: 2026-01-22 00:10:33.308 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:33 compute-1 podman[229097]: 2026-01-22 00:10:33.559923824 +0000 UTC m=+0.052251016 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:10:33 compute-1 podman[229098]: 2026-01-22 00:10:33.562407079 +0000 UTC m=+0.055043731 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:10:37 compute-1 nova_compute[182713]: 2026-01-22 00:10:37.784 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:38 compute-1 nova_compute[182713]: 2026-01-22 00:10:38.310 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:39 compute-1 nova_compute[182713]: 2026-01-22 00:10:39.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:40 compute-1 nova_compute[182713]: 2026-01-22 00:10:40.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:40 compute-1 nova_compute[182713]: 2026-01-22 00:10:40.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:40 compute-1 nova_compute[182713]: 2026-01-22 00:10:40.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:40 compute-1 nova_compute[182713]: 2026-01-22 00:10:40.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:10:41 compute-1 nova_compute[182713]: 2026-01-22 00:10:41.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:42 compute-1 nova_compute[182713]: 2026-01-22 00:10:42.786 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:42 compute-1 nova_compute[182713]: 2026-01-22 00:10:42.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:42 compute-1 nova_compute[182713]: 2026-01-22 00:10:42.926 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:42 compute-1 nova_compute[182713]: 2026-01-22 00:10:42.926 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:42 compute-1 nova_compute[182713]: 2026-01-22 00:10:42.926 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:42 compute-1 nova_compute[182713]: 2026-01-22 00:10:42.927 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:10:43 compute-1 nova_compute[182713]: 2026-01-22 00:10:43.118 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:10:43 compute-1 nova_compute[182713]: 2026-01-22 00:10:43.120 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5686MB free_disk=73.29581832885742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:10:43 compute-1 nova_compute[182713]: 2026-01-22 00:10:43.121 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:43 compute-1 nova_compute[182713]: 2026-01-22 00:10:43.121 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:43 compute-1 nova_compute[182713]: 2026-01-22 00:10:43.311 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:43 compute-1 nova_compute[182713]: 2026-01-22 00:10:43.377 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:10:43 compute-1 nova_compute[182713]: 2026-01-22 00:10:43.377 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:10:43 compute-1 nova_compute[182713]: 2026-01-22 00:10:43.437 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:10:43 compute-1 nova_compute[182713]: 2026-01-22 00:10:43.471 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:10:43 compute-1 podman[229140]: 2026-01-22 00:10:43.598881786 +0000 UTC m=+0.092498788 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:10:44 compute-1 nova_compute[182713]: 2026-01-22 00:10:44.542 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:10:44 compute-1 nova_compute[182713]: 2026-01-22 00:10:44.542 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:45 compute-1 nova_compute[182713]: 2026-01-22 00:10:45.542 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:45 compute-1 nova_compute[182713]: 2026-01-22 00:10:45.543 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:46 compute-1 nova_compute[182713]: 2026-01-22 00:10:46.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:46 compute-1 nova_compute[182713]: 2026-01-22 00:10:46.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:10:46 compute-1 nova_compute[182713]: 2026-01-22 00:10:46.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:10:47 compute-1 podman[229159]: 2026-01-22 00:10:47.571762442 +0000 UTC m=+0.068729547 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Jan 22 00:10:47 compute-1 nova_compute[182713]: 2026-01-22 00:10:47.828 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:48 compute-1 nova_compute[182713]: 2026-01-22 00:10:48.314 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:48 compute-1 nova_compute[182713]: 2026-01-22 00:10:48.430 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:10:52 compute-1 nova_compute[182713]: 2026-01-22 00:10:52.830 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:53 compute-1 nova_compute[182713]: 2026-01-22 00:10:53.316 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:57 compute-1 nova_compute[182713]: 2026-01-22 00:10:57.863 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:58 compute-1 nova_compute[182713]: 2026-01-22 00:10:58.318 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:59 compute-1 podman[229181]: 2026-01-22 00:10:59.597649141 +0000 UTC m=+0.087182486 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:10:59 compute-1 podman[229180]: 2026-01-22 00:10:59.59824106 +0000 UTC m=+0.092319953 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:11:02 compute-1 nova_compute[182713]: 2026-01-22 00:11:02.866 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:03.019 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:03.019 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:03.019 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:03 compute-1 nova_compute[182713]: 2026-01-22 00:11:03.320 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:04 compute-1 podman[229228]: 2026-01-22 00:11:04.558548755 +0000 UTC m=+0.056980250 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 22 00:11:04 compute-1 podman[229229]: 2026-01-22 00:11:04.59496014 +0000 UTC m=+0.078971007 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:11:06 compute-1 ovn_controller[94841]: 2026-01-22T00:11:06Z|00478|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 00:11:07 compute-1 nova_compute[182713]: 2026-01-22 00:11:07.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:08 compute-1 nova_compute[182713]: 2026-01-22 00:11:08.322 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:08.410 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:11:08 compute-1 nova_compute[182713]: 2026-01-22 00:11:08.411 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:08.411 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:11:12 compute-1 nova_compute[182713]: 2026-01-22 00:11:12.411 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:12 compute-1 nova_compute[182713]: 2026-01-22 00:11:12.412 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:12 compute-1 nova_compute[182713]: 2026-01-22 00:11:12.763 182717 DEBUG nova.compute.manager [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:11:12 compute-1 nova_compute[182713]: 2026-01-22 00:11:12.870 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:13 compute-1 nova_compute[182713]: 2026-01-22 00:11:13.325 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:14 compute-1 podman[229270]: 2026-01-22 00:11:14.597227371 +0000 UTC m=+0.079937108 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 00:11:15 compute-1 nova_compute[182713]: 2026-01-22 00:11:15.361 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:15 compute-1 nova_compute[182713]: 2026-01-22 00:11:15.362 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:15 compute-1 nova_compute[182713]: 2026-01-22 00:11:15.371 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:11:15 compute-1 nova_compute[182713]: 2026-01-22 00:11:15.372 182717 INFO nova.compute.claims [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:11:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:15.414 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:11:17 compute-1 nova_compute[182713]: 2026-01-22 00:11:17.873 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:18 compute-1 nova_compute[182713]: 2026-01-22 00:11:18.327 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:18 compute-1 podman[229290]: 2026-01-22 00:11:18.584349727 +0000 UTC m=+0.075425100 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Jan 22 00:11:19 compute-1 nova_compute[182713]: 2026-01-22 00:11:19.657 182717 DEBUG nova.compute.provider_tree [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:11:19 compute-1 nova_compute[182713]: 2026-01-22 00:11:19.825 182717 DEBUG nova.scheduler.client.report [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:11:19 compute-1 nova_compute[182713]: 2026-01-22 00:11:19.867 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 4.505s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:19 compute-1 nova_compute[182713]: 2026-01-22 00:11:19.868 182717 DEBUG nova.compute.manager [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:11:20 compute-1 nova_compute[182713]: 2026-01-22 00:11:20.939 182717 DEBUG nova.compute.manager [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:11:20 compute-1 nova_compute[182713]: 2026-01-22 00:11:20.939 182717 DEBUG nova.network.neutron [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.015 182717 INFO nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.104 182717 DEBUG nova.compute.manager [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.851 182717 DEBUG nova.policy [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.879 182717 DEBUG nova.compute.manager [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.880 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.880 182717 INFO nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Creating image(s)
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.881 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.881 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.882 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.896 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.966 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.968 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.969 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:21 compute-1 nova_compute[182713]: 2026-01-22 00:11:21.992 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.060 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.061 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.114 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.116 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.117 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.187 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.188 182717 DEBUG nova.virt.disk.api [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Checking if we can resize image /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.188 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.264 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.266 182717 DEBUG nova.virt.disk.api [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Cannot resize image /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.266 182717 DEBUG nova.objects.instance [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lazy-loading 'migration_context' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.306 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.307 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Ensure instance console log exists: /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.308 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.308 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.309 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:22 compute-1 nova_compute[182713]: 2026-01-22 00:11:22.876 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:23 compute-1 nova_compute[182713]: 2026-01-22 00:11:23.330 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:26 compute-1 nova_compute[182713]: 2026-01-22 00:11:26.199 182717 DEBUG nova.network.neutron [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Successfully created port: 7c209da8-98c9-48d0-b48e-ece9f863d933 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:11:27 compute-1 nova_compute[182713]: 2026-01-22 00:11:27.877 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:28 compute-1 nova_compute[182713]: 2026-01-22 00:11:28.333 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:29 compute-1 nova_compute[182713]: 2026-01-22 00:11:29.805 182717 DEBUG nova.network.neutron [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Successfully updated port: 7c209da8-98c9-48d0-b48e-ece9f863d933 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:11:30 compute-1 nova_compute[182713]: 2026-01-22 00:11:30.079 182717 DEBUG nova.compute.manager [req-130a49f5-2c50-4ce9-ba27-6a802d10b16e req-16dc2a9c-08d4-4962-aad0-8e28c82c5fdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-changed-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:11:30 compute-1 nova_compute[182713]: 2026-01-22 00:11:30.079 182717 DEBUG nova.compute.manager [req-130a49f5-2c50-4ce9-ba27-6a802d10b16e req-16dc2a9c-08d4-4962-aad0-8e28c82c5fdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Refreshing instance network info cache due to event network-changed-7c209da8-98c9-48d0-b48e-ece9f863d933. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:11:30 compute-1 nova_compute[182713]: 2026-01-22 00:11:30.080 182717 DEBUG oslo_concurrency.lockutils [req-130a49f5-2c50-4ce9-ba27-6a802d10b16e req-16dc2a9c-08d4-4962-aad0-8e28c82c5fdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:11:30 compute-1 nova_compute[182713]: 2026-01-22 00:11:30.081 182717 DEBUG oslo_concurrency.lockutils [req-130a49f5-2c50-4ce9-ba27-6a802d10b16e req-16dc2a9c-08d4-4962-aad0-8e28c82c5fdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:11:30 compute-1 nova_compute[182713]: 2026-01-22 00:11:30.081 182717 DEBUG nova.network.neutron [req-130a49f5-2c50-4ce9-ba27-6a802d10b16e req-16dc2a9c-08d4-4962-aad0-8e28c82c5fdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Refreshing network info cache for port 7c209da8-98c9-48d0-b48e-ece9f863d933 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:11:30 compute-1 nova_compute[182713]: 2026-01-22 00:11:30.254 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:11:30 compute-1 podman[229328]: 2026-01-22 00:11:30.565394206 +0000 UTC m=+0.051915827 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:11:30 compute-1 podman[229327]: 2026-01-22 00:11:30.600099349 +0000 UTC m=+0.090371163 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 00:11:30 compute-1 nova_compute[182713]: 2026-01-22 00:11:30.684 182717 DEBUG nova.network.neutron [req-130a49f5-2c50-4ce9-ba27-6a802d10b16e req-16dc2a9c-08d4-4962-aad0-8e28c82c5fdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:11:31 compute-1 nova_compute[182713]: 2026-01-22 00:11:31.511 182717 DEBUG nova.network.neutron [req-130a49f5-2c50-4ce9-ba27-6a802d10b16e req-16dc2a9c-08d4-4962-aad0-8e28c82c5fdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:11:31 compute-1 nova_compute[182713]: 2026-01-22 00:11:31.692 182717 DEBUG oslo_concurrency.lockutils [req-130a49f5-2c50-4ce9-ba27-6a802d10b16e req-16dc2a9c-08d4-4962-aad0-8e28c82c5fdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:11:31 compute-1 nova_compute[182713]: 2026-01-22 00:11:31.693 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquired lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:11:31 compute-1 nova_compute[182713]: 2026-01-22 00:11:31.693 182717 DEBUG nova.network.neutron [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:11:31 compute-1 nova_compute[182713]: 2026-01-22 00:11:31.976 182717 DEBUG nova.network.neutron [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:11:32 compute-1 nova_compute[182713]: 2026-01-22 00:11:32.879 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:33 compute-1 nova_compute[182713]: 2026-01-22 00:11:33.335 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:33 compute-1 nova_compute[182713]: 2026-01-22 00:11:33.647 182717 DEBUG nova.network.neutron [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updating instance_info_cache with network_info: [{"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.208 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Releasing lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.208 182717 DEBUG nova.compute.manager [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Instance network_info: |[{"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.211 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Start _get_guest_xml network_info=[{"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.216 182717 WARNING nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.264 182717 DEBUG nova.virt.libvirt.host [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.265 182717 DEBUG nova.virt.libvirt.host [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.268 182717 DEBUG nova.virt.libvirt.host [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.269 182717 DEBUG nova.virt.libvirt.host [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.270 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.270 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.271 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.271 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.271 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.272 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.272 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.272 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.272 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.272 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.273 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.273 182717 DEBUG nova.virt.hardware [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.276 182717 DEBUG nova.virt.libvirt.vif [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:11:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1857258152',display_name='tempest-ServerRescueTestJSONUnderV235-server-1857258152',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1857258152',id=116,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='54baf6b68ab146d49737e618b1e5b40e',ramdisk_id='',reservation_id='r-1ixyv6bz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-284679437',owner_user_name
='tempest-ServerRescueTestJSONUnderV235-284679437-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:11:21Z,user_data=None,user_id='3c4bd8a02cf045ad9703e01b44239806',uuid=cdd4fb44-07d8-4910-b2c1-32386ecffab8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.277 182717 DEBUG nova.network.os_vif_util [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Converting VIF {"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.277 182717 DEBUG nova.network.os_vif_util [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:6e:e0,bridge_name='br-int',has_traffic_filtering=True,id=7c209da8-98c9-48d0-b48e-ece9f863d933,network=Network(3b4ab8c5-3d45-4beb-9842-6290486d7c84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c209da8-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.278 182717 DEBUG nova.objects.instance [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lazy-loading 'pci_devices' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.323 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:11:35 compute-1 nova_compute[182713]:   <uuid>cdd4fb44-07d8-4910-b2c1-32386ecffab8</uuid>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   <name>instance-00000074</name>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerRescueTestJSONUnderV235-server-1857258152</nova:name>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:11:35</nova:creationTime>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:11:35 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:11:35 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:11:35 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:11:35 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:11:35 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:11:35 compute-1 nova_compute[182713]:         <nova:user uuid="3c4bd8a02cf045ad9703e01b44239806">tempest-ServerRescueTestJSONUnderV235-284679437-project-member</nova:user>
Jan 22 00:11:35 compute-1 nova_compute[182713]:         <nova:project uuid="54baf6b68ab146d49737e618b1e5b40e">tempest-ServerRescueTestJSONUnderV235-284679437</nova:project>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:11:35 compute-1 nova_compute[182713]:         <nova:port uuid="7c209da8-98c9-48d0-b48e-ece9f863d933">
Jan 22 00:11:35 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <system>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <entry name="serial">cdd4fb44-07d8-4910-b2c1-32386ecffab8</entry>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <entry name="uuid">cdd4fb44-07d8-4910-b2c1-32386ecffab8</entry>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     </system>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   <os>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   </os>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   <features>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   </features>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.config"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:3e:6e:e0"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <target dev="tap7c209da8-98"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/console.log" append="off"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <video>
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     </video>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:11:35 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:11:35 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:11:35 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:11:35 compute-1 nova_compute[182713]: </domain>
Jan 22 00:11:35 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.324 182717 DEBUG nova.compute.manager [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Preparing to wait for external event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.324 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.325 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.325 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.325 182717 DEBUG nova.virt.libvirt.vif [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:11:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1857258152',display_name='tempest-ServerRescueTestJSONUnderV235-server-1857258152',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1857258152',id=116,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='54baf6b68ab146d49737e618b1e5b40e',ramdisk_id='',reservation_id='r-1ixyv6bz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-284679437',owner
_user_name='tempest-ServerRescueTestJSONUnderV235-284679437-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:11:21Z,user_data=None,user_id='3c4bd8a02cf045ad9703e01b44239806',uuid=cdd4fb44-07d8-4910-b2c1-32386ecffab8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.326 182717 DEBUG nova.network.os_vif_util [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Converting VIF {"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.326 182717 DEBUG nova.network.os_vif_util [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:6e:e0,bridge_name='br-int',has_traffic_filtering=True,id=7c209da8-98c9-48d0-b48e-ece9f863d933,network=Network(3b4ab8c5-3d45-4beb-9842-6290486d7c84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c209da8-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.326 182717 DEBUG os_vif [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:6e:e0,bridge_name='br-int',has_traffic_filtering=True,id=7c209da8-98c9-48d0-b48e-ece9f863d933,network=Network(3b4ab8c5-3d45-4beb-9842-6290486d7c84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c209da8-98') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.327 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.327 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.328 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.330 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.330 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c209da8-98, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.331 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7c209da8-98, col_values=(('external_ids', {'iface-id': '7c209da8-98c9-48d0-b48e-ece9f863d933', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:6e:e0', 'vm-uuid': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.332 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:35 compute-1 NetworkManager[54952]: <info>  [1769040695.3332] manager: (tap7c209da8-98): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.336 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.337 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.338 182717 INFO os_vif [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:6e:e0,bridge_name='br-int',has_traffic_filtering=True,id=7c209da8-98c9-48d0-b48e-ece9f863d933,network=Network(3b4ab8c5-3d45-4beb-9842-6290486d7c84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c209da8-98')
Jan 22 00:11:35 compute-1 podman[229378]: 2026-01-22 00:11:35.564799543 +0000 UTC m=+0.051056431 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:11:35 compute-1 podman[229377]: 2026-01-22 00:11:35.586545621 +0000 UTC m=+0.075412579 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.699 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.700 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.700 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] No VIF found with MAC fa:16:3e:3e:6e:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:11:35 compute-1 nova_compute[182713]: 2026-01-22 00:11:35.701 182717 INFO nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Using config drive
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.060 182717 INFO nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Creating config drive at /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.config
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.071 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqg10e33_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.202 182717 DEBUG oslo_concurrency.processutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqg10e33_" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:36 compute-1 kernel: tap7c209da8-98: entered promiscuous mode
Jan 22 00:11:36 compute-1 NetworkManager[54952]: <info>  [1769040696.2561] manager: (tap7c209da8-98): new Tun device (/org/freedesktop/NetworkManager/Devices/221)
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.259 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:36 compute-1 ovn_controller[94841]: 2026-01-22T00:11:36Z|00479|binding|INFO|Claiming lport 7c209da8-98c9-48d0-b48e-ece9f863d933 for this chassis.
Jan 22 00:11:36 compute-1 ovn_controller[94841]: 2026-01-22T00:11:36Z|00480|binding|INFO|7c209da8-98c9-48d0-b48e-ece9f863d933: Claiming fa:16:3e:3e:6e:e0 10.100.0.4
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.264 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.268 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:36 compute-1 systemd-udevd[229435]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:11:36 compute-1 systemd-machined[153970]: New machine qemu-52-instance-00000074.
Jan 22 00:11:36 compute-1 NetworkManager[54952]: <info>  [1769040696.3023] device (tap7c209da8-98): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:11:36 compute-1 NetworkManager[54952]: <info>  [1769040696.3032] device (tap7c209da8-98): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:11:36 compute-1 systemd[1]: Started Virtual Machine qemu-52-instance-00000074.
Jan 22 00:11:36 compute-1 ovn_controller[94841]: 2026-01-22T00:11:36Z|00481|binding|INFO|Setting lport 7c209da8-98c9-48d0-b48e-ece9f863d933 ovn-installed in OVS
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.323 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:36 compute-1 ovn_controller[94841]: 2026-01-22T00:11:36Z|00482|binding|INFO|Setting lport 7c209da8-98c9-48d0-b48e-ece9f863d933 up in Southbound
Jan 22 00:11:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:36.367 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:6e:e0 10.100.0.4'], port_security=['fa:16:3e:3e:6e:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b4ab8c5-3d45-4beb-9842-6290486d7c84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54baf6b68ab146d49737e618b1e5b40e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cdb31eb0-cb40-4001-b611-8fbfe7d0fb3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9fae3c6d-aa86-498b-83cb-e3d2b95d2a8f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=7c209da8-98c9-48d0-b48e-ece9f863d933) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:11:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:36.370 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 7c209da8-98c9-48d0-b48e-ece9f863d933 in datapath 3b4ab8c5-3d45-4beb-9842-6290486d7c84 bound to our chassis
Jan 22 00:11:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:36.372 104184 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3b4ab8c5-3d45-4beb-9842-6290486d7c84 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:11:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:36.373 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[693f0405-39f5-4aea-ae33-18fdddce8a89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.863 182717 DEBUG nova.compute.manager [req-980e87ab-27c5-4d32-a131-2b6161cae130 req-96cdc668-23bf-487e-a000-32e1d2ff412f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.864 182717 DEBUG oslo_concurrency.lockutils [req-980e87ab-27c5-4d32-a131-2b6161cae130 req-96cdc668-23bf-487e-a000-32e1d2ff412f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.864 182717 DEBUG oslo_concurrency.lockutils [req-980e87ab-27c5-4d32-a131-2b6161cae130 req-96cdc668-23bf-487e-a000-32e1d2ff412f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.864 182717 DEBUG oslo_concurrency.lockutils [req-980e87ab-27c5-4d32-a131-2b6161cae130 req-96cdc668-23bf-487e-a000-32e1d2ff412f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:36 compute-1 nova_compute[182713]: 2026-01-22 00:11:36.864 182717 DEBUG nova.compute.manager [req-980e87ab-27c5-4d32-a131-2b6161cae130 req-96cdc668-23bf-487e-a000-32e1d2ff412f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Processing event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.174 182717 DEBUG nova.compute.manager [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.175 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040697.1744432, cdd4fb44-07d8-4910-b2c1-32386ecffab8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.176 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] VM Started (Lifecycle Event)
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.179 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.183 182717 INFO nova.virt.libvirt.driver [-] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Instance spawned successfully.
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.183 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.329 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.333 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.333 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.334 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.334 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.335 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.335 182717 DEBUG nova.virt.libvirt.driver [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.339 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.446 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.446 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040697.1753848, cdd4fb44-07d8-4910-b2c1-32386ecffab8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.446 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] VM Paused (Lifecycle Event)
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.473 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.477 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040697.178243, cdd4fb44-07d8-4910-b2c1-32386ecffab8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.477 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] VM Resumed (Lifecycle Event)
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.507 182717 INFO nova.compute.manager [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Took 15.63 seconds to spawn the instance on the hypervisor.
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.507 182717 DEBUG nova.compute.manager [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.509 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.524 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.556 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.639 182717 INFO nova.compute.manager [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Took 23.28 seconds to build instance.
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.670 182717 DEBUG oslo_concurrency.lockutils [None req-61aa2dd3-cac0-44c7-b98d-d2e0bbccee01 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 25.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:37 compute-1 nova_compute[182713]: 2026-01-22 00:11:37.880 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:39 compute-1 nova_compute[182713]: 2026-01-22 00:11:39.227 182717 DEBUG nova.compute.manager [req-c6f5a649-c64b-45b1-878e-e385b17bae10 req-a64db011-3112-41f6-a5c9-22d45ef2d4b6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:11:39 compute-1 nova_compute[182713]: 2026-01-22 00:11:39.228 182717 DEBUG oslo_concurrency.lockutils [req-c6f5a649-c64b-45b1-878e-e385b17bae10 req-a64db011-3112-41f6-a5c9-22d45ef2d4b6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:39 compute-1 nova_compute[182713]: 2026-01-22 00:11:39.228 182717 DEBUG oslo_concurrency.lockutils [req-c6f5a649-c64b-45b1-878e-e385b17bae10 req-a64db011-3112-41f6-a5c9-22d45ef2d4b6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:39 compute-1 nova_compute[182713]: 2026-01-22 00:11:39.228 182717 DEBUG oslo_concurrency.lockutils [req-c6f5a649-c64b-45b1-878e-e385b17bae10 req-a64db011-3112-41f6-a5c9-22d45ef2d4b6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:39 compute-1 nova_compute[182713]: 2026-01-22 00:11:39.228 182717 DEBUG nova.compute.manager [req-c6f5a649-c64b-45b1-878e-e385b17bae10 req-a64db011-3112-41f6-a5c9-22d45ef2d4b6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] No waiting events found dispatching network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:11:39 compute-1 nova_compute[182713]: 2026-01-22 00:11:39.229 182717 WARNING nova.compute.manager [req-c6f5a649-c64b-45b1-878e-e385b17bae10 req-a64db011-3112-41f6-a5c9-22d45ef2d4b6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received unexpected event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 for instance with vm_state active and task_state None.
Jan 22 00:11:39 compute-1 nova_compute[182713]: 2026-01-22 00:11:39.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:39 compute-1 nova_compute[182713]: 2026-01-22 00:11:39.986 182717 INFO nova.compute.manager [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Rescuing
Jan 22 00:11:39 compute-1 nova_compute[182713]: 2026-01-22 00:11:39.987 182717 DEBUG oslo_concurrency.lockutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:11:39 compute-1 nova_compute[182713]: 2026-01-22 00:11:39.987 182717 DEBUG oslo_concurrency.lockutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquired lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:11:39 compute-1 nova_compute[182713]: 2026-01-22 00:11:39.987 182717 DEBUG nova.network.neutron [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:11:40 compute-1 nova_compute[182713]: 2026-01-22 00:11:40.335 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:40 compute-1 nova_compute[182713]: 2026-01-22 00:11:40.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:40 compute-1 nova_compute[182713]: 2026-01-22 00:11:40.919 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:40 compute-1 nova_compute[182713]: 2026-01-22 00:11:40.919 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:40 compute-1 nova_compute[182713]: 2026-01-22 00:11:40.920 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:11:41 compute-1 nova_compute[182713]: 2026-01-22 00:11:41.917 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:42 compute-1 nova_compute[182713]: 2026-01-22 00:11:42.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:42 compute-1 nova_compute[182713]: 2026-01-22 00:11:42.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:42 compute-1 nova_compute[182713]: 2026-01-22 00:11:42.881 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.065 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.066 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.066 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.067 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.548 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.578 182717 DEBUG nova.network.neutron [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updating instance_info_cache with network_info: [{"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.624 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.625 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.681 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.851 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.853 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5582MB free_disk=73.29502868652344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.854 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:43 compute-1 nova_compute[182713]: 2026-01-22 00:11:43.854 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:44 compute-1 nova_compute[182713]: 2026-01-22 00:11:44.419 182717 DEBUG oslo_concurrency.lockutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Releasing lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:11:45 compute-1 nova_compute[182713]: 2026-01-22 00:11:45.198 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance cdd4fb44-07d8-4910-b2c1-32386ecffab8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:11:45 compute-1 nova_compute[182713]: 2026-01-22 00:11:45.199 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:11:45 compute-1 nova_compute[182713]: 2026-01-22 00:11:45.199 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:11:45 compute-1 nova_compute[182713]: 2026-01-22 00:11:45.263 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:11:45 compute-1 nova_compute[182713]: 2026-01-22 00:11:45.339 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:45 compute-1 nova_compute[182713]: 2026-01-22 00:11:45.396 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:11:45 compute-1 nova_compute[182713]: 2026-01-22 00:11:45.414 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:11:45 compute-1 podman[229459]: 2026-01-22 00:11:45.616148263 +0000 UTC m=+0.101905174 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:11:45 compute-1 nova_compute[182713]: 2026-01-22 00:11:45.733 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:11:45 compute-1 nova_compute[182713]: 2026-01-22 00:11:45.734 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:47 compute-1 nova_compute[182713]: 2026-01-22 00:11:47.734 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:47 compute-1 nova_compute[182713]: 2026-01-22 00:11:47.735 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:11:47 compute-1 nova_compute[182713]: 2026-01-22 00:11:47.735 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:11:47 compute-1 nova_compute[182713]: 2026-01-22 00:11:47.884 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:47 compute-1 nova_compute[182713]: 2026-01-22 00:11:47.916 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:11:47 compute-1 nova_compute[182713]: 2026-01-22 00:11:47.916 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:11:47 compute-1 nova_compute[182713]: 2026-01-22 00:11:47.916 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:11:47 compute-1 nova_compute[182713]: 2026-01-22 00:11:47.916 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:11:49 compute-1 podman[229493]: 2026-01-22 00:11:49.574290127 +0000 UTC m=+0.064305357 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Jan 22 00:11:50 compute-1 nova_compute[182713]: 2026-01-22 00:11:50.343 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:51 compute-1 nova_compute[182713]: 2026-01-22 00:11:51.484 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updating instance_info_cache with network_info: [{"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:11:51 compute-1 nova_compute[182713]: 2026-01-22 00:11:51.527 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:11:51 compute-1 nova_compute[182713]: 2026-01-22 00:11:51.528 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:11:51 compute-1 nova_compute[182713]: 2026-01-22 00:11:51.529 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:51 compute-1 nova_compute[182713]: 2026-01-22 00:11:51.529 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:52 compute-1 nova_compute[182713]: 2026-01-22 00:11:52.886 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:55 compute-1 nova_compute[182713]: 2026-01-22 00:11:55.347 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:55 compute-1 nova_compute[182713]: 2026-01-22 00:11:55.453 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:11:57 compute-1 kernel: tap7c209da8-98 (unregistering): left promiscuous mode
Jan 22 00:11:57 compute-1 NetworkManager[54952]: <info>  [1769040717.6535] device (tap7c209da8-98): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:11:57 compute-1 ovn_controller[94841]: 2026-01-22T00:11:57Z|00483|binding|INFO|Releasing lport 7c209da8-98c9-48d0-b48e-ece9f863d933 from this chassis (sb_readonly=0)
Jan 22 00:11:57 compute-1 ovn_controller[94841]: 2026-01-22T00:11:57Z|00484|binding|INFO|Setting lport 7c209da8-98c9-48d0-b48e-ece9f863d933 down in Southbound
Jan 22 00:11:57 compute-1 ovn_controller[94841]: 2026-01-22T00:11:57Z|00485|binding|INFO|Removing iface tap7c209da8-98 ovn-installed in OVS
Jan 22 00:11:57 compute-1 nova_compute[182713]: 2026-01-22 00:11:57.662 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:57 compute-1 nova_compute[182713]: 2026-01-22 00:11:57.678 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:57 compute-1 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000074.scope: Deactivated successfully.
Jan 22 00:11:57 compute-1 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000074.scope: Consumed 14.518s CPU time.
Jan 22 00:11:57 compute-1 systemd-machined[153970]: Machine qemu-52-instance-00000074 terminated.
Jan 22 00:11:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:57.771 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:6e:e0 10.100.0.4'], port_security=['fa:16:3e:3e:6e:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b4ab8c5-3d45-4beb-9842-6290486d7c84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54baf6b68ab146d49737e618b1e5b40e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cdb31eb0-cb40-4001-b611-8fbfe7d0fb3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9fae3c6d-aa86-498b-83cb-e3d2b95d2a8f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=7c209da8-98c9-48d0-b48e-ece9f863d933) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:11:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:57.773 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 7c209da8-98c9-48d0-b48e-ece9f863d933 in datapath 3b4ab8c5-3d45-4beb-9842-6290486d7c84 unbound from our chassis
Jan 22 00:11:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:57.774 104184 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3b4ab8c5-3d45-4beb-9842-6290486d7c84 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:11:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:11:57.776 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ee9d9f-65d1-425b-997a-292c5f39b37e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:11:57 compute-1 nova_compute[182713]: 2026-01-22 00:11:57.887 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:57 compute-1 nova_compute[182713]: 2026-01-22 00:11:57.893 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:57 compute-1 nova_compute[182713]: 2026-01-22 00:11:57.899 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.468 182717 INFO nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Instance shutdown successfully after 13 seconds.
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.474 182717 INFO nova.virt.libvirt.driver [-] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Instance destroyed successfully.
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.474 182717 DEBUG nova.objects.instance [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lazy-loading 'numa_topology' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.494 182717 INFO nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Attempting rescue
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.495 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.499 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.500 182717 INFO nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Creating image(s)
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.501 182717 DEBUG oslo_concurrency.lockutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.501 182717 DEBUG oslo_concurrency.lockutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.501 182717 DEBUG oslo_concurrency.lockutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.502 182717 DEBUG nova.objects.instance [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lazy-loading 'trusted_certs' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.671 182717 DEBUG oslo_concurrency.lockutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.672 182717 DEBUG oslo_concurrency.lockutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.684 182717 DEBUG oslo_concurrency.processutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.779 182717 DEBUG oslo_concurrency.processutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:58 compute-1 nova_compute[182713]: 2026-01-22 00:11:58.780 182717 DEBUG oslo_concurrency.processutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.211 182717 DEBUG oslo_concurrency.processutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.rescue" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.214 182717 DEBUG oslo_concurrency.lockutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.215 182717 DEBUG nova.objects.instance [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lazy-loading 'migration_context' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.605 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.606 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Start _get_guest_xml network_info=[{"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "vif_mac": "fa:16:3e:3e:6e:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.606 182717 DEBUG nova.objects.instance [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lazy-loading 'resources' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.677 182717 WARNING nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.694 182717 DEBUG nova.virt.libvirt.host [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.695 182717 DEBUG nova.virt.libvirt.host [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.700 182717 DEBUG nova.virt.libvirt.host [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.701 182717 DEBUG nova.virt.libvirt.host [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.702 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.703 182717 DEBUG nova.virt.hardware [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.703 182717 DEBUG nova.virt.hardware [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.703 182717 DEBUG nova.virt.hardware [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.703 182717 DEBUG nova.virt.hardware [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.704 182717 DEBUG nova.virt.hardware [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.704 182717 DEBUG nova.virt.hardware [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.704 182717 DEBUG nova.virt.hardware [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.704 182717 DEBUG nova.virt.hardware [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.704 182717 DEBUG nova.virt.hardware [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.705 182717 DEBUG nova.virt.hardware [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.705 182717 DEBUG nova.virt.hardware [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.705 182717 DEBUG nova.objects.instance [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lazy-loading 'vcpu_model' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.864 182717 DEBUG nova.virt.libvirt.vif [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:11:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1857258152',display_name='tempest-ServerRescueTestJSONUnderV235-server-1857258152',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1857258152',id=116,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:11:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='54baf6b68ab146d49737e618b1e5b40e',ramdisk_id='',reservation_id='r-1ixyv6bz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-284679437',owner_user_name='tempest-ServerRescueTestJSONUnderV235-284679437-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:11:37Z,user_data=None,user_id='3c4bd8a02cf045ad9703e01b44239806',uuid=cdd4fb44-07d8-4910-b2c1-32386ecffab8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "vif_mac": "fa:16:3e:3e:6e:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.865 182717 DEBUG nova.network.os_vif_util [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Converting VIF {"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "vif_mac": "fa:16:3e:3e:6e:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.868 182717 DEBUG nova.network.os_vif_util [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:6e:e0,bridge_name='br-int',has_traffic_filtering=True,id=7c209da8-98c9-48d0-b48e-ece9f863d933,network=Network(3b4ab8c5-3d45-4beb-9842-6290486d7c84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c209da8-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.869 182717 DEBUG nova.objects.instance [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lazy-loading 'pci_devices' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.899 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:11:59 compute-1 nova_compute[182713]:   <uuid>cdd4fb44-07d8-4910-b2c1-32386ecffab8</uuid>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   <name>instance-00000074</name>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerRescueTestJSONUnderV235-server-1857258152</nova:name>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:11:59</nova:creationTime>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:11:59 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:11:59 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:11:59 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:11:59 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:11:59 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:11:59 compute-1 nova_compute[182713]:         <nova:user uuid="3c4bd8a02cf045ad9703e01b44239806">tempest-ServerRescueTestJSONUnderV235-284679437-project-member</nova:user>
Jan 22 00:11:59 compute-1 nova_compute[182713]:         <nova:project uuid="54baf6b68ab146d49737e618b1e5b40e">tempest-ServerRescueTestJSONUnderV235-284679437</nova:project>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:11:59 compute-1 nova_compute[182713]:         <nova:port uuid="7c209da8-98c9-48d0-b48e-ece9f863d933">
Jan 22 00:11:59 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <system>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <entry name="serial">cdd4fb44-07d8-4910-b2c1-32386ecffab8</entry>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <entry name="uuid">cdd4fb44-07d8-4910-b2c1-32386ecffab8</entry>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     </system>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   <os>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   </os>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   <features>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   </features>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.rescue"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <target dev="vdb" bus="virtio"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.config.rescue"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:3e:6e:e0"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <target dev="tap7c209da8-98"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/console.log" append="off"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <video>
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     </video>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:11:59 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:11:59 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:11:59 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:11:59 compute-1 nova_compute[182713]: </domain>
Jan 22 00:11:59 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:11:59 compute-1 nova_compute[182713]: 2026-01-22 00:11:59.906 182717 INFO nova.virt.libvirt.driver [-] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Instance destroyed successfully.
Jan 22 00:12:00 compute-1 nova_compute[182713]: 2026-01-22 00:12:00.236 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:12:00 compute-1 nova_compute[182713]: 2026-01-22 00:12:00.236 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:12:00 compute-1 nova_compute[182713]: 2026-01-22 00:12:00.237 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:12:00 compute-1 nova_compute[182713]: 2026-01-22 00:12:00.237 182717 DEBUG nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] No VIF found with MAC fa:16:3e:3e:6e:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:12:00 compute-1 nova_compute[182713]: 2026-01-22 00:12:00.237 182717 INFO nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Using config drive
Jan 22 00:12:00 compute-1 nova_compute[182713]: 2026-01-22 00:12:00.345 182717 DEBUG nova.objects.instance [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lazy-loading 'ec2_ids' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:12:00 compute-1 nova_compute[182713]: 2026-01-22 00:12:00.352 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:00 compute-1 nova_compute[182713]: 2026-01-22 00:12:00.557 182717 DEBUG nova.objects.instance [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lazy-loading 'keypairs' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:12:01 compute-1 podman[229545]: 2026-01-22 00:12:01.587793404 +0000 UTC m=+0.067036053 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:12:01 compute-1 podman[229544]: 2026-01-22 00:12:01.613640797 +0000 UTC m=+0.104116012 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.247 182717 DEBUG nova.compute.manager [req-afa7afa8-99d3-4e71-9a16-75211fe69d8b req-1af824a4-ed77-4b61-9fda-4a603f7e9ad0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-vif-unplugged-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.248 182717 DEBUG oslo_concurrency.lockutils [req-afa7afa8-99d3-4e71-9a16-75211fe69d8b req-1af824a4-ed77-4b61-9fda-4a603f7e9ad0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.248 182717 DEBUG oslo_concurrency.lockutils [req-afa7afa8-99d3-4e71-9a16-75211fe69d8b req-1af824a4-ed77-4b61-9fda-4a603f7e9ad0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.249 182717 DEBUG oslo_concurrency.lockutils [req-afa7afa8-99d3-4e71-9a16-75211fe69d8b req-1af824a4-ed77-4b61-9fda-4a603f7e9ad0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.249 182717 DEBUG nova.compute.manager [req-afa7afa8-99d3-4e71-9a16-75211fe69d8b req-1af824a4-ed77-4b61-9fda-4a603f7e9ad0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] No waiting events found dispatching network-vif-unplugged-7c209da8-98c9-48d0-b48e-ece9f863d933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.249 182717 WARNING nova.compute.manager [req-afa7afa8-99d3-4e71-9a16-75211fe69d8b req-1af824a4-ed77-4b61-9fda-4a603f7e9ad0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received unexpected event network-vif-unplugged-7c209da8-98c9-48d0-b48e-ece9f863d933 for instance with vm_state active and task_state rescuing.
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.440 182717 INFO nova.virt.libvirt.driver [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Creating config drive at /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.config.rescue
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.445 182717 DEBUG oslo_concurrency.processutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx1s4cg6e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.570 182717 DEBUG oslo_concurrency.processutils [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx1s4cg6e" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:12:02 compute-1 kernel: tap7c209da8-98: entered promiscuous mode
Jan 22 00:12:02 compute-1 NetworkManager[54952]: <info>  [1769040722.6617] manager: (tap7c209da8-98): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.664 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:02 compute-1 ovn_controller[94841]: 2026-01-22T00:12:02Z|00486|binding|INFO|Claiming lport 7c209da8-98c9-48d0-b48e-ece9f863d933 for this chassis.
Jan 22 00:12:02 compute-1 ovn_controller[94841]: 2026-01-22T00:12:02Z|00487|binding|INFO|7c209da8-98c9-48d0-b48e-ece9f863d933: Claiming fa:16:3e:3e:6e:e0 10.100.0.4
Jan 22 00:12:02 compute-1 ovn_controller[94841]: 2026-01-22T00:12:02Z|00488|binding|INFO|Setting lport 7c209da8-98c9-48d0-b48e-ece9f863d933 ovn-installed in OVS
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.696 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.702 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:02 compute-1 systemd-udevd[229614]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:12:02 compute-1 systemd-machined[153970]: New machine qemu-53-instance-00000074.
Jan 22 00:12:02 compute-1 NetworkManager[54952]: <info>  [1769040722.7255] device (tap7c209da8-98): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:12:02 compute-1 NetworkManager[54952]: <info>  [1769040722.7264] device (tap7c209da8-98): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:12:02 compute-1 systemd[1]: Started Virtual Machine qemu-53-instance-00000074.
Jan 22 00:12:02 compute-1 ovn_controller[94841]: 2026-01-22T00:12:02Z|00489|binding|INFO|Setting lport 7c209da8-98c9-48d0-b48e-ece9f863d933 up in Southbound
Jan 22 00:12:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:02.735 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:6e:e0 10.100.0.4'], port_security=['fa:16:3e:3e:6e:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b4ab8c5-3d45-4beb-9842-6290486d7c84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54baf6b68ab146d49737e618b1e5b40e', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'cdb31eb0-cb40-4001-b611-8fbfe7d0fb3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9fae3c6d-aa86-498b-83cb-e3d2b95d2a8f, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=7c209da8-98c9-48d0-b48e-ece9f863d933) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:12:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:02.736 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 7c209da8-98c9-48d0-b48e-ece9f863d933 in datapath 3b4ab8c5-3d45-4beb-9842-6290486d7c84 bound to our chassis
Jan 22 00:12:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:02.737 104184 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3b4ab8c5-3d45-4beb-9842-6290486d7c84 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:12:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:02.738 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[677e561d-86fb-40db-b38d-9c0c8c664463]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:02 compute-1 nova_compute[182713]: 2026-01-22 00:12:02.931 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:03.019 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:03.020 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:03.020 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:03 compute-1 nova_compute[182713]: 2026-01-22 00:12:03.442 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Removed pending event for cdd4fb44-07d8-4910-b2c1-32386ecffab8 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:12:03 compute-1 nova_compute[182713]: 2026-01-22 00:12:03.444 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040723.4420862, cdd4fb44-07d8-4910-b2c1-32386ecffab8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:12:03 compute-1 nova_compute[182713]: 2026-01-22 00:12:03.445 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] VM Resumed (Lifecycle Event)
Jan 22 00:12:03 compute-1 nova_compute[182713]: 2026-01-22 00:12:03.677 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:12:03 compute-1 nova_compute[182713]: 2026-01-22 00:12:03.681 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:12:03 compute-1 nova_compute[182713]: 2026-01-22 00:12:03.814 182717 DEBUG nova.compute.manager [None req-389c1e1c-25e9-43bb-a353-66f4730c00d7 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:12:03 compute-1 nova_compute[182713]: 2026-01-22 00:12:03.833 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 22 00:12:03 compute-1 nova_compute[182713]: 2026-01-22 00:12:03.834 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040723.4436712, cdd4fb44-07d8-4910-b2c1-32386ecffab8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:12:03 compute-1 nova_compute[182713]: 2026-01-22 00:12:03.834 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] VM Started (Lifecycle Event)
Jan 22 00:12:03 compute-1 nova_compute[182713]: 2026-01-22 00:12:03.968 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:12:03 compute-1 nova_compute[182713]: 2026-01-22 00:12:03.973 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.433 182717 DEBUG nova.compute.manager [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.433 182717 DEBUG oslo_concurrency.lockutils [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.434 182717 DEBUG oslo_concurrency.lockutils [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.434 182717 DEBUG oslo_concurrency.lockutils [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.434 182717 DEBUG nova.compute.manager [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] No waiting events found dispatching network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.435 182717 WARNING nova.compute.manager [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received unexpected event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 for instance with vm_state rescued and task_state None.
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.435 182717 DEBUG nova.compute.manager [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.436 182717 DEBUG oslo_concurrency.lockutils [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.436 182717 DEBUG oslo_concurrency.lockutils [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.436 182717 DEBUG oslo_concurrency.lockutils [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.437 182717 DEBUG nova.compute.manager [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] No waiting events found dispatching network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:12:04 compute-1 nova_compute[182713]: 2026-01-22 00:12:04.437 182717 WARNING nova.compute.manager [req-72e44d7e-b66d-4eb4-ac46-ba6dd91d6ac3 req-294210d2-c95c-4720-b3d4-f12c9b4d0550 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received unexpected event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 for instance with vm_state rescued and task_state None.
Jan 22 00:12:05 compute-1 nova_compute[182713]: 2026-01-22 00:12:05.398 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:06 compute-1 podman[229632]: 2026-01-22 00:12:06.557069686 +0000 UTC m=+0.053136845 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:12:06 compute-1 podman[229631]: 2026-01-22 00:12:06.565386311 +0000 UTC m=+0.060643935 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:12:06 compute-1 nova_compute[182713]: 2026-01-22 00:12:06.733 182717 DEBUG nova.compute.manager [req-8550335a-86ea-4fa2-b16d-4d47ad77d33e req-285e40af-81d2-489c-b415-26e06dd03e5c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:06 compute-1 nova_compute[182713]: 2026-01-22 00:12:06.734 182717 DEBUG oslo_concurrency.lockutils [req-8550335a-86ea-4fa2-b16d-4d47ad77d33e req-285e40af-81d2-489c-b415-26e06dd03e5c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:06 compute-1 nova_compute[182713]: 2026-01-22 00:12:06.735 182717 DEBUG oslo_concurrency.lockutils [req-8550335a-86ea-4fa2-b16d-4d47ad77d33e req-285e40af-81d2-489c-b415-26e06dd03e5c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:06 compute-1 nova_compute[182713]: 2026-01-22 00:12:06.735 182717 DEBUG oslo_concurrency.lockutils [req-8550335a-86ea-4fa2-b16d-4d47ad77d33e req-285e40af-81d2-489c-b415-26e06dd03e5c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:06 compute-1 nova_compute[182713]: 2026-01-22 00:12:06.736 182717 DEBUG nova.compute.manager [req-8550335a-86ea-4fa2-b16d-4d47ad77d33e req-285e40af-81d2-489c-b415-26e06dd03e5c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] No waiting events found dispatching network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:12:06 compute-1 nova_compute[182713]: 2026-01-22 00:12:06.736 182717 WARNING nova.compute.manager [req-8550335a-86ea-4fa2-b16d-4d47ad77d33e req-285e40af-81d2-489c-b415-26e06dd03e5c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received unexpected event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 for instance with vm_state rescued and task_state None.
Jan 22 00:12:07 compute-1 nova_compute[182713]: 2026-01-22 00:12:07.933 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:08.973 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:12:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:08.974 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:12:08 compute-1 nova_compute[182713]: 2026-01-22 00:12:08.974 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:09.975 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:12:10 compute-1 nova_compute[182713]: 2026-01-22 00:12:10.442 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:12 compute-1 nova_compute[182713]: 2026-01-22 00:12:12.042 182717 DEBUG nova.compute.manager [req-792d14a1-2e3b-4341-b8c9-233d96d7ce3e req-7cbf3b10-4198-4e4c-aa5e-0ebf25b7b586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-changed-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:12 compute-1 nova_compute[182713]: 2026-01-22 00:12:12.044 182717 DEBUG nova.compute.manager [req-792d14a1-2e3b-4341-b8c9-233d96d7ce3e req-7cbf3b10-4198-4e4c-aa5e-0ebf25b7b586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Refreshing instance network info cache due to event network-changed-7c209da8-98c9-48d0-b48e-ece9f863d933. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:12:12 compute-1 nova_compute[182713]: 2026-01-22 00:12:12.045 182717 DEBUG oslo_concurrency.lockutils [req-792d14a1-2e3b-4341-b8c9-233d96d7ce3e req-7cbf3b10-4198-4e4c-aa5e-0ebf25b7b586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:12:12 compute-1 nova_compute[182713]: 2026-01-22 00:12:12.046 182717 DEBUG oslo_concurrency.lockutils [req-792d14a1-2e3b-4341-b8c9-233d96d7ce3e req-7cbf3b10-4198-4e4c-aa5e-0ebf25b7b586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:12:12 compute-1 nova_compute[182713]: 2026-01-22 00:12:12.046 182717 DEBUG nova.network.neutron [req-792d14a1-2e3b-4341-b8c9-233d96d7ce3e req-7cbf3b10-4198-4e4c-aa5e-0ebf25b7b586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Refreshing network info cache for port 7c209da8-98c9-48d0-b48e-ece9f863d933 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:12:12 compute-1 nova_compute[182713]: 2026-01-22 00:12:12.995 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:15 compute-1 nova_compute[182713]: 2026-01-22 00:12:15.445 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:16 compute-1 podman[229686]: 2026-01-22 00:12:16.593203546 +0000 UTC m=+0.083641411 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:12:17 compute-1 nova_compute[182713]: 2026-01-22 00:12:17.333 182717 DEBUG nova.network.neutron [req-792d14a1-2e3b-4341-b8c9-233d96d7ce3e req-7cbf3b10-4198-4e4c-aa5e-0ebf25b7b586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updated VIF entry in instance network info cache for port 7c209da8-98c9-48d0-b48e-ece9f863d933. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:12:17 compute-1 nova_compute[182713]: 2026-01-22 00:12:17.333 182717 DEBUG nova.network.neutron [req-792d14a1-2e3b-4341-b8c9-233d96d7ce3e req-7cbf3b10-4198-4e4c-aa5e-0ebf25b7b586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updating instance_info_cache with network_info: [{"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:17 compute-1 nova_compute[182713]: 2026-01-22 00:12:17.511 182717 DEBUG oslo_concurrency.lockutils [req-792d14a1-2e3b-4341-b8c9-233d96d7ce3e req-7cbf3b10-4198-4e4c-aa5e-0ebf25b7b586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:12:17 compute-1 nova_compute[182713]: 2026-01-22 00:12:17.997 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:19 compute-1 nova_compute[182713]: 2026-01-22 00:12:19.759 182717 DEBUG nova.compute.manager [req-7efd1648-2c07-4e76-9a1e-48c1f244439b req-1f43b490-454e-40dc-88eb-20facf612f50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-changed-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:19 compute-1 nova_compute[182713]: 2026-01-22 00:12:19.760 182717 DEBUG nova.compute.manager [req-7efd1648-2c07-4e76-9a1e-48c1f244439b req-1f43b490-454e-40dc-88eb-20facf612f50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Refreshing instance network info cache due to event network-changed-7c209da8-98c9-48d0-b48e-ece9f863d933. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:12:19 compute-1 nova_compute[182713]: 2026-01-22 00:12:19.760 182717 DEBUG oslo_concurrency.lockutils [req-7efd1648-2c07-4e76-9a1e-48c1f244439b req-1f43b490-454e-40dc-88eb-20facf612f50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:12:19 compute-1 nova_compute[182713]: 2026-01-22 00:12:19.760 182717 DEBUG oslo_concurrency.lockutils [req-7efd1648-2c07-4e76-9a1e-48c1f244439b req-1f43b490-454e-40dc-88eb-20facf612f50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:12:19 compute-1 nova_compute[182713]: 2026-01-22 00:12:19.760 182717 DEBUG nova.network.neutron [req-7efd1648-2c07-4e76-9a1e-48c1f244439b req-1f43b490-454e-40dc-88eb-20facf612f50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Refreshing network info cache for port 7c209da8-98c9-48d0-b48e-ece9f863d933 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:12:20 compute-1 nova_compute[182713]: 2026-01-22 00:12:20.448 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:20 compute-1 podman[229708]: 2026-01-22 00:12:20.568061073 +0000 UTC m=+0.065881976 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_id=openstack_network_exporter, name=ubi9-minimal)
Jan 22 00:12:21 compute-1 nova_compute[182713]: 2026-01-22 00:12:21.218 182717 DEBUG nova.network.neutron [req-7efd1648-2c07-4e76-9a1e-48c1f244439b req-1f43b490-454e-40dc-88eb-20facf612f50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updated VIF entry in instance network info cache for port 7c209da8-98c9-48d0-b48e-ece9f863d933. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:12:21 compute-1 nova_compute[182713]: 2026-01-22 00:12:21.219 182717 DEBUG nova.network.neutron [req-7efd1648-2c07-4e76-9a1e-48c1f244439b req-1f43b490-454e-40dc-88eb-20facf612f50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updating instance_info_cache with network_info: [{"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:21 compute-1 nova_compute[182713]: 2026-01-22 00:12:21.343 182717 DEBUG oslo_concurrency.lockutils [req-7efd1648-2c07-4e76-9a1e-48c1f244439b req-1f43b490-454e-40dc-88eb-20facf612f50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.880 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000074', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '54baf6b68ab146d49737e618b1e5b40e', 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'hostId': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.881 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.933 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.read.requests volume: 960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.934 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.read.requests volume: 453 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.934 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b24b2c4b-9490-492d-b605-45f577b298ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 960, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vda', 'timestamp': '2026-01-22T00:12:22.881991', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06432bca-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '3541e995aef54a9e1a4d1270a4b30f3d811603b4af927d7db71d4d47e7b6c0a4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 453, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 
'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vdb', 'timestamp': '2026-01-22T00:12:22.881991', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '0643431c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '2dd7fd0276f8278cb85aac09d095a522efd469505285e8515c890536fd02e56b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-sda', 'timestamp': '2026-01-22T00:12:22.881991', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06434e70-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '56331b606378fcfa0ec67a5739bda5967647665a835e09b36b4f90a51895ee41'}]}, 'timestamp': '2026-01-22 00:12:22.934714', '_unique_id': 'bcff5b51c3cb4d99857ffcb5b96a5a99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.937 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.938 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.957 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.957 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.957 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34ffacd6-0811-4b60-ba3a-368b5107e059', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vda', 'timestamp': '2026-01-22T00:12:22.938647', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0646ce60-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.645887384, 'message_signature': 'b07c08c339342f26f5bc3f6f5fb81132784e7711d0fd69380a950c8784b8f3f5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 
'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vdb', 'timestamp': '2026-01-22T00:12:22.938647', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '0646da0e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.645887384, 'message_signature': 'e79036e953ed0a1fae492c1b41278a20cf61bf9bf68a2110f2cad35f1341a829'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-sda', 'timestamp': '2026-01-22T00:12:22.938647', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0646e698-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.645887384, 'message_signature': '717f62a5a29f6197d5c70db2131dedaa91abf95814ea50fc896eee86ba0eb281'}]}, 'timestamp': '2026-01-22 00:12:22.958202', '_unique_id': '74dc2853ccb2419583aedcea5b9e3765'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.960 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.960 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.usage volume: 196616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.961 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.961 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a176299-7d44-46a8-8de2-404212e898f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196616, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vda', 'timestamp': '2026-01-22T00:12:22.960917', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '064760a0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.645887384, 'message_signature': '6f27f0750d45ebe39839fd2b2d55beab6ab95994481639a4b885c699b4cf11b8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vdb', 'timestamp': '2026-01-22T00:12:22.960917', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '06476f00-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.645887384, 'message_signature': 'e054a8258a8cb5e608e18fe4af010cb156873c60d912b4465a614f5a3e20b03a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-sda', 'timestamp': '2026-01-22T00:12:22.960917', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06477770-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.645887384, 'message_signature': 'ec8cbd45dc9ec9bfe08cf483cdeb92047ad1c8acad3a25cb0a40c2043c44fb50'}]}, 'timestamp': '2026-01-22 00:12:22.961904', '_unique_id': '4bd6c0a74c59449181b2d17b999f3935'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.963 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.963 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.963 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1857258152>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1857258152>]
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.964 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.964 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.964 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1857258152>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1857258152>]
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.964 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.967 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cdd4fb44-07d8-4910-b2c1-32386ecffab8 / tap7c209da8-98 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.967 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48825dbb-2617-4d96-96b6-ee8b3fab2d73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'instance-00000074-cdd4fb44-07d8-4910-b2c1-32386ecffab8-tap7c209da8-98', 'timestamp': '2026-01-22T00:12:22.964558', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'tap7c209da8-98', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:6e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7c209da8-98'}, 'message_id': '06486af4-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.67179735, 'message_signature': '9a8400c1b39f12576c1a4109672ab9f7c8aa0a54c04dd0811e2608ea6d14f1b4'}]}, 'timestamp': '2026-01-22 00:12:22.968156', '_unique_id': 'e64884b470ca4fdba3ef1f9bd9b9c128'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.969 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.970 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.970 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.write.bytes volume: 249856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.970 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '110a12c5-00cd-4414-aaa6-5cef102ce304', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vda', 'timestamp': '2026-01-22T00:12:22.970054', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0648c0bc-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '915f2b3995bc1147887517a3c6b7fe3a48ef635d42f2fb18f7e7fe14fcc503a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 249856, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vdb', 'timestamp': '2026-01-22T00:12:22.970054', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '0648c92c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '2e79cd6d71c5d74d3bf4a899c2192969fadf75576cac5af9596815c2aebde635'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-sda', 'timestamp': '2026-01-22T00:12:22.970054', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0648d25a-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '9ba70fc1b4691f2e0d7bb1362ffbb390e5f6eb78bd1b5ead517e099ddb28d5fe'}]}, 'timestamp': '2026-01-22 00:12:22.970734', '_unique_id': '1c4ca36ed3ac46378745996de9854bf5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.971 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/network.incoming.bytes volume: 532 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c855e351-f4d3-4679-ba59-cc35221c3826', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 532, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'instance-00000074-cdd4fb44-07d8-4910-b2c1-32386ecffab8-tap7c209da8-98', 'timestamp': '2026-01-22T00:12:22.971948', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'tap7c209da8-98', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:6e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7c209da8-98'}, 'message_id': '06490a9a-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.67179735, 'message_signature': 'be731f621bc2f23c4b786400a82d5cbacfa7a7210e197713521975f74d33365c'}]}, 'timestamp': '2026-01-22 00:12:22.972183', '_unique_id': '27e0218bc6bf40939ee2cc7aca5acea2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.972 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.973 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.973 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.capacity volume: 117440512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.973 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.973 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '512e596f-52eb-4fd0-b310-caa42aa6ac88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 117440512, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vda', 'timestamp': '2026-01-22T00:12:22.973247', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06493d1c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.645887384, 'message_signature': '7db5f929910420d315d9b30298ea6106b6a684ead77fa22372802f7b2ffc4f0b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vdb', 'timestamp': '2026-01-22T00:12:22.973247', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '064944ec-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.645887384, 'message_signature': '7085db4523d7a778e45fa09824763a320419d8bbb647d1295e5675524e9fa7cf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-sda', 'timestamp': '2026-01-22T00:12:22.973247', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06494e06-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.645887384, 'message_signature': '9edaedb97f6f6795ce88fe82961477161fcfef2b59153fd066112f31ef8e739e'}]}, 'timestamp': '2026-01-22 00:12:22.973931', '_unique_id': 'a5961286a4704e5cbf3028c92ee826e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.974 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/network.incoming.packets volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55ecf686-5724-4a7f-93a2-1c9bbc926708', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 6, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'instance-00000074-cdd4fb44-07d8-4910-b2c1-32386ecffab8-tap7c209da8-98', 'timestamp': '2026-01-22T00:12:22.975139', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'tap7c209da8-98', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:6e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7c209da8-98'}, 'message_id': '06498920-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.67179735, 'message_signature': '274c51008c7ef57ece576dbe3fad3dce54f30ce2116440b980dd62d7e1b39549'}]}, 'timestamp': '2026-01-22 00:12:22.975466', '_unique_id': '7bd6173a42404aa9a95dea5e4561f48a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.975 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.976 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.976 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f0e4c8b-9c9f-443f-8670-0de5010f8179', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'instance-00000074-cdd4fb44-07d8-4910-b2c1-32386ecffab8-tap7c209da8-98', 'timestamp': '2026-01-22T00:12:22.976594', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'tap7c209da8-98', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:6e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7c209da8-98'}, 'message_id': '0649c110-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.67179735, 'message_signature': 'a20b036bd4d9bc35b93ae7918e586a84126433d8002e4a219b53dec53fc7ce5d'}]}, 'timestamp': '2026-01-22 00:12:22.976911', '_unique_id': 'bd580f33446942c2911b3f5db842f541'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.977 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.978 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.978 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.write.latency volume: 26394135 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.978 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae655442-8364-4427-8240-212f972e8b3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vda', 'timestamp': '2026-01-22T00:12:22.978054', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0649f9f0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '880927281fe31f9ebda40735d259fa403c1c017466cbb0a4381afade76449cab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26394135, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vdb', 'timestamp': '2026-01-22T00:12:22.978054', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '064a03dc-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': 'e431f5749cc154d129591b94a964078f60fb79360835b3f1b5282a5f0151a683'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-sda', 'timestamp': '2026-01-22T00:12:22.978054', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '064a0c56-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '1e9d9bd1fd58a3beb9f428c6c4f1eaf9b47981a625b6daa6581d37b43fc086ae'}]}, 'timestamp': '2026-01-22 00:12:22.978777', '_unique_id': '612eb619bc9a4236bf3a5a6e283edac0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.979 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62ce7d99-500b-4ba8-9c0e-d5eb46016eba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'instance-00000074-cdd4fb44-07d8-4910-b2c1-32386ecffab8-tap7c209da8-98', 'timestamp': '2026-01-22T00:12:22.979937', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'tap7c209da8-98', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:6e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7c209da8-98'}, 'message_id': '064a437e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.67179735, 'message_signature': 'cb5b8b30b191a817ab07112ae425976492ae0d5be1363c0d033b6cb900ef8bbc'}]}, 'timestamp': '2026-01-22 00:12:22.980227', '_unique_id': 'a780d446429b4a64ad3a1654a80f4c19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.980 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.981 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:12:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.997 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/cpu volume: 12110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 nova_compute[182713]: 2026-01-22 00:12:22.999 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '780ceef8-0e03-455a-a081-53071ea563fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12110000000, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'timestamp': '2026-01-22T00:12:22.981360', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '064d028a-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.704799014, 'message_signature': '1cf024b71921007f4561a05c26b99ab2a62e56d4c76cf0ecf1cae3e0c170bc7b'}]}, 'timestamp': '2026-01-22 00:12:22.998249', '_unique_id': '6b41024ce21240399449547c27b4abb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:22.999 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.read.bytes volume: 28287488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.000 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.read.bytes volume: 8282112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.000 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5106e990-47c1-408e-a6d2-f3f33a74c382', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28287488, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vda', 'timestamp': '2026-01-22T00:12:22.999934', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '064d4fec-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '263204b128185cadbde6b14675d8a9d1182f905a41c85845249a0eb8ca1f0f4c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8282112, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': 
None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vdb', 'timestamp': '2026-01-22T00:12:22.999934', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '064d59ce-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '8e48fae157a14e6fc3bba17944f0b3cd8393a9822dfdce159ff4d421e87badd6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-sda', 'timestamp': '2026-01-22T00:12:22.999934', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '064d61da-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '4fbe1db291371ec21d4ab27937b49812c349d7a7140602aa3a9c4b6ceb7f8dd4'}]}, 'timestamp': '2026-01-22 00:12:23.000618', '_unique_id': '696243942b5e4997b65dc936dda0bac8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.001 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.002 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.write.requests volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.002 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ccf8d57-e697-4997-87a8-458f3050fbb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vda', 'timestamp': '2026-01-22T00:12:23.001887', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '064d9c18-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': 'e6c2ac2449f1d20a22c1542e68adee3a58a35089e2824b4402f26016c12f2ec8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 28, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 
'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vdb', 'timestamp': '2026-01-22T00:12:23.001887', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '064da4ba-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '705bd265c1f86224684c108d71d882e34d2e7219c45b986cb80edc951900bd3b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-sda', 'timestamp': '2026-01-22T00:12:23.001887', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '064dac58-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '9af1c35ff581f556a4551c39d543da1063501abef8d086c4669ed99f76de5009'}]}, 'timestamp': '2026-01-22 00:12:23.002523', '_unique_id': '3b4ab497c6054b2ca1491c51b7340f57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1857258152>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1857258152>]
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.003 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '417989e7-ccb2-44e8-9f03-c48809139db0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'instance-00000074-cdd4fb44-07d8-4910-b2c1-32386ecffab8-tap7c209da8-98', 'timestamp': '2026-01-22T00:12:23.004039', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'tap7c209da8-98', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:6e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7c209da8-98'}, 'message_id': '064df046-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.67179735, 'message_signature': '9c11d04a6162077ed750075a1b43efd5d74c96012dd5448ecae4bee052de55ac'}]}, 'timestamp': '2026-01-22 00:12:23.004314', '_unique_id': '3505d24519e040d79cec31a6c92042c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.004 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.005 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.005 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/memory.usage volume: 40.40625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d50b361-744d-4d26-b856-3b8c0ab6b16a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.40625, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'timestamp': '2026-01-22T00:12:23.005426', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '064e2624-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.704799014, 'message_signature': 'dc9b0bedb3d5500f6fbcdb919b70e92bd06ce20cef8bc5952b6b4c553b155292'}]}, 'timestamp': '2026-01-22 00:12:23.005656', '_unique_id': '7e45d5293cf04a6e81322ba2e79089ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.006 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.007 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1857258152>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1857258152>]
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.007 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.007 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40eae5db-799e-49f7-8c9e-114d4dbb6abb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'instance-00000074-cdd4fb44-07d8-4910-b2c1-32386ecffab8-tap7c209da8-98', 'timestamp': '2026-01-22T00:12:23.007246', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'tap7c209da8-98', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:6e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7c209da8-98'}, 'message_id': '064e6ddc-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.67179735, 'message_signature': 'ae957ace88000db8b01ebe1ad778d72637f2c9824617fd7880029d002ae5babd'}]}, 'timestamp': '2026-01-22 00:12:23.007497', '_unique_id': '1ea44f4c1b7b4f6bb958445e4b999831'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/network.outgoing.bytes volume: 1312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf9b552c-2647-4a07-9c54-87fce9991681', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1312, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'instance-00000074-cdd4fb44-07d8-4910-b2c1-32386ecffab8-tap7c209da8-98', 'timestamp': '2026-01-22T00:12:23.009042', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'tap7c209da8-98', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:6e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7c209da8-98'}, 'message_id': '064eb40e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.67179735, 'message_signature': '76f7e369d7595ea450d38fbf901761d87d0b75896763b3e35c900f0da8f8549a'}]}, 'timestamp': '2026-01-22 00:12:23.009343', '_unique_id': 'f026d5b847d24214924c76cb50251843'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.009 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.010 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.010 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89caca87-274f-410a-8973-9032caa302ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'instance-00000074-cdd4fb44-07d8-4910-b2c1-32386ecffab8-tap7c209da8-98', 'timestamp': '2026-01-22T00:12:23.010762', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'tap7c209da8-98', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:6e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7c209da8-98'}, 'message_id': '064ef7fc-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.67179735, 'message_signature': 'c2b3271512200517edb5add6c379005d87c041ceef3cde8a613dab54a49b2c10'}]}, 'timestamp': '2026-01-22 00:12:23.011041', '_unique_id': 'b42ddfba58f240bca7c7aa18842b90b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.012 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48d3368f-db72-4658-b212-2477b55c5a9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'instance-00000074-cdd4fb44-07d8-4910-b2c1-32386ecffab8-tap7c209da8-98', 'timestamp': '2026-01-22T00:12:23.012326', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'tap7c209da8-98', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:6e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7c209da8-98'}, 'message_id': '064f3550-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.67179735, 'message_signature': 'ed9cc3cdfbf3f0249551ab64e7690888fd7684a94389d5ec8e88b3c46bd79990'}]}, 'timestamp': '2026-01-22 00:12:23.012640', '_unique_id': '7e38add26afd4868b75daec2303aef94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.014 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.read.latency volume: 501081351 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.014 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.read.latency volume: 105556300 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.014 12 DEBUG ceilometer.compute.pollsters [-] cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.device.read.latency volume: 28243571 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53954612-b3bc-4f68-bcd2-a959b87dfc3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 501081351, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vda', 'timestamp': '2026-01-22T00:12:23.014035', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '064f7696-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '9a1e5d4267a80504a30eb4ede35ccef6fa49b2b76e433611f7ca377ce52598d4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 105556300, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-vdb', 'timestamp': '2026-01-22T00:12:23.014035', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '064f7e5c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': 'd1dd32fe40ba596c2c2400a3192475c36e425fda6d8bb8c5447431540d244495'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28243571, 'user_id': '3c4bd8a02cf045ad9703e01b44239806', 'user_name': None, 'project_id': '54baf6b68ab146d49737e618b1e5b40e', 'project_name': None, 'resource_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8-sda', 'timestamp': '2026-01-22T00:12:23.014035', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1857258152', 'name': 'instance-00000074', 'instance_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'instance_type': 'm1.nano', 'host': '13631781bc8a72ec7c5fd1fe42a0e71cf2a6a8aa0ce911e2240ec187', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '064f85dc-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5315.589297994, 'message_signature': '6e7bf27234c6c78d9c33950103242c0feb5b81608be24d6859bb89170efd849b'}]}, 'timestamp': '2026-01-22 00:12:23.014647', '_unique_id': '25f327765be34670a846c2a43551f053'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:12:23.015 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:12:25 compute-1 nova_compute[182713]: 2026-01-22 00:12:25.452 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:28 compute-1 nova_compute[182713]: 2026-01-22 00:12:28.001 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:28 compute-1 nova_compute[182713]: 2026-01-22 00:12:28.089 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:28 compute-1 NetworkManager[54952]: <info>  [1769040748.0908] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Jan 22 00:12:28 compute-1 NetworkManager[54952]: <info>  [1769040748.0921] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Jan 22 00:12:28 compute-1 nova_compute[182713]: 2026-01-22 00:12:28.175 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:28 compute-1 nova_compute[182713]: 2026-01-22 00:12:28.188 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:29 compute-1 nova_compute[182713]: 2026-01-22 00:12:29.376 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:29 compute-1 nova_compute[182713]: 2026-01-22 00:12:29.427 182717 DEBUG nova.compute.manager [req-3e43dc55-73b5-4f28-8b59-292e43028aec req-d5371fee-2427-4f33-9a8e-2b921648a0c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-changed-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:29 compute-1 nova_compute[182713]: 2026-01-22 00:12:29.428 182717 DEBUG nova.compute.manager [req-3e43dc55-73b5-4f28-8b59-292e43028aec req-d5371fee-2427-4f33-9a8e-2b921648a0c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Refreshing instance network info cache due to event network-changed-7c209da8-98c9-48d0-b48e-ece9f863d933. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:12:29 compute-1 nova_compute[182713]: 2026-01-22 00:12:29.428 182717 DEBUG oslo_concurrency.lockutils [req-3e43dc55-73b5-4f28-8b59-292e43028aec req-d5371fee-2427-4f33-9a8e-2b921648a0c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:12:29 compute-1 nova_compute[182713]: 2026-01-22 00:12:29.428 182717 DEBUG oslo_concurrency.lockutils [req-3e43dc55-73b5-4f28-8b59-292e43028aec req-d5371fee-2427-4f33-9a8e-2b921648a0c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:12:29 compute-1 nova_compute[182713]: 2026-01-22 00:12:29.429 182717 DEBUG nova.network.neutron [req-3e43dc55-73b5-4f28-8b59-292e43028aec req-d5371fee-2427-4f33-9a8e-2b921648a0c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Refreshing network info cache for port 7c209da8-98c9-48d0-b48e-ece9f863d933 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:12:30 compute-1 nova_compute[182713]: 2026-01-22 00:12:30.455 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:30 compute-1 nova_compute[182713]: 2026-01-22 00:12:30.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:32 compute-1 podman[229732]: 2026-01-22 00:12:32.581978003 +0000 UTC m=+0.062902294 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:12:32 compute-1 podman[229731]: 2026-01-22 00:12:32.617501305 +0000 UTC m=+0.112493419 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 00:12:32 compute-1 nova_compute[182713]: 2026-01-22 00:12:32.639 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:32 compute-1 nova_compute[182713]: 2026-01-22 00:12:32.640 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:12:33 compute-1 nova_compute[182713]: 2026-01-22 00:12:33.003 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:34 compute-1 nova_compute[182713]: 2026-01-22 00:12:34.003 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:12:35 compute-1 nova_compute[182713]: 2026-01-22 00:12:35.459 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:37 compute-1 nova_compute[182713]: 2026-01-22 00:12:37.036 182717 DEBUG nova.network.neutron [req-3e43dc55-73b5-4f28-8b59-292e43028aec req-d5371fee-2427-4f33-9a8e-2b921648a0c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updated VIF entry in instance network info cache for port 7c209da8-98c9-48d0-b48e-ece9f863d933. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:12:37 compute-1 nova_compute[182713]: 2026-01-22 00:12:37.037 182717 DEBUG nova.network.neutron [req-3e43dc55-73b5-4f28-8b59-292e43028aec req-d5371fee-2427-4f33-9a8e-2b921648a0c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updating instance_info_cache with network_info: [{"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:37 compute-1 podman[229782]: 2026-01-22 00:12:37.562703776 +0000 UTC m=+0.060215111 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:12:37 compute-1 podman[229781]: 2026-01-22 00:12:37.585998452 +0000 UTC m=+0.083804546 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:12:38 compute-1 nova_compute[182713]: 2026-01-22 00:12:38.005 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:38 compute-1 nova_compute[182713]: 2026-01-22 00:12:38.181 182717 DEBUG oslo_concurrency.lockutils [req-3e43dc55-73b5-4f28-8b59-292e43028aec req-d5371fee-2427-4f33-9a8e-2b921648a0c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:12:40 compute-1 nova_compute[182713]: 2026-01-22 00:12:40.463 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:41 compute-1 nova_compute[182713]: 2026-01-22 00:12:41.219 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:41 compute-1 nova_compute[182713]: 2026-01-22 00:12:41.220 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:42 compute-1 nova_compute[182713]: 2026-01-22 00:12:42.786 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:42 compute-1 nova_compute[182713]: 2026-01-22 00:12:42.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:42 compute-1 nova_compute[182713]: 2026-01-22 00:12:42.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:12:43 compute-1 nova_compute[182713]: 2026-01-22 00:12:43.007 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:43 compute-1 nova_compute[182713]: 2026-01-22 00:12:43.268 182717 DEBUG nova.compute.manager [req-26ef899f-4a52-46a9-ae60-ebc5dbfbef95 req-114e8f57-1bc5-4074-af62-d088cea59abe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-changed-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:43 compute-1 nova_compute[182713]: 2026-01-22 00:12:43.269 182717 DEBUG nova.compute.manager [req-26ef899f-4a52-46a9-ae60-ebc5dbfbef95 req-114e8f57-1bc5-4074-af62-d088cea59abe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Refreshing instance network info cache due to event network-changed-7c209da8-98c9-48d0-b48e-ece9f863d933. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:12:43 compute-1 nova_compute[182713]: 2026-01-22 00:12:43.270 182717 DEBUG oslo_concurrency.lockutils [req-26ef899f-4a52-46a9-ae60-ebc5dbfbef95 req-114e8f57-1bc5-4074-af62-d088cea59abe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:12:43 compute-1 nova_compute[182713]: 2026-01-22 00:12:43.270 182717 DEBUG oslo_concurrency.lockutils [req-26ef899f-4a52-46a9-ae60-ebc5dbfbef95 req-114e8f57-1bc5-4074-af62-d088cea59abe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:12:43 compute-1 nova_compute[182713]: 2026-01-22 00:12:43.271 182717 DEBUG nova.network.neutron [req-26ef899f-4a52-46a9-ae60-ebc5dbfbef95 req-114e8f57-1bc5-4074-af62-d088cea59abe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Refreshing network info cache for port 7c209da8-98c9-48d0-b48e-ece9f863d933 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:12:43 compute-1 nova_compute[182713]: 2026-01-22 00:12:43.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:44 compute-1 nova_compute[182713]: 2026-01-22 00:12:44.648 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:44 compute-1 nova_compute[182713]: 2026-01-22 00:12:44.789 182717 DEBUG nova.network.neutron [req-26ef899f-4a52-46a9-ae60-ebc5dbfbef95 req-114e8f57-1bc5-4074-af62-d088cea59abe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updated VIF entry in instance network info cache for port 7c209da8-98c9-48d0-b48e-ece9f863d933. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:12:44 compute-1 nova_compute[182713]: 2026-01-22 00:12:44.789 182717 DEBUG nova.network.neutron [req-26ef899f-4a52-46a9-ae60-ebc5dbfbef95 req-114e8f57-1bc5-4074-af62-d088cea59abe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updating instance_info_cache with network_info: [{"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:44 compute-1 nova_compute[182713]: 2026-01-22 00:12:44.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:44 compute-1 nova_compute[182713]: 2026-01-22 00:12:44.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:45 compute-1 nova_compute[182713]: 2026-01-22 00:12:45.159 182717 DEBUG oslo_concurrency.lockutils [req-26ef899f-4a52-46a9-ae60-ebc5dbfbef95 req-114e8f57-1bc5-4074-af62-d088cea59abe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:12:45 compute-1 nova_compute[182713]: 2026-01-22 00:12:45.205 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:45 compute-1 nova_compute[182713]: 2026-01-22 00:12:45.205 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:45 compute-1 nova_compute[182713]: 2026-01-22 00:12:45.206 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:45 compute-1 nova_compute[182713]: 2026-01-22 00:12:45.206 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:12:45 compute-1 nova_compute[182713]: 2026-01-22 00:12:45.465 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:47 compute-1 podman[229827]: 2026-01-22 00:12:47.567840725 +0000 UTC m=+0.066411813 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.008 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.457 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.523 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.rescue --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.524 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.581 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk.rescue --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.582 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.636 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.637 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.697 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.869 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.872 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5546MB free_disk=73.26625061035156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.872 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:48 compute-1 nova_compute[182713]: 2026-01-22 00:12:48.872 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:49 compute-1 nova_compute[182713]: 2026-01-22 00:12:49.159 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance cdd4fb44-07d8-4910-b2c1-32386ecffab8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:12:49 compute-1 nova_compute[182713]: 2026-01-22 00:12:49.160 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:12:49 compute-1 nova_compute[182713]: 2026-01-22 00:12:49.160 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:12:49 compute-1 nova_compute[182713]: 2026-01-22 00:12:49.207 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:12:49 compute-1 nova_compute[182713]: 2026-01-22 00:12:49.375 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:12:49 compute-1 nova_compute[182713]: 2026-01-22 00:12:49.423 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:12:49 compute-1 nova_compute[182713]: 2026-01-22 00:12:49.424 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:49 compute-1 nova_compute[182713]: 2026-01-22 00:12:49.425 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:49 compute-1 nova_compute[182713]: 2026-01-22 00:12:49.425 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:12:50 compute-1 nova_compute[182713]: 2026-01-22 00:12:50.441 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:50 compute-1 nova_compute[182713]: 2026-01-22 00:12:50.442 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:12:50 compute-1 nova_compute[182713]: 2026-01-22 00:12:50.442 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:12:50 compute-1 nova_compute[182713]: 2026-01-22 00:12:50.469 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:50.936 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:12:50 compute-1 nova_compute[182713]: 2026-01-22 00:12:50.937 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:50.937 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.014 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.015 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.015 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.015 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:12:51 compute-1 podman[229859]: 2026-01-22 00:12:51.59273817 +0000 UTC m=+0.075503401 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git)
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.623 182717 DEBUG oslo_concurrency.lockutils [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.624 182717 DEBUG oslo_concurrency.lockutils [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.624 182717 DEBUG oslo_concurrency.lockutils [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.625 182717 DEBUG oslo_concurrency.lockutils [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.625 182717 DEBUG oslo_concurrency.lockutils [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.655 182717 INFO nova.compute.manager [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Terminating instance
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.815 182717 DEBUG nova.compute.manager [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:12:51 compute-1 kernel: tap7c209da8-98 (unregistering): left promiscuous mode
Jan 22 00:12:51 compute-1 NetworkManager[54952]: <info>  [1769040771.8426] device (tap7c209da8-98): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:12:51 compute-1 ovn_controller[94841]: 2026-01-22T00:12:51Z|00490|binding|INFO|Releasing lport 7c209da8-98c9-48d0-b48e-ece9f863d933 from this chassis (sb_readonly=0)
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.847 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.849 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:51 compute-1 ovn_controller[94841]: 2026-01-22T00:12:51Z|00491|binding|INFO|Setting lport 7c209da8-98c9-48d0-b48e-ece9f863d933 down in Southbound
Jan 22 00:12:51 compute-1 ovn_controller[94841]: 2026-01-22T00:12:51Z|00492|binding|INFO|Removing iface tap7c209da8-98 ovn-installed in OVS
Jan 22 00:12:51 compute-1 nova_compute[182713]: 2026-01-22 00:12:51.863 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:51 compute-1 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000074.scope: Deactivated successfully.
Jan 22 00:12:51 compute-1 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000074.scope: Consumed 15.166s CPU time.
Jan 22 00:12:51 compute-1 systemd-machined[153970]: Machine qemu-53-instance-00000074 terminated.
Jan 22 00:12:52 compute-1 nova_compute[182713]: 2026-01-22 00:12:52.115 182717 INFO nova.virt.libvirt.driver [-] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Instance destroyed successfully.
Jan 22 00:12:52 compute-1 nova_compute[182713]: 2026-01-22 00:12:52.116 182717 DEBUG nova.objects.instance [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lazy-loading 'resources' on Instance uuid cdd4fb44-07d8-4910-b2c1-32386ecffab8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:12:52 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:52.612 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:6e:e0 10.100.0.4'], port_security=['fa:16:3e:3e:6e:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'cdd4fb44-07d8-4910-b2c1-32386ecffab8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b4ab8c5-3d45-4beb-9842-6290486d7c84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54baf6b68ab146d49737e618b1e5b40e', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'cdb31eb0-cb40-4001-b611-8fbfe7d0fb3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9fae3c6d-aa86-498b-83cb-e3d2b95d2a8f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=7c209da8-98c9-48d0-b48e-ece9f863d933) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:12:52 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:52.613 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 7c209da8-98c9-48d0-b48e-ece9f863d933 in datapath 3b4ab8c5-3d45-4beb-9842-6290486d7c84 unbound from our chassis
Jan 22 00:12:52 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:52.615 104184 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3b4ab8c5-3d45-4beb-9842-6290486d7c84 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:12:52 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:52.616 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e162b3bf-e338-4e09-8dbd-691d6507c1e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:53 compute-1 nova_compute[182713]: 2026-01-22 00:12:53.011 182717 DEBUG nova.virt.libvirt.vif [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:11:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1857258152',display_name='tempest-ServerRescueTestJSONUnderV235-server-1857258152',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1857258152',id=116,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:12:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='54baf6b68ab146d49737e618b1e5b40e',ramdisk_id='',reservation_id='r-1ixyv6bz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-284679437',owner_user_name='tempest-ServerRescueTestJSONUnderV235-284679437-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:12:03Z,user_data=None,user_id='3c4bd8a02cf045ad9703e01b44239806',uuid=cdd4fb44-07d8-4910-b2c1-32386ecffab8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:12:53 compute-1 nova_compute[182713]: 2026-01-22 00:12:53.011 182717 DEBUG nova.network.os_vif_util [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Converting VIF {"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:12:53 compute-1 nova_compute[182713]: 2026-01-22 00:12:53.013 182717 DEBUG nova.network.os_vif_util [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:6e:e0,bridge_name='br-int',has_traffic_filtering=True,id=7c209da8-98c9-48d0-b48e-ece9f863d933,network=Network(3b4ab8c5-3d45-4beb-9842-6290486d7c84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c209da8-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:12:53 compute-1 nova_compute[182713]: 2026-01-22 00:12:53.014 182717 DEBUG os_vif [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:6e:e0,bridge_name='br-int',has_traffic_filtering=True,id=7c209da8-98c9-48d0-b48e-ece9f863d933,network=Network(3b4ab8c5-3d45-4beb-9842-6290486d7c84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c209da8-98') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:12:53 compute-1 nova_compute[182713]: 2026-01-22 00:12:53.017 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:53 compute-1 nova_compute[182713]: 2026-01-22 00:12:53.018 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c209da8-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:12:53 compute-1 nova_compute[182713]: 2026-01-22 00:12:53.026 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:53 compute-1 nova_compute[182713]: 2026-01-22 00:12:53.029 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:53 compute-1 nova_compute[182713]: 2026-01-22 00:12:53.033 182717 INFO os_vif [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:6e:e0,bridge_name='br-int',has_traffic_filtering=True,id=7c209da8-98c9-48d0-b48e-ece9f863d933,network=Network(3b4ab8c5-3d45-4beb-9842-6290486d7c84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c209da8-98')
Jan 22 00:12:53 compute-1 nova_compute[182713]: 2026-01-22 00:12:53.034 182717 INFO nova.virt.libvirt.driver [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Deleting instance files /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8_del
Jan 22 00:12:53 compute-1 nova_compute[182713]: 2026-01-22 00:12:53.035 182717 INFO nova.virt.libvirt.driver [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Deletion of /var/lib/nova/instances/cdd4fb44-07d8-4910-b2c1-32386ecffab8_del complete
Jan 22 00:12:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:12:54.940 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:12:55 compute-1 nova_compute[182713]: 2026-01-22 00:12:55.307 182717 INFO nova.compute.manager [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Took 3.49 seconds to destroy the instance on the hypervisor.
Jan 22 00:12:55 compute-1 nova_compute[182713]: 2026-01-22 00:12:55.307 182717 DEBUG oslo.service.loopingcall [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:12:55 compute-1 nova_compute[182713]: 2026-01-22 00:12:55.308 182717 DEBUG nova.compute.manager [-] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:12:55 compute-1 nova_compute[182713]: 2026-01-22 00:12:55.308 182717 DEBUG nova.network.neutron [-] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:12:56 compute-1 nova_compute[182713]: 2026-01-22 00:12:56.621 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updating instance_info_cache with network_info: [{"id": "7c209da8-98c9-48d0-b48e-ece9f863d933", "address": "fa:16:3e:3e:6e:e0", "network": {"id": "3b4ab8c5-3d45-4beb-9842-6290486d7c84", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-682511806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "54baf6b68ab146d49737e618b1e5b40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c209da8-98", "ovs_interfaceid": "7c209da8-98c9-48d0-b48e-ece9f863d933", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:56 compute-1 nova_compute[182713]: 2026-01-22 00:12:56.765 182717 DEBUG nova.compute.manager [req-bf8fb7f2-b6fb-4cb8-8d9e-1ae2dcf96a4a req-177457b7-acb2-4d03-a4c5-0596cb10af4b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-vif-unplugged-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:56 compute-1 nova_compute[182713]: 2026-01-22 00:12:56.766 182717 DEBUG oslo_concurrency.lockutils [req-bf8fb7f2-b6fb-4cb8-8d9e-1ae2dcf96a4a req-177457b7-acb2-4d03-a4c5-0596cb10af4b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:56 compute-1 nova_compute[182713]: 2026-01-22 00:12:56.766 182717 DEBUG oslo_concurrency.lockutils [req-bf8fb7f2-b6fb-4cb8-8d9e-1ae2dcf96a4a req-177457b7-acb2-4d03-a4c5-0596cb10af4b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:56 compute-1 nova_compute[182713]: 2026-01-22 00:12:56.766 182717 DEBUG oslo_concurrency.lockutils [req-bf8fb7f2-b6fb-4cb8-8d9e-1ae2dcf96a4a req-177457b7-acb2-4d03-a4c5-0596cb10af4b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:56 compute-1 nova_compute[182713]: 2026-01-22 00:12:56.766 182717 DEBUG nova.compute.manager [req-bf8fb7f2-b6fb-4cb8-8d9e-1ae2dcf96a4a req-177457b7-acb2-4d03-a4c5-0596cb10af4b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] No waiting events found dispatching network-vif-unplugged-7c209da8-98c9-48d0-b48e-ece9f863d933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:12:56 compute-1 nova_compute[182713]: 2026-01-22 00:12:56.767 182717 DEBUG nova.compute.manager [req-bf8fb7f2-b6fb-4cb8-8d9e-1ae2dcf96a4a req-177457b7-acb2-4d03-a4c5-0596cb10af4b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-vif-unplugged-7c209da8-98c9-48d0-b48e-ece9f863d933 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:12:56 compute-1 nova_compute[182713]: 2026-01-22 00:12:56.775 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-cdd4fb44-07d8-4910-b2c1-32386ecffab8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:12:56 compute-1 nova_compute[182713]: 2026-01-22 00:12:56.776 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:12:56 compute-1 nova_compute[182713]: 2026-01-22 00:12:56.776 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:56 compute-1 nova_compute[182713]: 2026-01-22 00:12:56.776 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:57 compute-1 nova_compute[182713]: 2026-01-22 00:12:57.186 182717 DEBUG nova.network.neutron [-] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:57 compute-1 nova_compute[182713]: 2026-01-22 00:12:57.327 182717 DEBUG nova.compute.manager [req-1a4f34cc-4c7a-4e92-8d95-9ef747267ea5 req-b62975af-3bbf-4b4a-9582-cdfe2c6efb55 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-vif-deleted-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:57 compute-1 nova_compute[182713]: 2026-01-22 00:12:57.328 182717 INFO nova.compute.manager [req-1a4f34cc-4c7a-4e92-8d95-9ef747267ea5 req-b62975af-3bbf-4b4a-9582-cdfe2c6efb55 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Neutron deleted interface 7c209da8-98c9-48d0-b48e-ece9f863d933; detaching it from the instance and deleting it from the info cache
Jan 22 00:12:57 compute-1 nova_compute[182713]: 2026-01-22 00:12:57.328 182717 DEBUG nova.network.neutron [req-1a4f34cc-4c7a-4e92-8d95-9ef747267ea5 req-b62975af-3bbf-4b4a-9582-cdfe2c6efb55 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:57 compute-1 nova_compute[182713]: 2026-01-22 00:12:57.372 182717 INFO nova.compute.manager [-] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Took 2.06 seconds to deallocate network for instance.
Jan 22 00:12:57 compute-1 nova_compute[182713]: 2026-01-22 00:12:57.377 182717 DEBUG nova.compute.manager [req-1a4f34cc-4c7a-4e92-8d95-9ef747267ea5 req-b62975af-3bbf-4b4a-9582-cdfe2c6efb55 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Detach interface failed, port_id=7c209da8-98c9-48d0-b48e-ece9f863d933, reason: Instance cdd4fb44-07d8-4910-b2c1-32386ecffab8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.027 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.030 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.192 182717 DEBUG oslo_concurrency.lockutils [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.193 182717 DEBUG oslo_concurrency.lockutils [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.266 182717 DEBUG nova.compute.provider_tree [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.356 182717 DEBUG nova.scheduler.client.report [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.478 182717 DEBUG oslo_concurrency.lockutils [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.511 182717 INFO nova.scheduler.client.report [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Deleted allocations for instance cdd4fb44-07d8-4910-b2c1-32386ecffab8
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.677 182717 DEBUG oslo_concurrency.lockutils [None req-eefc8a44-3b21-41da-837b-34801380f815 3c4bd8a02cf045ad9703e01b44239806 54baf6b68ab146d49737e618b1e5b40e - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.900 182717 DEBUG nova.compute.manager [req-37e3f9f0-b4d9-40a5-af69-15cfce92c998 req-9abb1779-4984-4cc3-adf2-7eb95c8b385f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.900 182717 DEBUG oslo_concurrency.lockutils [req-37e3f9f0-b4d9-40a5-af69-15cfce92c998 req-9abb1779-4984-4cc3-adf2-7eb95c8b385f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.901 182717 DEBUG oslo_concurrency.lockutils [req-37e3f9f0-b4d9-40a5-af69-15cfce92c998 req-9abb1779-4984-4cc3-adf2-7eb95c8b385f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.901 182717 DEBUG oslo_concurrency.lockutils [req-37e3f9f0-b4d9-40a5-af69-15cfce92c998 req-9abb1779-4984-4cc3-adf2-7eb95c8b385f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cdd4fb44-07d8-4910-b2c1-32386ecffab8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.901 182717 DEBUG nova.compute.manager [req-37e3f9f0-b4d9-40a5-af69-15cfce92c998 req-9abb1779-4984-4cc3-adf2-7eb95c8b385f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] No waiting events found dispatching network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:12:58 compute-1 nova_compute[182713]: 2026-01-22 00:12:58.901 182717 WARNING nova.compute.manager [req-37e3f9f0-b4d9-40a5-af69-15cfce92c998 req-9abb1779-4984-4cc3-adf2-7eb95c8b385f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Received unexpected event network-vif-plugged-7c209da8-98c9-48d0-b48e-ece9f863d933 for instance with vm_state deleted and task_state None.
Jan 22 00:12:59 compute-1 nova_compute[182713]: 2026-01-22 00:12:59.140 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:59 compute-1 nova_compute[182713]: 2026-01-22 00:12:59.327 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:13:03.020 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:13:03.021 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:13:03.021 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:03 compute-1 nova_compute[182713]: 2026-01-22 00:13:03.030 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:03 compute-1 podman[229913]: 2026-01-22 00:13:03.600637684 +0000 UTC m=+0.082307961 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:13:03 compute-1 podman[229912]: 2026-01-22 00:13:03.654250302 +0000 UTC m=+0.147291048 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 22 00:13:07 compute-1 nova_compute[182713]: 2026-01-22 00:13:07.113 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040772.111574, cdd4fb44-07d8-4910-b2c1-32386ecffab8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:13:07 compute-1 nova_compute[182713]: 2026-01-22 00:13:07.113 182717 INFO nova.compute.manager [-] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] VM Stopped (Lifecycle Event)
Jan 22 00:13:07 compute-1 nova_compute[182713]: 2026-01-22 00:13:07.524 182717 DEBUG nova.compute.manager [None req-633db62b-cfc6-405e-8963-e750a2ffc5b7 - - - - - -] [instance: cdd4fb44-07d8-4910-b2c1-32386ecffab8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:13:08 compute-1 nova_compute[182713]: 2026-01-22 00:13:08.032 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:13:08 compute-1 nova_compute[182713]: 2026-01-22 00:13:08.034 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:13:08 compute-1 nova_compute[182713]: 2026-01-22 00:13:08.034 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:13:08 compute-1 nova_compute[182713]: 2026-01-22 00:13:08.035 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:13:08 compute-1 nova_compute[182713]: 2026-01-22 00:13:08.071 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:08 compute-1 nova_compute[182713]: 2026-01-22 00:13:08.072 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:13:08 compute-1 podman[229962]: 2026-01-22 00:13:08.595135391 +0000 UTC m=+0.074359316 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:13:08 compute-1 podman[229961]: 2026-01-22 00:13:08.625810834 +0000 UTC m=+0.111729165 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:13:13 compute-1 nova_compute[182713]: 2026-01-22 00:13:13.072 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:13 compute-1 nova_compute[182713]: 2026-01-22 00:13:13.074 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:18 compute-1 nova_compute[182713]: 2026-01-22 00:13:18.074 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:13:18 compute-1 nova_compute[182713]: 2026-01-22 00:13:18.076 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:13:18 compute-1 nova_compute[182713]: 2026-01-22 00:13:18.076 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:13:18 compute-1 nova_compute[182713]: 2026-01-22 00:13:18.077 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:13:18 compute-1 nova_compute[182713]: 2026-01-22 00:13:18.077 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:13:18 compute-1 nova_compute[182713]: 2026-01-22 00:13:18.078 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:18 compute-1 podman[230004]: 2026-01-22 00:13:18.572199938 +0000 UTC m=+0.063591076 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:13:22 compute-1 podman[230024]: 2026-01-22 00:13:22.562659785 +0000 UTC m=+0.062292156 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal)
Jan 22 00:13:23 compute-1 nova_compute[182713]: 2026-01-22 00:13:23.078 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:28 compute-1 nova_compute[182713]: 2026-01-22 00:13:28.080 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:13:31.615 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:13:31 compute-1 nova_compute[182713]: 2026-01-22 00:13:31.615 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:13:31.616 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:13:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:13:31.617 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:13:33 compute-1 nova_compute[182713]: 2026-01-22 00:13:33.083 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:34 compute-1 nova_compute[182713]: 2026-01-22 00:13:34.376 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "ad203d96-8f7f-4024-84b2-4bb2655b6395" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:34 compute-1 nova_compute[182713]: 2026-01-22 00:13:34.377 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "ad203d96-8f7f-4024-84b2-4bb2655b6395" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:34 compute-1 nova_compute[182713]: 2026-01-22 00:13:34.400 182717 DEBUG nova.compute.manager [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:13:34 compute-1 nova_compute[182713]: 2026-01-22 00:13:34.542 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:34 compute-1 nova_compute[182713]: 2026-01-22 00:13:34.543 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:34 compute-1 podman[230046]: 2026-01-22 00:13:34.563603466 +0000 UTC m=+0.053653871 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:13:34 compute-1 nova_compute[182713]: 2026-01-22 00:13:34.564 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:13:34 compute-1 nova_compute[182713]: 2026-01-22 00:13:34.565 182717 INFO nova.compute.claims [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:13:34 compute-1 podman[230045]: 2026-01-22 00:13:34.596789326 +0000 UTC m=+0.088735118 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:13:34 compute-1 nova_compute[182713]: 2026-01-22 00:13:34.977 182717 DEBUG nova.compute.provider_tree [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:13:35 compute-1 nova_compute[182713]: 2026-01-22 00:13:35.245 182717 DEBUG nova.scheduler.client.report [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:13:35 compute-1 nova_compute[182713]: 2026-01-22 00:13:35.358 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "f1150657-3f68-4e2e-91ee-82eb9d65f0c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:35 compute-1 nova_compute[182713]: 2026-01-22 00:13:35.359 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "f1150657-3f68-4e2e-91ee-82eb9d65f0c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.004 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.004 182717 DEBUG nova.compute.manager [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.158 182717 DEBUG nova.compute.manager [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.595 182717 DEBUG nova.compute.manager [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.630 182717 INFO nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.636 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.637 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.646 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.647 182717 INFO nova.compute.claims [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.653 182717 DEBUG nova.compute.manager [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.824 182717 DEBUG nova.compute.manager [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.825 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.826 182717 INFO nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Creating image(s)
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.827 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "/var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.828 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "/var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.829 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "/var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.855 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.912 182717 DEBUG nova.compute.provider_tree [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.949 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.950 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.951 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.967 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:36 compute-1 nova_compute[182713]: 2026-01-22 00:13:36.991 182717 DEBUG nova.scheduler.client.report [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.030 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.394s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.031 182717 DEBUG nova.compute.manager [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.040 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.041 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.102 182717 DEBUG nova.compute.manager [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.121 182717 INFO nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.151 182717 DEBUG nova.compute.manager [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.270 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk 1073741824" returned: 0 in 0.229s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.272 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.272 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.308 182717 DEBUG nova.compute.manager [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.310 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.311 182717 INFO nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Creating image(s)
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.312 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.312 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.313 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.332 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.357 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.358 182717 DEBUG nova.virt.disk.api [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Checking if we can resize image /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.359 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.395 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.397 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.397 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.414 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.438 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.440 182717 DEBUG nova.virt.disk.api [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Cannot resize image /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.440 182717 DEBUG nova.objects.instance [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'migration_context' on Instance uuid ad203d96-8f7f-4024-84b2-4bb2655b6395 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.463 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.463 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Ensure instance console log exists: /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.464 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.464 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.465 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.466 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.472 182717 WARNING nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.474 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.475 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.510 182717 DEBUG nova.virt.libvirt.host [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.512 182717 DEBUG nova.virt.libvirt.host [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.516 182717 DEBUG nova.virt.libvirt.host [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.516 182717 DEBUG nova.virt.libvirt.host [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.518 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.519 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.519 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.520 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.520 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.520 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.521 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.521 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.521 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.522 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.522 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.522 182717 DEBUG nova.virt.hardware [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.527 182717 DEBUG nova.objects.instance [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid ad203d96-8f7f-4024-84b2-4bb2655b6395 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.540 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk 1073741824" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.541 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.542 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.568 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <uuid>ad203d96-8f7f-4024-84b2-4bb2655b6395</uuid>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <name>instance-00000078</name>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerShowV247Test-server-1680784711</nova:name>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:13:37</nova:creationTime>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:user uuid="2e0aa78e0b4f4c11823d6a419634cab8">tempest-ServerShowV247Test-904151132-project-member</nova:user>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:project uuid="9e271bcd92754d2c96796024a51a55a3">tempest-ServerShowV247Test-904151132</nova:project>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <system>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="serial">ad203d96-8f7f-4024-84b2-4bb2655b6395</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="uuid">ad203d96-8f7f-4024-84b2-4bb2655b6395</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </system>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <os>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </os>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <features>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </features>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.config"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/console.log" append="off"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <video>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </video>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:13:37 compute-1 nova_compute[182713]: </domain>
Jan 22 00:13:37 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.614 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.615 182717 DEBUG nova.virt.disk.api [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Checking if we can resize image /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.616 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.679 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.681 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.682 182717 INFO nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Using config drive
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.739 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk --force-share --output=json" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.739 182717 DEBUG nova.virt.disk.api [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Cannot resize image /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.740 182717 DEBUG nova.objects.instance [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'migration_context' on Instance uuid f1150657-3f68-4e2e-91ee-82eb9d65f0c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.783 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.784 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Ensure instance console log exists: /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.785 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.786 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.786 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.789 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.796 182717 WARNING nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.805 182717 DEBUG nova.virt.libvirt.host [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.806 182717 DEBUG nova.virt.libvirt.host [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.810 182717 DEBUG nova.virt.libvirt.host [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.811 182717 DEBUG nova.virt.libvirt.host [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.813 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.813 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.814 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.815 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.815 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.816 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.816 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.816 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.817 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.818 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.818 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.819 182717 DEBUG nova.virt.hardware [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.825 182717 DEBUG nova.objects.instance [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid f1150657-3f68-4e2e-91ee-82eb9d65f0c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.852 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <uuid>f1150657-3f68-4e2e-91ee-82eb9d65f0c1</uuid>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <name>instance-00000079</name>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerShowV247Test-server-1355932132</nova:name>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:13:37</nova:creationTime>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:user uuid="2e0aa78e0b4f4c11823d6a419634cab8">tempest-ServerShowV247Test-904151132-project-member</nova:user>
Jan 22 00:13:37 compute-1 nova_compute[182713]:         <nova:project uuid="9e271bcd92754d2c96796024a51a55a3">tempest-ServerShowV247Test-904151132</nova:project>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <system>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="serial">f1150657-3f68-4e2e-91ee-82eb9d65f0c1</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="uuid">f1150657-3f68-4e2e-91ee-82eb9d65f0c1</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </system>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <os>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </os>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <features>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </features>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.config"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/console.log" append="off"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <video>
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </video>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:13:37 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:13:37 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:13:37 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:13:37 compute-1 nova_compute[182713]: </domain>
Jan 22 00:13:37 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.917 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.917 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.918 182717 INFO nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Using config drive
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.924 182717 INFO nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Creating config drive at /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.config
Jan 22 00:13:37 compute-1 nova_compute[182713]: 2026-01-22 00:13:37.934 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn3whpuk9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.072 182717 DEBUG oslo_concurrency.processutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn3whpuk9" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.085 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.087 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.088 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.088 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.089 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.090 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:38 compute-1 systemd-machined[153970]: New machine qemu-54-instance-00000078.
Jan 22 00:13:38 compute-1 systemd[1]: Started Virtual Machine qemu-54-instance-00000078.
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.505 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040818.504774, ad203d96-8f7f-4024-84b2-4bb2655b6395 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.507 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] VM Resumed (Lifecycle Event)
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.509 182717 DEBUG nova.compute.manager [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.510 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.514 182717 INFO nova.virt.libvirt.driver [-] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Instance spawned successfully.
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.515 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.536 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.543 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.547 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.548 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.549 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.549 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.550 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.550 182717 DEBUG nova.virt.libvirt.driver [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.586 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.586 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040818.5070083, ad203d96-8f7f-4024-84b2-4bb2655b6395 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.587 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] VM Started (Lifecycle Event)
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.595 182717 INFO nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Creating config drive at /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.config
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.599 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvh602an3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.631 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.634 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.654 182717 INFO nova.compute.manager [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Took 1.83 seconds to spawn the instance on the hypervisor.
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.654 182717 DEBUG nova.compute.manager [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.661 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:13:38 compute-1 nova_compute[182713]: 2026-01-22 00:13:38.731 182717 DEBUG oslo_concurrency.processutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvh602an3" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:38 compute-1 systemd-machined[153970]: New machine qemu-55-instance-00000079.
Jan 22 00:13:38 compute-1 podman[230156]: 2026-01-22 00:13:38.819251943 +0000 UTC m=+0.050026998 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:13:38 compute-1 systemd[1]: Started Virtual Machine qemu-55-instance-00000079.
Jan 22 00:13:38 compute-1 podman[230161]: 2026-01-22 00:13:38.848053818 +0000 UTC m=+0.069748025 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.157 182717 DEBUG nova.compute.manager [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.159 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.159 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040819.1569595, f1150657-3f68-4e2e-91ee-82eb9d65f0c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.160 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] VM Resumed (Lifecycle Event)
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.164 182717 INFO nova.virt.libvirt.driver [-] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Instance spawned successfully.
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.165 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.443 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.448 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.448 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.449 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.449 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.450 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.450 182717 DEBUG nova.virt.libvirt.driver [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.454 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:13:39 compute-1 nova_compute[182713]: 2026-01-22 00:13:39.647 182717 INFO nova.compute.manager [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Took 5.17 seconds to build instance.
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.002 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.003 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040819.158411, f1150657-3f68-4e2e-91ee-82eb9d65f0c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.003 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] VM Started (Lifecycle Event)
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.386 182717 DEBUG oslo_concurrency.lockutils [None req-9693e089-a04d-484f-9983-6f3465273a9d 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "ad203d96-8f7f-4024-84b2-4bb2655b6395" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.391 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.393 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.432 182717 INFO nova.compute.manager [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Took 3.12 seconds to spawn the instance on the hypervisor.
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.433 182717 DEBUG nova.compute.manager [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.440 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.561 182717 INFO nova.compute.manager [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Took 3.96 seconds to build instance.
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.599 182717 DEBUG oslo_concurrency.lockutils [None req-5a24a287-7ffe-475c-b0db-9a2a49a050b3 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "f1150657-3f68-4e2e-91ee-82eb9d65f0c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.887 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:40 compute-1 nova_compute[182713]: 2026-01-22 00:13:40.888 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:42 compute-1 nova_compute[182713]: 2026-01-22 00:13:42.849 182717 INFO nova.compute.manager [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Rebuilding instance
Jan 22 00:13:43 compute-1 nova_compute[182713]: 2026-01-22 00:13:43.090 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:43 compute-1 nova_compute[182713]: 2026-01-22 00:13:43.235 182717 DEBUG nova.compute.manager [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:13:43 compute-1 nova_compute[182713]: 2026-01-22 00:13:43.355 182717 DEBUG nova.objects.instance [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'pci_requests' on Instance uuid f1150657-3f68-4e2e-91ee-82eb9d65f0c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:13:43 compute-1 nova_compute[182713]: 2026-01-22 00:13:43.377 182717 DEBUG nova.objects.instance [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid f1150657-3f68-4e2e-91ee-82eb9d65f0c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:13:43 compute-1 nova_compute[182713]: 2026-01-22 00:13:43.405 182717 DEBUG nova.objects.instance [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'resources' on Instance uuid f1150657-3f68-4e2e-91ee-82eb9d65f0c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:13:43 compute-1 nova_compute[182713]: 2026-01-22 00:13:43.428 182717 DEBUG nova.objects.instance [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'migration_context' on Instance uuid f1150657-3f68-4e2e-91ee-82eb9d65f0c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:13:43 compute-1 nova_compute[182713]: 2026-01-22 00:13:43.455 182717 DEBUG nova.objects.instance [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:13:43 compute-1 nova_compute[182713]: 2026-01-22 00:13:43.458 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:13:43 compute-1 nova_compute[182713]: 2026-01-22 00:13:43.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:43 compute-1 nova_compute[182713]: 2026-01-22 00:13:43.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:13:44 compute-1 nova_compute[182713]: 2026-01-22 00:13:44.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:44 compute-1 nova_compute[182713]: 2026-01-22 00:13:44.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:44 compute-1 nova_compute[182713]: 2026-01-22 00:13:44.884 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:44 compute-1 nova_compute[182713]: 2026-01-22 00:13:44.884 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:44 compute-1 nova_compute[182713]: 2026-01-22 00:13:44.885 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:13:44 compute-1 nova_compute[182713]: 2026-01-22 00:13:44.976 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.041 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.042 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.104 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.113 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.171 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.173 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.240 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.367 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.370 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5458MB free_disk=73.29425811767578GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.370 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.371 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.451 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance ad203d96-8f7f-4024-84b2-4bb2655b6395 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.452 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance f1150657-3f68-4e2e-91ee-82eb9d65f0c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.452 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.452 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.522 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.546 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.591 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:13:45 compute-1 nova_compute[182713]: 2026-01-22 00:13:45.592 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:46 compute-1 nova_compute[182713]: 2026-01-22 00:13:46.587 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:46 compute-1 nova_compute[182713]: 2026-01-22 00:13:46.589 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:47 compute-1 nova_compute[182713]: 2026-01-22 00:13:47.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:48 compute-1 nova_compute[182713]: 2026-01-22 00:13:48.091 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:49 compute-1 podman[230247]: 2026-01-22 00:13:49.619472629 +0000 UTC m=+0.091110882 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 00:13:49 compute-1 nova_compute[182713]: 2026-01-22 00:13:49.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:49 compute-1 nova_compute[182713]: 2026-01-22 00:13:49.859 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:13:49 compute-1 nova_compute[182713]: 2026-01-22 00:13:49.859 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:13:50 compute-1 nova_compute[182713]: 2026-01-22 00:13:50.061 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-ad203d96-8f7f-4024-84b2-4bb2655b6395" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:13:50 compute-1 nova_compute[182713]: 2026-01-22 00:13:50.061 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-ad203d96-8f7f-4024-84b2-4bb2655b6395" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:13:50 compute-1 nova_compute[182713]: 2026-01-22 00:13:50.062 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:13:50 compute-1 nova_compute[182713]: 2026-01-22 00:13:50.062 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid ad203d96-8f7f-4024-84b2-4bb2655b6395 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:13:50 compute-1 nova_compute[182713]: 2026-01-22 00:13:50.254 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:13:50 compute-1 nova_compute[182713]: 2026-01-22 00:13:50.611 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:13:50 compute-1 nova_compute[182713]: 2026-01-22 00:13:50.625 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-ad203d96-8f7f-4024-84b2-4bb2655b6395" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:13:50 compute-1 nova_compute[182713]: 2026-01-22 00:13:50.625 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:13:50 compute-1 nova_compute[182713]: 2026-01-22 00:13:50.626 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:53 compute-1 nova_compute[182713]: 2026-01-22 00:13:53.093 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:53 compute-1 podman[230288]: 2026-01-22 00:13:53.208702064 +0000 UTC m=+0.084348244 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 00:13:53 compute-1 nova_compute[182713]: 2026-01-22 00:13:53.510 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:13:55 compute-1 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000079.scope: Deactivated successfully.
Jan 22 00:13:55 compute-1 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000079.scope: Consumed 12.345s CPU time.
Jan 22 00:13:55 compute-1 systemd-machined[153970]: Machine qemu-55-instance-00000079 terminated.
Jan 22 00:13:56 compute-1 nova_compute[182713]: 2026-01-22 00:13:56.526 182717 INFO nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Instance shutdown successfully after 13 seconds.
Jan 22 00:13:56 compute-1 nova_compute[182713]: 2026-01-22 00:13:56.533 182717 INFO nova.virt.libvirt.driver [-] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Instance destroyed successfully.
Jan 22 00:13:56 compute-1 nova_compute[182713]: 2026-01-22 00:13:56.538 182717 INFO nova.virt.libvirt.driver [-] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Instance destroyed successfully.
Jan 22 00:13:56 compute-1 nova_compute[182713]: 2026-01-22 00:13:56.538 182717 INFO nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Deleting instance files /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1_del
Jan 22 00:13:56 compute-1 nova_compute[182713]: 2026-01-22 00:13:56.539 182717 INFO nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Deletion of /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1_del complete
Jan 22 00:13:57 compute-1 nova_compute[182713]: 2026-01-22 00:13:57.607 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:13:57 compute-1 nova_compute[182713]: 2026-01-22 00:13:57.608 182717 INFO nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Creating image(s)
Jan 22 00:13:57 compute-1 nova_compute[182713]: 2026-01-22 00:13:57.609 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:57 compute-1 nova_compute[182713]: 2026-01-22 00:13:57.609 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:57 compute-1 nova_compute[182713]: 2026-01-22 00:13:57.611 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:57 compute-1 nova_compute[182713]: 2026-01-22 00:13:57.611 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:57 compute-1 nova_compute[182713]: 2026-01-22 00:13:57.612 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:58 compute-1 nova_compute[182713]: 2026-01-22 00:13:58.095 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.126 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.185 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.186 182717 DEBUG nova.virt.images [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] 3e1dda74-3c6a-4d29-8792-32134d1c36c5 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.188 182717 DEBUG nova.privsep.utils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.188 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.435 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted" returned: 0 in 0.247s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.443 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.519 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.520 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.908s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.534 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.587 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.588 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.588 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.600 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.689 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.690 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.728 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.729 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.730 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.793 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.795 182717 DEBUG nova.virt.disk.api [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Checking if we can resize image /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.795 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.855 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.857 182717 DEBUG nova.virt.disk.api [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Cannot resize image /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.858 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.858 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Ensure instance console log exists: /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.859 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.860 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.861 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.864 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.870 182717 WARNING nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.882 182717 DEBUG nova.virt.libvirt.host [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.882 182717 DEBUG nova.virt.libvirt.host [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.886 182717 DEBUG nova.virt.libvirt.host [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.886 182717 DEBUG nova.virt.libvirt.host [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.888 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.888 182717 DEBUG nova.virt.hardware [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.888 182717 DEBUG nova.virt.hardware [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.889 182717 DEBUG nova.virt.hardware [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.889 182717 DEBUG nova.virt.hardware [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.889 182717 DEBUG nova.virt.hardware [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.889 182717 DEBUG nova.virt.hardware [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.889 182717 DEBUG nova.virt.hardware [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.890 182717 DEBUG nova.virt.hardware [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.890 182717 DEBUG nova.virt.hardware [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.890 182717 DEBUG nova.virt.hardware [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.890 182717 DEBUG nova.virt.hardware [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:14:00 compute-1 nova_compute[182713]: 2026-01-22 00:14:00.891 182717 DEBUG nova.objects.instance [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f1150657-3f68-4e2e-91ee-82eb9d65f0c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:14:01 compute-1 nova_compute[182713]: 2026-01-22 00:14:01.002 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:14:01 compute-1 nova_compute[182713]:   <uuid>f1150657-3f68-4e2e-91ee-82eb9d65f0c1</uuid>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   <name>instance-00000079</name>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerShowV247Test-server-1355932132</nova:name>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:14:00</nova:creationTime>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:14:01 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:14:01 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:14:01 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:14:01 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:14:01 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:14:01 compute-1 nova_compute[182713]:         <nova:user uuid="2e0aa78e0b4f4c11823d6a419634cab8">tempest-ServerShowV247Test-904151132-project-member</nova:user>
Jan 22 00:14:01 compute-1 nova_compute[182713]:         <nova:project uuid="9e271bcd92754d2c96796024a51a55a3">tempest-ServerShowV247Test-904151132</nova:project>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="3e1dda74-3c6a-4d29-8792-32134d1c36c5"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <nova:ports/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <system>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <entry name="serial">f1150657-3f68-4e2e-91ee-82eb9d65f0c1</entry>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <entry name="uuid">f1150657-3f68-4e2e-91ee-82eb9d65f0c1</entry>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     </system>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   <os>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   </os>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   <features>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   </features>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.config"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/console.log" append="off"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <video>
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     </video>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:14:01 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:14:01 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:14:01 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:14:01 compute-1 nova_compute[182713]: </domain>
Jan 22 00:14:01 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:14:01 compute-1 nova_compute[182713]: 2026-01-22 00:14:01.063 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:14:01 compute-1 nova_compute[182713]: 2026-01-22 00:14:01.064 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:14:01 compute-1 nova_compute[182713]: 2026-01-22 00:14:01.065 182717 INFO nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Using config drive
Jan 22 00:14:01 compute-1 nova_compute[182713]: 2026-01-22 00:14:01.122 182717 DEBUG nova.objects.instance [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid f1150657-3f68-4e2e-91ee-82eb9d65f0c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:14:01 compute-1 nova_compute[182713]: 2026-01-22 00:14:01.159 182717 DEBUG nova.objects.instance [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'keypairs' on Instance uuid f1150657-3f68-4e2e-91ee-82eb9d65f0c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:14:01 compute-1 nova_compute[182713]: 2026-01-22 00:14:01.627 182717 INFO nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Creating config drive at /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.config
Jan 22 00:14:01 compute-1 nova_compute[182713]: 2026-01-22 00:14:01.639 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz8dbm_1c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:01 compute-1 nova_compute[182713]: 2026-01-22 00:14:01.770 182717 DEBUG oslo_concurrency.processutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz8dbm_1c" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:01 compute-1 systemd-machined[153970]: New machine qemu-56-instance-00000079.
Jan 22 00:14:01 compute-1 systemd[1]: Started Virtual Machine qemu-56-instance-00000079.
Jan 22 00:14:02 compute-1 nova_compute[182713]: 2026-01-22 00:14:02.287 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Removed pending event for f1150657-3f68-4e2e-91ee-82eb9d65f0c1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:14:02 compute-1 nova_compute[182713]: 2026-01-22 00:14:02.289 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040842.287028, f1150657-3f68-4e2e-91ee-82eb9d65f0c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:14:02 compute-1 nova_compute[182713]: 2026-01-22 00:14:02.289 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] VM Resumed (Lifecycle Event)
Jan 22 00:14:02 compute-1 nova_compute[182713]: 2026-01-22 00:14:02.292 182717 DEBUG nova.compute.manager [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:14:02 compute-1 nova_compute[182713]: 2026-01-22 00:14:02.292 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:14:02 compute-1 nova_compute[182713]: 2026-01-22 00:14:02.297 182717 INFO nova.virt.libvirt.driver [-] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Instance spawned successfully.
Jan 22 00:14:02 compute-1 nova_compute[182713]: 2026-01-22 00:14:02.298 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:14:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:03.021 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:03.021 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:03.021 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.097 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.724 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.728 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.728 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.729 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.729 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.729 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.729 182717 DEBUG nova.virt.libvirt.driver [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.735 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.775 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.776 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040842.2887363, f1150657-3f68-4e2e-91ee-82eb9d65f0c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.776 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] VM Started (Lifecycle Event)
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.807 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.809 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.847 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:14:03 compute-1 nova_compute[182713]: 2026-01-22 00:14:03.876 182717 DEBUG nova.compute.manager [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:04 compute-1 nova_compute[182713]: 2026-01-22 00:14:04.028 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:04 compute-1 nova_compute[182713]: 2026-01-22 00:14:04.029 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:04 compute-1 nova_compute[182713]: 2026-01-22 00:14:04.029 182717 DEBUG nova.objects.instance [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:14:04 compute-1 nova_compute[182713]: 2026-01-22 00:14:04.139 182717 DEBUG oslo_concurrency.lockutils [None req-dc4fd216-8211-40f3-9ff8-c0f17e173bc7 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:05 compute-1 podman[230378]: 2026-01-22 00:14:05.595552646 +0000 UTC m=+0.073192311 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:14:05 compute-1 podman[230377]: 2026-01-22 00:14:05.638982341 +0000 UTC m=+0.129647966 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:14:08 compute-1 nova_compute[182713]: 2026-01-22 00:14:08.100 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:09 compute-1 podman[230424]: 2026-01-22 00:14:09.564636667 +0000 UTC m=+0.054546088 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:14:09 compute-1 podman[230425]: 2026-01-22 00:14:09.571193708 +0000 UTC m=+0.058444998 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:14:13 compute-1 nova_compute[182713]: 2026-01-22 00:14:13.103 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:14:13 compute-1 nova_compute[182713]: 2026-01-22 00:14:13.105 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:14:13 compute-1 nova_compute[182713]: 2026-01-22 00:14:13.105 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:14:13 compute-1 nova_compute[182713]: 2026-01-22 00:14:13.105 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:14:13 compute-1 nova_compute[182713]: 2026-01-22 00:14:13.117 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:13 compute-1 nova_compute[182713]: 2026-01-22 00:14:13.118 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:14:13 compute-1 nova_compute[182713]: 2026-01-22 00:14:13.985 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "f1150657-3f68-4e2e-91ee-82eb9d65f0c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:13 compute-1 nova_compute[182713]: 2026-01-22 00:14:13.986 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "f1150657-3f68-4e2e-91ee-82eb9d65f0c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:13 compute-1 nova_compute[182713]: 2026-01-22 00:14:13.987 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "f1150657-3f68-4e2e-91ee-82eb9d65f0c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:13 compute-1 nova_compute[182713]: 2026-01-22 00:14:13.987 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "f1150657-3f68-4e2e-91ee-82eb9d65f0c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:13 compute-1 nova_compute[182713]: 2026-01-22 00:14:13.988 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "f1150657-3f68-4e2e-91ee-82eb9d65f0c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:14 compute-1 nova_compute[182713]: 2026-01-22 00:14:14.004 182717 INFO nova.compute.manager [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Terminating instance
Jan 22 00:14:14 compute-1 nova_compute[182713]: 2026-01-22 00:14:14.021 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "refresh_cache-f1150657-3f68-4e2e-91ee-82eb9d65f0c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:14:14 compute-1 nova_compute[182713]: 2026-01-22 00:14:14.022 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquired lock "refresh_cache-f1150657-3f68-4e2e-91ee-82eb9d65f0c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:14:14 compute-1 nova_compute[182713]: 2026-01-22 00:14:14.022 182717 DEBUG nova.network.neutron [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:14:14 compute-1 nova_compute[182713]: 2026-01-22 00:14:14.426 182717 DEBUG nova.network.neutron [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:14:15 compute-1 nova_compute[182713]: 2026-01-22 00:14:15.102 182717 DEBUG nova.network.neutron [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:14:15 compute-1 nova_compute[182713]: 2026-01-22 00:14:15.126 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Releasing lock "refresh_cache-f1150657-3f68-4e2e-91ee-82eb9d65f0c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:14:15 compute-1 nova_compute[182713]: 2026-01-22 00:14:15.127 182717 DEBUG nova.compute.manager [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:14:15 compute-1 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000079.scope: Deactivated successfully.
Jan 22 00:14:15 compute-1 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000079.scope: Consumed 11.639s CPU time.
Jan 22 00:14:15 compute-1 systemd-machined[153970]: Machine qemu-56-instance-00000079 terminated.
Jan 22 00:14:15 compute-1 nova_compute[182713]: 2026-01-22 00:14:15.394 182717 INFO nova.virt.libvirt.driver [-] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Instance destroyed successfully.
Jan 22 00:14:15 compute-1 nova_compute[182713]: 2026-01-22 00:14:15.396 182717 DEBUG nova.objects.instance [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'resources' on Instance uuid f1150657-3f68-4e2e-91ee-82eb9d65f0c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:14:15 compute-1 nova_compute[182713]: 2026-01-22 00:14:15.450 182717 INFO nova.virt.libvirt.driver [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Deleting instance files /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1_del
Jan 22 00:14:15 compute-1 nova_compute[182713]: 2026-01-22 00:14:15.451 182717 INFO nova.virt.libvirt.driver [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Deletion of /var/lib/nova/instances/f1150657-3f68-4e2e-91ee-82eb9d65f0c1_del complete
Jan 22 00:14:15 compute-1 nova_compute[182713]: 2026-01-22 00:14:15.719 182717 INFO nova.compute.manager [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Took 0.59 seconds to destroy the instance on the hypervisor.
Jan 22 00:14:15 compute-1 nova_compute[182713]: 2026-01-22 00:14:15.720 182717 DEBUG oslo.service.loopingcall [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:14:15 compute-1 nova_compute[182713]: 2026-01-22 00:14:15.721 182717 DEBUG nova.compute.manager [-] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:14:15 compute-1 nova_compute[182713]: 2026-01-22 00:14:15.721 182717 DEBUG nova.network.neutron [-] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:14:17 compute-1 nova_compute[182713]: 2026-01-22 00:14:17.733 182717 DEBUG nova.network.neutron [-] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:14:17 compute-1 nova_compute[182713]: 2026-01-22 00:14:17.907 182717 DEBUG nova.network.neutron [-] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:14:17 compute-1 nova_compute[182713]: 2026-01-22 00:14:17.953 182717 INFO nova.compute.manager [-] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Took 2.23 seconds to deallocate network for instance.
Jan 22 00:14:18 compute-1 nova_compute[182713]: 2026-01-22 00:14:18.069 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:18 compute-1 nova_compute[182713]: 2026-01-22 00:14:18.070 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:18 compute-1 nova_compute[182713]: 2026-01-22 00:14:18.118 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:14:18 compute-1 nova_compute[182713]: 2026-01-22 00:14:18.120 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:18 compute-1 nova_compute[182713]: 2026-01-22 00:14:18.232 182717 DEBUG nova.compute.provider_tree [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:14:18 compute-1 nova_compute[182713]: 2026-01-22 00:14:18.304 182717 DEBUG nova.scheduler.client.report [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:14:18 compute-1 nova_compute[182713]: 2026-01-22 00:14:18.370 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:18 compute-1 nova_compute[182713]: 2026-01-22 00:14:18.436 182717 INFO nova.scheduler.client.report [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Deleted allocations for instance f1150657-3f68-4e2e-91ee-82eb9d65f0c1
Jan 22 00:14:19 compute-1 ovn_controller[94841]: 2026-01-22T00:14:19Z|00493|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 22 00:14:20 compute-1 podman[230486]: 2026-01-22 00:14:20.614417483 +0000 UTC m=+0.094661271 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:14:21 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:21.916 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:14:21 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:21.917 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:14:21 compute-1 nova_compute[182713]: 2026-01-22 00:14:21.917 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:21 compute-1 nova_compute[182713]: 2026-01-22 00:14:21.973 182717 DEBUG oslo_concurrency.lockutils [None req-108d9b6f-d44a-4d5c-b6bf-33b2e2b57950 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "f1150657-3f68-4e2e-91ee-82eb9d65f0c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.987s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.883 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'name': 'tempest-ServerShowV247Test-server-1680784711', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000078', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9e271bcd92754d2c96796024a51a55a3', 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'hostId': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.884 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.887 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.906 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.907 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de4ede89-8ce4-4a88-b426-4c9faf760dc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-vda', 'timestamp': '2026-01-22T00:14:22.888047', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4dc59d5c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.59528777, 'message_signature': 'b8fa110f599ce02eaa56f5f72eabf40f04ae9ead316713f168b82cabe5258441'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-sda', 'timestamp': '2026-01-22T00:14:22.888047', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4dc5b2b0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.59528777, 'message_signature': 'd96c413096ecdb092012598e1c9442274c82e2eb663f4cad9df2cadc101baf24'}]}, 'timestamp': '2026-01-22 00:14:22.907487', '_unique_id': '06ac009d20ba41a7985f69d909d6eca7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.910 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.911 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.932 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/memory.usage volume: 40.99609375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '078a363d-454d-4b6d-b077-78b0628923ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.99609375, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'timestamp': '2026-01-22T00:14:22.911396', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4dc9931c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.639355715, 'message_signature': 'ee40967fb3c2ca1db1a636fdfb694de1b8320faacdaae28a4d0cb9fd94c519a7'}]}, 'timestamp': '2026-01-22 00:14:22.932903', '_unique_id': '72feb8298be3407995af23baed564f3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.933 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.935 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.935 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-1680784711>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-1680784711>]
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.935 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.935 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.935 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-1680784711>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-1680784711>]
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.936 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.936 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.936 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.936 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/cpu volume: 11320000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32d24088-e243-4ee5-a9d5-c3ac360ad37d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11320000000, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'timestamp': '2026-01-22T00:14:22.936294', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4dca28e0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.639355715, 'message_signature': '20a489d840fde7f42852cf2c56ab7241a1fd3e2bb772e694ded1d5d5b62abea0'}]}, 'timestamp': '2026-01-22 00:14:22.936708', '_unique_id': '38f99d435015437c991dfbcf373fe45d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.937 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.938 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.938 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-1680784711>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-1680784711>]
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.938 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.938 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.939 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c14d3c3-f764-432d-baac-839362750417', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-vda', 'timestamp': '2026-01-22T00:14:22.938724', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4dca86a0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.59528777, 'message_signature': '29536e29a37cc72229ae320279f1b52258fac4fa9c010dd36997cd3d8371973e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-sda', 'timestamp': '2026-01-22T00:14:22.938724', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4dca910e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.59528777, 'message_signature': 'dc6fcfe0f53026e33319a9290bd6f4b38aa1b7f604d7d9036cb4bf49b4cfe010'}]}, 'timestamp': '2026-01-22 00:14:22.939291', '_unique_id': '388c3833a77a4a95b7f1e5d2d6b26ba9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.940 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.971 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.write.latency volume: 5010847205 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.971 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a55a3383-c2f7-4257-82fd-5b6f80dacd25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5010847205, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-vda', 'timestamp': '2026-01-22T00:14:22.941017', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4dcf85ec-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': 'd532c5a8b1c8e8082a494380a98d4334d73ccba52a16bfc6969d4ab8b8885c7c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-sda', 'timestamp': '2026-01-22T00:14:22.941017', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4dcf94e2-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': 'a2d15c484edc66248e982de67a5ab94a39716c427725dde2906376a8d17712f7'}]}, 'timestamp': '2026-01-22 00:14:22.972181', '_unique_id': '0374c3b2daab4a7c8740f99da525c8fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.973 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.974 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.974 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.read.bytes volume: 30968320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.974 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c3c3489-f0a6-4e4f-99f7-3705cfa69c05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30968320, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-vda', 'timestamp': '2026-01-22T00:14:22.974253', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4dcff1d0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': 'da50debfb25409e23d86dcb5e4a0aa709c4dacc4bc1f19c45c737cd057ca6faa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 
'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-sda', 'timestamp': '2026-01-22T00:14:22.974253', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4dcffc20-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': 'a428bdc12408ab93252a34d8f789181739cf970ad187994e55947e23b1748c35'}]}, 'timestamp': '2026-01-22 00:14:22.974806', '_unique_id': '86484c7c2d6441cd8aa36027c2960a26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.975 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.976 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.976 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.976 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.read.requests volume: 1125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.976 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd49af486-b3ed-4529-a487-e872547bdb3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1125, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-vda', 'timestamp': '2026-01-22T00:14:22.976446', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4dd047c0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': '358322d869b39a8d3b9f133e645b58f68a5c0b073baf1f72a21bcfff77970607'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': 
None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-sda', 'timestamp': '2026-01-22T00:14:22.976446', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4dd056d4-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': 'b6ef618c490d465855967571c6b89828191f79986dfdf0e11cec7d2027867a7d'}]}, 'timestamp': '2026-01-22 00:14:22.977138', '_unique_id': '3308aef92c374aa1a593331219e4527f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.977 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.978 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.978 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.978 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.978 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.write.requests volume: 317 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.979 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9341107-5b84-45fb-8381-b882f0feb845', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 317, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-vda', 'timestamp': '2026-01-22T00:14:22.978931', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4dd0a800-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': '3e5bfe94f97f7f585086899a7046299d96788e3e861dcfaddcb1dc354b04b6a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 
'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-sda', 'timestamp': '2026-01-22T00:14:22.978931', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4dd0b0a2-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': 'b43a8fcddb751147c71a3e9c548059cc37aeb2ebf7f4b16c164573478b2e4c52'}]}, 'timestamp': '2026-01-22 00:14:22.979422', '_unique_id': '718b314c8e6d413a86aef644dc1ed200'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.980 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.read.latency volume: 230445056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.read.latency volume: 27029534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ed29557-cba8-446e-a8ca-b42707456bfa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 230445056, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-vda', 'timestamp': '2026-01-22T00:14:22.980772', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4dd0ef72-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': 'c40178c8b5f936aa3813faca588be5951bdd00ac39aa121048a2dc23379b666d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27029534, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 
'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-sda', 'timestamp': '2026-01-22T00:14:22.980772', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4dd0f756-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': '9965b662ae9873201de72e13ffd3853c56b52aff3a35a29a62ddf8e52a891765'}]}, 'timestamp': '2026-01-22 00:14:22.981197', '_unique_id': 'b4e166e83f854f6aa22ec0d1f1ea2d1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.981 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.982 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.982 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.982 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-1680784711>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-1680784711>]
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.982 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.982 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea1d15fe-804a-48da-827c-5acda28049a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-vda', 'timestamp': '2026-01-22T00:14:22.982564', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4dd134be-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.59528777, 'message_signature': '7ad67aa484fd7b874228727530bd638803260f0f09dbdeb8b54a1764bb65fece'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 
'ad203d96-8f7f-4024-84b2-4bb2655b6395-sda', 'timestamp': '2026-01-22T00:14:22.982564', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4dd13d2e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.59528777, 'message_signature': '2ca3e8069d5e344c359d761c85be7da5d5f688da30357c4e53f6c0f43784503c'}]}, 'timestamp': '2026-01-22 00:14:22.982987', '_unique_id': '5d633c2c986b4268b8dbf5e711d6cd20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.983 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.984 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.984 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.write.bytes volume: 72953856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.984 12 DEBUG ceilometer.compute.pollsters [-] ad203d96-8f7f-4024-84b2-4bb2655b6395/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85c6f3f8-f2ee-4e6b-a433-d054c22b680e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72953856, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-vda', 'timestamp': '2026-01-22T00:14:22.984085', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4dd1706e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': '533e75b351dd1e6a57828f6105dbf138b0e73a9b459f3973a9f3784eaadc2148'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2e0aa78e0b4f4c11823d6a419634cab8', 'user_name': None, 'project_id': '9e271bcd92754d2c96796024a51a55a3', 'project_name': None, 
'resource_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395-sda', 'timestamp': '2026-01-22T00:14:22.984085', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-1680784711', 'name': 'instance-00000078', 'instance_id': 'ad203d96-8f7f-4024-84b2-4bb2655b6395', 'instance_type': 'm1.nano', 'host': '44f66483852de4982eef4ccf694502a1b028c418a0ff59357faf88aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4dd17af0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5435.648239719, 'message_signature': '8a9eaad72719836626b30c3c08deee29cd48f1bd71481a18e2d380664d9c0022'}]}, 'timestamp': '2026-01-22 00:14:22.984612', '_unique_id': 'bc16225c75f944afb06016c3243f6167'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:14:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:14:22.985 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:14:23 compute-1 nova_compute[182713]: 2026-01-22 00:14:23.121 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:23 compute-1 nova_compute[182713]: 2026-01-22 00:14:23.124 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:14:23 compute-1 podman[230507]: 2026-01-22 00:14:23.597139814 +0000 UTC m=+0.081413893 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, version=9.6, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350)
Jan 22 00:14:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:23.920 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:23 compute-1 nova_compute[182713]: 2026-01-22 00:14:23.934 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "ad203d96-8f7f-4024-84b2-4bb2655b6395" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:23 compute-1 nova_compute[182713]: 2026-01-22 00:14:23.935 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "ad203d96-8f7f-4024-84b2-4bb2655b6395" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:23 compute-1 nova_compute[182713]: 2026-01-22 00:14:23.935 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "ad203d96-8f7f-4024-84b2-4bb2655b6395-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:23 compute-1 nova_compute[182713]: 2026-01-22 00:14:23.936 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "ad203d96-8f7f-4024-84b2-4bb2655b6395-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:23 compute-1 nova_compute[182713]: 2026-01-22 00:14:23.936 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "ad203d96-8f7f-4024-84b2-4bb2655b6395-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:24 compute-1 nova_compute[182713]: 2026-01-22 00:14:24.081 182717 INFO nova.compute.manager [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Terminating instance
Jan 22 00:14:24 compute-1 nova_compute[182713]: 2026-01-22 00:14:24.144 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "refresh_cache-ad203d96-8f7f-4024-84b2-4bb2655b6395" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:14:24 compute-1 nova_compute[182713]: 2026-01-22 00:14:24.145 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquired lock "refresh_cache-ad203d96-8f7f-4024-84b2-4bb2655b6395" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:14:24 compute-1 nova_compute[182713]: 2026-01-22 00:14:24.145 182717 DEBUG nova.network.neutron [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:14:24 compute-1 nova_compute[182713]: 2026-01-22 00:14:24.717 182717 DEBUG nova.network.neutron [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:14:25 compute-1 nova_compute[182713]: 2026-01-22 00:14:25.669 182717 DEBUG nova.network.neutron [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:14:25 compute-1 nova_compute[182713]: 2026-01-22 00:14:25.711 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Releasing lock "refresh_cache-ad203d96-8f7f-4024-84b2-4bb2655b6395" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:14:25 compute-1 nova_compute[182713]: 2026-01-22 00:14:25.712 182717 DEBUG nova.compute.manager [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:14:25 compute-1 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000078.scope: Deactivated successfully.
Jan 22 00:14:25 compute-1 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000078.scope: Consumed 13.840s CPU time.
Jan 22 00:14:25 compute-1 systemd-machined[153970]: Machine qemu-54-instance-00000078 terminated.
Jan 22 00:14:25 compute-1 nova_compute[182713]: 2026-01-22 00:14:25.971 182717 INFO nova.virt.libvirt.driver [-] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Instance destroyed successfully.
Jan 22 00:14:25 compute-1 nova_compute[182713]: 2026-01-22 00:14:25.971 182717 DEBUG nova.objects.instance [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lazy-loading 'resources' on Instance uuid ad203d96-8f7f-4024-84b2-4bb2655b6395 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:14:25 compute-1 nova_compute[182713]: 2026-01-22 00:14:25.992 182717 INFO nova.virt.libvirt.driver [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Deleting instance files /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395_del
Jan 22 00:14:25 compute-1 nova_compute[182713]: 2026-01-22 00:14:25.993 182717 INFO nova.virt.libvirt.driver [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Deletion of /var/lib/nova/instances/ad203d96-8f7f-4024-84b2-4bb2655b6395_del complete
Jan 22 00:14:26 compute-1 nova_compute[182713]: 2026-01-22 00:14:26.143 182717 INFO nova.compute.manager [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 22 00:14:26 compute-1 nova_compute[182713]: 2026-01-22 00:14:26.144 182717 DEBUG oslo.service.loopingcall [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:14:26 compute-1 nova_compute[182713]: 2026-01-22 00:14:26.144 182717 DEBUG nova.compute.manager [-] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:14:26 compute-1 nova_compute[182713]: 2026-01-22 00:14:26.145 182717 DEBUG nova.network.neutron [-] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:14:27 compute-1 nova_compute[182713]: 2026-01-22 00:14:27.315 182717 DEBUG nova.network.neutron [-] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:14:27 compute-1 nova_compute[182713]: 2026-01-22 00:14:27.501 182717 DEBUG nova.network.neutron [-] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:14:27 compute-1 nova_compute[182713]: 2026-01-22 00:14:27.537 182717 INFO nova.compute.manager [-] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Took 1.39 seconds to deallocate network for instance.
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.102 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.103 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.122 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.124 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.124 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.124 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.169 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.170 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.210 182717 DEBUG nova.compute.provider_tree [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.247 182717 DEBUG nova.scheduler.client.report [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.348 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.420 182717 INFO nova.scheduler.client.report [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Deleted allocations for instance ad203d96-8f7f-4024-84b2-4bb2655b6395
Jan 22 00:14:28 compute-1 nova_compute[182713]: 2026-01-22 00:14:28.572 182717 DEBUG oslo_concurrency.lockutils [None req-8d7aea8d-1797-479d-a055-73717eca7525 2e0aa78e0b4f4c11823d6a419634cab8 9e271bcd92754d2c96796024a51a55a3 - - default default] Lock "ad203d96-8f7f-4024-84b2-4bb2655b6395" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:30 compute-1 nova_compute[182713]: 2026-01-22 00:14:30.393 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040855.3916056, f1150657-3f68-4e2e-91ee-82eb9d65f0c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:14:30 compute-1 nova_compute[182713]: 2026-01-22 00:14:30.393 182717 INFO nova.compute.manager [-] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] VM Stopped (Lifecycle Event)
Jan 22 00:14:30 compute-1 nova_compute[182713]: 2026-01-22 00:14:30.550 182717 DEBUG nova.compute.manager [None req-62f89106-9b1c-43b2-a3de-238ac380b883 - - - - - -] [instance: f1150657-3f68-4e2e-91ee-82eb9d65f0c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:32 compute-1 nova_compute[182713]: 2026-01-22 00:14:32.695 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:32 compute-1 nova_compute[182713]: 2026-01-22 00:14:32.696 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:32 compute-1 nova_compute[182713]: 2026-01-22 00:14:32.740 182717 DEBUG nova.compute.manager [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:14:32 compute-1 nova_compute[182713]: 2026-01-22 00:14:32.951 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:32 compute-1 nova_compute[182713]: 2026-01-22 00:14:32.952 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:32 compute-1 nova_compute[182713]: 2026-01-22 00:14:32.965 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:14:32 compute-1 nova_compute[182713]: 2026-01-22 00:14:32.965 182717 INFO nova.compute.claims [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.160 182717 DEBUG nova.compute.provider_tree [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.170 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.195 182717 DEBUG nova.scheduler.client.report [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.238 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.239 182717 DEBUG nova.compute.manager [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.406 182717 DEBUG nova.compute.manager [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.406 182717 DEBUG nova.network.neutron [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.452 182717 INFO nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.488 182717 DEBUG nova.compute.manager [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.701 182717 DEBUG nova.compute.manager [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.703 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.704 182717 INFO nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Creating image(s)
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.705 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.706 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.708 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.742 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.813 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.816 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.817 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.844 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.894 182717 DEBUG nova.policy [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.915 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.916 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.953 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.955 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:33 compute-1 nova_compute[182713]: 2026-01-22 00:14:33.955 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:34 compute-1 nova_compute[182713]: 2026-01-22 00:14:34.054 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:34 compute-1 nova_compute[182713]: 2026-01-22 00:14:34.055 182717 DEBUG nova.virt.disk.api [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:14:34 compute-1 nova_compute[182713]: 2026-01-22 00:14:34.056 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:34 compute-1 nova_compute[182713]: 2026-01-22 00:14:34.108 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:34 compute-1 nova_compute[182713]: 2026-01-22 00:14:34.110 182717 DEBUG nova.virt.disk.api [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:14:34 compute-1 nova_compute[182713]: 2026-01-22 00:14:34.110 182717 DEBUG nova.objects.instance [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 074fd360-328c-4903-a368-d3890c4a1075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:14:34 compute-1 nova_compute[182713]: 2026-01-22 00:14:34.147 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:14:34 compute-1 nova_compute[182713]: 2026-01-22 00:14:34.147 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Ensure instance console log exists: /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:14:34 compute-1 nova_compute[182713]: 2026-01-22 00:14:34.148 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:34 compute-1 nova_compute[182713]: 2026-01-22 00:14:34.148 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:34 compute-1 nova_compute[182713]: 2026-01-22 00:14:34.149 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:36 compute-1 podman[230553]: 2026-01-22 00:14:36.634891981 +0000 UTC m=+0.116599414 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:14:36 compute-1 podman[230552]: 2026-01-22 00:14:36.642546126 +0000 UTC m=+0.128228902 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:14:37 compute-1 nova_compute[182713]: 2026-01-22 00:14:37.504 182717 DEBUG nova.network.neutron [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Successfully created port: a06a78d5-548e-4a84-b918-197a54a79f44 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:14:38 compute-1 nova_compute[182713]: 2026-01-22 00:14:38.172 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:14:38 compute-1 nova_compute[182713]: 2026-01-22 00:14:38.174 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:14:38 compute-1 nova_compute[182713]: 2026-01-22 00:14:38.174 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:14:38 compute-1 nova_compute[182713]: 2026-01-22 00:14:38.174 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:14:38 compute-1 nova_compute[182713]: 2026-01-22 00:14:38.225 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:38 compute-1 nova_compute[182713]: 2026-01-22 00:14:38.226 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:14:39 compute-1 nova_compute[182713]: 2026-01-22 00:14:39.375 182717 DEBUG nova.network.neutron [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Successfully updated port: a06a78d5-548e-4a84-b918-197a54a79f44 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:14:39 compute-1 nova_compute[182713]: 2026-01-22 00:14:39.401 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:14:39 compute-1 nova_compute[182713]: 2026-01-22 00:14:39.401 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:14:39 compute-1 nova_compute[182713]: 2026-01-22 00:14:39.402 182717 DEBUG nova.network.neutron [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:14:39 compute-1 nova_compute[182713]: 2026-01-22 00:14:39.758 182717 DEBUG nova.compute.manager [req-ed89845c-1e2d-46f8-8b33-4e25a048cdc4 req-0d38ae3d-4e63-4664-b5f6-e5d557965426 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-changed-a06a78d5-548e-4a84-b918-197a54a79f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:14:39 compute-1 nova_compute[182713]: 2026-01-22 00:14:39.759 182717 DEBUG nova.compute.manager [req-ed89845c-1e2d-46f8-8b33-4e25a048cdc4 req-0d38ae3d-4e63-4664-b5f6-e5d557965426 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Refreshing instance network info cache due to event network-changed-a06a78d5-548e-4a84-b918-197a54a79f44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:14:39 compute-1 nova_compute[182713]: 2026-01-22 00:14:39.759 182717 DEBUG oslo_concurrency.lockutils [req-ed89845c-1e2d-46f8-8b33-4e25a048cdc4 req-0d38ae3d-4e63-4664-b5f6-e5d557965426 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:14:40 compute-1 nova_compute[182713]: 2026-01-22 00:14:40.163 182717 DEBUG nova.network.neutron [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:14:40 compute-1 podman[230603]: 2026-01-22 00:14:40.58894183 +0000 UTC m=+0.079104083 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 22 00:14:40 compute-1 podman[230604]: 2026-01-22 00:14:40.600436673 +0000 UTC m=+0.075665287 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:14:40 compute-1 nova_compute[182713]: 2026-01-22 00:14:40.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:40 compute-1 nova_compute[182713]: 2026-01-22 00:14:40.970 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040865.9681573, ad203d96-8f7f-4024-84b2-4bb2655b6395 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:14:40 compute-1 nova_compute[182713]: 2026-01-22 00:14:40.970 182717 INFO nova.compute.manager [-] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] VM Stopped (Lifecycle Event)
Jan 22 00:14:41 compute-1 nova_compute[182713]: 2026-01-22 00:14:41.013 182717 DEBUG nova.compute.manager [None req-eda911d3-8158-475f-b408-85a9ba450a4d - - - - - -] [instance: ad203d96-8f7f-4024-84b2-4bb2655b6395] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:41 compute-1 nova_compute[182713]: 2026-01-22 00:14:41.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.247 182717 DEBUG nova.network.neutron [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.289 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.289 182717 DEBUG nova.compute.manager [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Instance network_info: |[{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.289 182717 DEBUG oslo_concurrency.lockutils [req-ed89845c-1e2d-46f8-8b33-4e25a048cdc4 req-0d38ae3d-4e63-4664-b5f6-e5d557965426 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.290 182717 DEBUG nova.network.neutron [req-ed89845c-1e2d-46f8-8b33-4e25a048cdc4 req-0d38ae3d-4e63-4664-b5f6-e5d557965426 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Refreshing network info cache for port a06a78d5-548e-4a84-b918-197a54a79f44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.292 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Start _get_guest_xml network_info=[{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.297 182717 WARNING nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.301 182717 DEBUG nova.virt.libvirt.host [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.301 182717 DEBUG nova.virt.libvirt.host [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.305 182717 DEBUG nova.virt.libvirt.host [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.306 182717 DEBUG nova.virt.libvirt.host [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.307 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.307 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.307 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.307 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.307 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.308 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.308 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.308 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.308 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.309 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.309 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.310 182717 DEBUG nova.virt.hardware [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.318 182717 DEBUG nova.virt.libvirt.vif [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:14:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1083808564',display_name='tempest-TestNetworkBasicOps-server-1083808564',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1083808564',id=125,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM6+EaxgHLeoG7CHqeSWXpXKv4K84dI1QQQjZdcrX0T7kqXBTlhE22YQjJTUFxToUxfZEI27WRcAtCoqb6CCdLa4/l//5Lw6nNA8ZjjrkKnh18RWLjWeeCbEVEj1FdVuCQ==',key_name='tempest-TestNetworkBasicOps-2002692629',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-5bbetpco',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:14:33Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=074fd360-328c-4903-a368-d3890c4a1075,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.319 182717 DEBUG nova.network.os_vif_util [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.321 182717 DEBUG nova.network.os_vif_util [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:83:df,bridge_name='br-int',has_traffic_filtering=True,id=a06a78d5-548e-4a84-b918-197a54a79f44,network=Network(89b3c74e-a4f2-4889-901d-aba21eee4bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa06a78d5-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.324 182717 DEBUG nova.objects.instance [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 074fd360-328c-4903-a368-d3890c4a1075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.356 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:14:42 compute-1 nova_compute[182713]:   <uuid>074fd360-328c-4903-a368-d3890c4a1075</uuid>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   <name>instance-0000007d</name>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <nova:name>tempest-TestNetworkBasicOps-server-1083808564</nova:name>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:14:42</nova:creationTime>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:14:42 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:14:42 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:14:42 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:14:42 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:14:42 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:14:42 compute-1 nova_compute[182713]:         <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:14:42 compute-1 nova_compute[182713]:         <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:14:42 compute-1 nova_compute[182713]:         <nova:port uuid="a06a78d5-548e-4a84-b918-197a54a79f44">
Jan 22 00:14:42 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <system>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <entry name="serial">074fd360-328c-4903-a368-d3890c4a1075</entry>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <entry name="uuid">074fd360-328c-4903-a368-d3890c4a1075</entry>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     </system>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   <os>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   </os>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   <features>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   </features>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk.config"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:3e:83:df"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <target dev="tapa06a78d5-54"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/console.log" append="off"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <video>
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     </video>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:14:42 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:14:42 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:14:42 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:14:42 compute-1 nova_compute[182713]: </domain>
Jan 22 00:14:42 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.357 182717 DEBUG nova.compute.manager [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Preparing to wait for external event network-vif-plugged-a06a78d5-548e-4a84-b918-197a54a79f44 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.357 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.357 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.358 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.358 182717 DEBUG nova.virt.libvirt.vif [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:14:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1083808564',display_name='tempest-TestNetworkBasicOps-server-1083808564',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1083808564',id=125,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM6+EaxgHLeoG7CHqeSWXpXKv4K84dI1QQQjZdcrX0T7kqXBTlhE22YQjJTUFxToUxfZEI27WRcAtCoqb6CCdLa4/l//5Lw6nNA8ZjjrkKnh18RWLjWeeCbEVEj1FdVuCQ==',key_name='tempest-TestNetworkBasicOps-2002692629',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-5bbetpco',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:14:33Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=074fd360-328c-4903-a368-d3890c4a1075,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.359 182717 DEBUG nova.network.os_vif_util [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.359 182717 DEBUG nova.network.os_vif_util [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:83:df,bridge_name='br-int',has_traffic_filtering=True,id=a06a78d5-548e-4a84-b918-197a54a79f44,network=Network(89b3c74e-a4f2-4889-901d-aba21eee4bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa06a78d5-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.360 182717 DEBUG os_vif [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:83:df,bridge_name='br-int',has_traffic_filtering=True,id=a06a78d5-548e-4a84-b918-197a54a79f44,network=Network(89b3c74e-a4f2-4889-901d-aba21eee4bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa06a78d5-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.360 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.361 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.361 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.370 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.371 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa06a78d5-54, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.372 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa06a78d5-54, col_values=(('external_ids', {'iface-id': 'a06a78d5-548e-4a84-b918-197a54a79f44', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:83:df', 'vm-uuid': '074fd360-328c-4903-a368-d3890c4a1075'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.374 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.378 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:14:42 compute-1 NetworkManager[54952]: <info>  [1769040882.3812] manager: (tapa06a78d5-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.384 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.384 182717 INFO os_vif [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:83:df,bridge_name='br-int',has_traffic_filtering=True,id=a06a78d5-548e-4a84-b918-197a54a79f44,network=Network(89b3c74e-a4f2-4889-901d-aba21eee4bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa06a78d5-54')
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.468 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.469 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.469 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:3e:83:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:14:42 compute-1 nova_compute[182713]: 2026-01-22 00:14:42.470 182717 INFO nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Using config drive
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.228 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.363 182717 INFO nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Creating config drive at /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk.config
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.368 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuemxyod7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.501 182717 DEBUG oslo_concurrency.processutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuemxyod7" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:43 compute-1 kernel: tapa06a78d5-54: entered promiscuous mode
Jan 22 00:14:43 compute-1 NetworkManager[54952]: <info>  [1769040883.5894] manager: (tapa06a78d5-54): new Tun device (/org/freedesktop/NetworkManager/Devices/226)
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.593 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:43 compute-1 ovn_controller[94841]: 2026-01-22T00:14:43Z|00494|binding|INFO|Claiming lport a06a78d5-548e-4a84-b918-197a54a79f44 for this chassis.
Jan 22 00:14:43 compute-1 ovn_controller[94841]: 2026-01-22T00:14:43Z|00495|binding|INFO|a06a78d5-548e-4a84-b918-197a54a79f44: Claiming fa:16:3e:3e:83:df 10.100.0.3
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.598 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.602 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.606 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.618 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:83:df 10.100.0.3'], port_security=['fa:16:3e:3e:83:df 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '074fd360-328c-4903-a368-d3890c4a1075', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89b3c74e-a4f2-4889-901d-aba21eee4bda', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7b431dee-6ff2-4ce1-b240-ed1059a68730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=011f79f2-8e1f-476b-a77e-56d133ce3969, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=a06a78d5-548e-4a84-b918-197a54a79f44) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.619 104184 INFO neutron.agent.ovn.metadata.agent [-] Port a06a78d5-548e-4a84-b918-197a54a79f44 in datapath 89b3c74e-a4f2-4889-901d-aba21eee4bda bound to our chassis
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.621 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89b3c74e-a4f2-4889-901d-aba21eee4bda
Jan 22 00:14:43 compute-1 systemd-udevd[230665]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.638 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6dfa1ac4-1aa0-4772-b024-86d345ef46d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.640 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89b3c74e-a1 in ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:14:43 compute-1 systemd-machined[153970]: New machine qemu-57-instance-0000007d.
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.642 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89b3c74e-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.642 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[427f0ff0-ec6a-4f75-a18e-8ca20a603ddb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.643 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[180ff710-a903-4c78-8796-2a340cf053f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 NetworkManager[54952]: <info>  [1769040883.6501] device (tapa06a78d5-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:14:43 compute-1 NetworkManager[54952]: <info>  [1769040883.6512] device (tapa06a78d5-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.658 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[9186f53e-7a8b-44d7-9edc-abc4a939972b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_controller[94841]: 2026-01-22T00:14:43Z|00496|binding|INFO|Setting lport a06a78d5-548e-4a84-b918-197a54a79f44 ovn-installed in OVS
Jan 22 00:14:43 compute-1 ovn_controller[94841]: 2026-01-22T00:14:43Z|00497|binding|INFO|Setting lport a06a78d5-548e-4a84-b918-197a54a79f44 up in Southbound
Jan 22 00:14:43 compute-1 systemd[1]: Started Virtual Machine qemu-57-instance-0000007d.
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.678 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.685 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[90cec198-eb3d-4564-93e7-a6e5fe7f1dc4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.720 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7e579f2d-3707-4ab5-a73e-fcb236b919ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.725 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2bbfd310-7201-4e36-95ba-4d8e029c3bb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 NetworkManager[54952]: <info>  [1769040883.7269] manager: (tap89b3c74e-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/227)
Jan 22 00:14:43 compute-1 systemd-udevd[230670]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.758 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[bbbca7ca-4521-47ed-a752-22e3b85ef03d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.762 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ca2ac088-90af-47b7-a66f-e31cdc1e4e7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 NetworkManager[54952]: <info>  [1769040883.7873] device (tap89b3c74e-a0): carrier: link connected
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.794 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[42596429-1571-4c08-85ee-55ee958b1081]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.814 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bad35b44-0e15-47f0-83c8-1123f2e3fdb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89b3c74e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:32:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545643, 'reachable_time': 40829, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230699, 'error': None, 'target': 'ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.835 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d35059b6-769e-4933-b161-2b7ca259af0a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:32c6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545643, 'tstamp': 545643}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230700, 'error': None, 'target': 'ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.854 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[77bb4c05-fc11-47cc-b5a8-5f0587ac9638]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89b3c74e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:32:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545643, 'reachable_time': 40829, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230701, 'error': None, 'target': 'ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.889 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe74ab2-1df2-4bfb-9a46-f61698814b7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.953 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[afc2853f-e4ec-40a2-b95a-2350db5e7e9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.955 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89b3c74e-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.955 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.956 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89b3c74e-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.958 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:43 compute-1 NetworkManager[54952]: <info>  [1769040883.9596] manager: (tap89b3c74e-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Jan 22 00:14:43 compute-1 kernel: tap89b3c74e-a0: entered promiscuous mode
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.960 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.963 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89b3c74e-a0, col_values=(('external_ids', {'iface-id': '6df2a6eb-cd08-41f6-b95a-bc0711ab706f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.964 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:43 compute-1 ovn_controller[94841]: 2026-01-22T00:14:43Z|00498|binding|INFO|Releasing lport 6df2a6eb-cd08-41f6-b95a-bc0711ab706f from this chassis (sb_readonly=0)
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.965 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89b3c74e-a4f2-4889-901d-aba21eee4bda.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89b3c74e-a4f2-4889-901d-aba21eee4bda.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.966 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[68c2d56f-9bee-4d66-8a46-1043a48624be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.967 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-89b3c74e-a4f2-4889-901d-aba21eee4bda
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/89b3c74e-a4f2-4889-901d-aba21eee4bda.pid.haproxy
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 89b3c74e-a4f2-4889-901d-aba21eee4bda
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:14:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:14:43.967 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda', 'env', 'PROCESS_TAG=haproxy-89b3c74e-a4f2-4889-901d-aba21eee4bda', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89b3c74e-a4f2-4889-901d-aba21eee4bda.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.981 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.996 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040883.9960463, 074fd360-328c-4903-a368-d3890c4a1075 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:14:43 compute-1 nova_compute[182713]: 2026-01-22 00:14:43.997 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] VM Started (Lifecycle Event)
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.123 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.127 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040883.9971786, 074fd360-328c-4903-a368-d3890c4a1075 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.128 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] VM Paused (Lifecycle Event)
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.185 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.190 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:14:44 compute-1 sshd-session[230718]: Connection closed by 45.148.10.240 port 32906
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.312 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:14:44 compute-1 podman[230741]: 2026-01-22 00:14:44.318803396 +0000 UTC m=+0.050045739 container create 6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 00:14:44 compute-1 systemd[1]: Started libpod-conmon-6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716.scope.
Jan 22 00:14:44 compute-1 podman[230741]: 2026-01-22 00:14:44.291462426 +0000 UTC m=+0.022704759 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:14:44 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:14:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e2ac5dc7176e09e8f0962c01089c8533476a1e2c9cc9c7f0f58336d68c1b569/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:14:44 compute-1 podman[230741]: 2026-01-22 00:14:44.433570114 +0000 UTC m=+0.164812507 container init 6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:14:44 compute-1 podman[230741]: 2026-01-22 00:14:44.443972214 +0000 UTC m=+0.175214547 container start 6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:14:44 compute-1 neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda[230757]: [NOTICE]   (230761) : New worker (230763) forked
Jan 22 00:14:44 compute-1 neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda[230757]: [NOTICE]   (230761) : Loading success.
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.613 182717 DEBUG nova.compute.manager [req-0616fe11-8155-4c82-af13-d64a31e73693 req-aa9ad790-6b6f-4161-b031-3a7280bddaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-vif-plugged-a06a78d5-548e-4a84-b918-197a54a79f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.616 182717 DEBUG oslo_concurrency.lockutils [req-0616fe11-8155-4c82-af13-d64a31e73693 req-aa9ad790-6b6f-4161-b031-3a7280bddaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.617 182717 DEBUG oslo_concurrency.lockutils [req-0616fe11-8155-4c82-af13-d64a31e73693 req-aa9ad790-6b6f-4161-b031-3a7280bddaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.617 182717 DEBUG oslo_concurrency.lockutils [req-0616fe11-8155-4c82-af13-d64a31e73693 req-aa9ad790-6b6f-4161-b031-3a7280bddaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.618 182717 DEBUG nova.compute.manager [req-0616fe11-8155-4c82-af13-d64a31e73693 req-aa9ad790-6b6f-4161-b031-3a7280bddaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Processing event network-vif-plugged-a06a78d5-548e-4a84-b918-197a54a79f44 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.619 182717 DEBUG nova.compute.manager [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.625 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040884.6249683, 074fd360-328c-4903-a368-d3890c4a1075 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.626 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] VM Resumed (Lifecycle Event)
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.630 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.634 182717 INFO nova.virt.libvirt.driver [-] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Instance spawned successfully.
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.634 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.957 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.962 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.962 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.963 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.963 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.966 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.970 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.971 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.972 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.972 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.973 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:44 compute-1 nova_compute[182713]: 2026-01-22 00:14:44.973 182717 DEBUG nova.virt.libvirt.driver [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.356 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.611 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.648 182717 INFO nova.compute.manager [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Took 11.95 seconds to spawn the instance on the hypervisor.
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.650 182717 DEBUG nova.compute.manager [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.672 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.673 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.739 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.827 182717 INFO nova.compute.manager [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Took 12.93 seconds to build instance.
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.864 182717 DEBUG oslo_concurrency.lockutils [None req-0735ad42-80f2-49a3-9023-d8a899287d4f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.884 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.885 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5640MB free_disk=73.26079940795898GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.982 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 074fd360-328c-4903-a368-d3890c4a1075 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.983 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:14:45 compute-1 nova_compute[182713]: 2026-01-22 00:14:45.983 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.012 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.035 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.036 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.054 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.096 182717 DEBUG nova.network.neutron [req-ed89845c-1e2d-46f8-8b33-4e25a048cdc4 req-0d38ae3d-4e63-4664-b5f6-e5d557965426 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updated VIF entry in instance network info cache for port a06a78d5-548e-4a84-b918-197a54a79f44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.097 182717 DEBUG nova.network.neutron [req-ed89845c-1e2d-46f8-8b33-4e25a048cdc4 req-0d38ae3d-4e63-4664-b5f6-e5d557965426 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.109 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.123 182717 DEBUG oslo_concurrency.lockutils [req-ed89845c-1e2d-46f8-8b33-4e25a048cdc4 req-0d38ae3d-4e63-4664-b5f6-e5d557965426 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.231 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.248 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.272 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:14:46 compute-1 nova_compute[182713]: 2026-01-22 00:14:46.273 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:47 compute-1 nova_compute[182713]: 2026-01-22 00:14:47.267 182717 DEBUG nova.compute.manager [req-a28e3b7b-cc9f-4325-bb8e-8beb05c5e779 req-ff67813f-acd5-4d00-ab2a-35bffba0a391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-vif-plugged-a06a78d5-548e-4a84-b918-197a54a79f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:14:47 compute-1 nova_compute[182713]: 2026-01-22 00:14:47.268 182717 DEBUG oslo_concurrency.lockutils [req-a28e3b7b-cc9f-4325-bb8e-8beb05c5e779 req-ff67813f-acd5-4d00-ab2a-35bffba0a391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:47 compute-1 nova_compute[182713]: 2026-01-22 00:14:47.269 182717 DEBUG oslo_concurrency.lockutils [req-a28e3b7b-cc9f-4325-bb8e-8beb05c5e779 req-ff67813f-acd5-4d00-ab2a-35bffba0a391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:47 compute-1 nova_compute[182713]: 2026-01-22 00:14:47.269 182717 DEBUG oslo_concurrency.lockutils [req-a28e3b7b-cc9f-4325-bb8e-8beb05c5e779 req-ff67813f-acd5-4d00-ab2a-35bffba0a391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:47 compute-1 nova_compute[182713]: 2026-01-22 00:14:47.270 182717 DEBUG nova.compute.manager [req-a28e3b7b-cc9f-4325-bb8e-8beb05c5e779 req-ff67813f-acd5-4d00-ab2a-35bffba0a391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] No waiting events found dispatching network-vif-plugged-a06a78d5-548e-4a84-b918-197a54a79f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:14:47 compute-1 nova_compute[182713]: 2026-01-22 00:14:47.270 182717 WARNING nova.compute.manager [req-a28e3b7b-cc9f-4325-bb8e-8beb05c5e779 req-ff67813f-acd5-4d00-ab2a-35bffba0a391 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received unexpected event network-vif-plugged-a06a78d5-548e-4a84-b918-197a54a79f44 for instance with vm_state active and task_state None.
Jan 22 00:14:47 compute-1 nova_compute[182713]: 2026-01-22 00:14:47.271 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:47 compute-1 nova_compute[182713]: 2026-01-22 00:14:47.271 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:47 compute-1 nova_compute[182713]: 2026-01-22 00:14:47.375 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:48 compute-1 nova_compute[182713]: 2026-01-22 00:14:48.230 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:49 compute-1 nova_compute[182713]: 2026-01-22 00:14:49.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:49 compute-1 nova_compute[182713]: 2026-01-22 00:14:49.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:14:50 compute-1 nova_compute[182713]: 2026-01-22 00:14:50.442 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:14:50 compute-1 nova_compute[182713]: 2026-01-22 00:14:50.443 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:50 compute-1 nova_compute[182713]: 2026-01-22 00:14:50.443 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:51 compute-1 podman[230779]: 2026-01-22 00:14:51.567260628 +0000 UTC m=+0.060999465 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:14:52 compute-1 nova_compute[182713]: 2026-01-22 00:14:52.378 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:53 compute-1 nova_compute[182713]: 2026-01-22 00:14:53.233 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:54 compute-1 podman[230800]: 2026-01-22 00:14:54.601139696 +0000 UTC m=+0.090962888 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Jan 22 00:14:57 compute-1 ovn_controller[94841]: 2026-01-22T00:14:57Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3e:83:df 10.100.0.3
Jan 22 00:14:57 compute-1 ovn_controller[94841]: 2026-01-22T00:14:57Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3e:83:df 10.100.0.3
Jan 22 00:14:57 compute-1 nova_compute[182713]: 2026-01-22 00:14:57.384 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:58 compute-1 nova_compute[182713]: 2026-01-22 00:14:58.235 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:59 compute-1 nova_compute[182713]: 2026-01-22 00:14:59.049 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:59 compute-1 NetworkManager[54952]: <info>  [1769040899.0508] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Jan 22 00:14:59 compute-1 NetworkManager[54952]: <info>  [1769040899.0519] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Jan 22 00:14:59 compute-1 nova_compute[182713]: 2026-01-22 00:14:59.182 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:59 compute-1 ovn_controller[94841]: 2026-01-22T00:14:59Z|00499|binding|INFO|Releasing lport 6df2a6eb-cd08-41f6-b95a-bc0711ab706f from this chassis (sb_readonly=0)
Jan 22 00:14:59 compute-1 nova_compute[182713]: 2026-01-22 00:14:59.196 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:00 compute-1 nova_compute[182713]: 2026-01-22 00:15:00.196 182717 DEBUG nova.compute.manager [req-db175b18-7e9e-4b6b-8806-e9fa566c09c0 req-45af25a3-2cc9-428a-9577-db2b9f801ada 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-changed-a06a78d5-548e-4a84-b918-197a54a79f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:00 compute-1 nova_compute[182713]: 2026-01-22 00:15:00.197 182717 DEBUG nova.compute.manager [req-db175b18-7e9e-4b6b-8806-e9fa566c09c0 req-45af25a3-2cc9-428a-9577-db2b9f801ada 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Refreshing instance network info cache due to event network-changed-a06a78d5-548e-4a84-b918-197a54a79f44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:15:00 compute-1 nova_compute[182713]: 2026-01-22 00:15:00.197 182717 DEBUG oslo_concurrency.lockutils [req-db175b18-7e9e-4b6b-8806-e9fa566c09c0 req-45af25a3-2cc9-428a-9577-db2b9f801ada 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:15:00 compute-1 nova_compute[182713]: 2026-01-22 00:15:00.197 182717 DEBUG oslo_concurrency.lockutils [req-db175b18-7e9e-4b6b-8806-e9fa566c09c0 req-45af25a3-2cc9-428a-9577-db2b9f801ada 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:15:00 compute-1 nova_compute[182713]: 2026-01-22 00:15:00.198 182717 DEBUG nova.network.neutron [req-db175b18-7e9e-4b6b-8806-e9fa566c09c0 req-45af25a3-2cc9-428a-9577-db2b9f801ada 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Refreshing network info cache for port a06a78d5-548e-4a84-b918-197a54a79f44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:15:02 compute-1 nova_compute[182713]: 2026-01-22 00:15:02.301 182717 DEBUG nova.network.neutron [req-db175b18-7e9e-4b6b-8806-e9fa566c09c0 req-45af25a3-2cc9-428a-9577-db2b9f801ada 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updated VIF entry in instance network info cache for port a06a78d5-548e-4a84-b918-197a54a79f44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:15:02 compute-1 nova_compute[182713]: 2026-01-22 00:15:02.302 182717 DEBUG nova.network.neutron [req-db175b18-7e9e-4b6b-8806-e9fa566c09c0 req-45af25a3-2cc9-428a-9577-db2b9f801ada 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:15:02 compute-1 nova_compute[182713]: 2026-01-22 00:15:02.386 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:02 compute-1 nova_compute[182713]: 2026-01-22 00:15:02.726 182717 INFO nova.compute.manager [None req-b6accc39-93bf-4d95-b73b-b399d292fb46 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Get console output
Jan 22 00:15:02 compute-1 nova_compute[182713]: 2026-01-22 00:15:02.729 182717 DEBUG oslo_concurrency.lockutils [req-db175b18-7e9e-4b6b-8806-e9fa566c09c0 req-45af25a3-2cc9-428a-9577-db2b9f801ada 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:15:02 compute-1 nova_compute[182713]: 2026-01-22 00:15:02.737 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:15:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:03.022 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:03.022 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:03.024 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:03 compute-1 nova_compute[182713]: 2026-01-22 00:15:03.238 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:07 compute-1 nova_compute[182713]: 2026-01-22 00:15:07.391 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:07 compute-1 podman[230839]: 2026-01-22 00:15:07.592168907 +0000 UTC m=+0.066093792 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:15:07 compute-1 podman[230838]: 2026-01-22 00:15:07.610833191 +0000 UTC m=+0.090558655 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Jan 22 00:15:08 compute-1 nova_compute[182713]: 2026-01-22 00:15:08.269 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:08 compute-1 nova_compute[182713]: 2026-01-22 00:15:08.902 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:11 compute-1 podman[230887]: 2026-01-22 00:15:11.611372297 +0000 UTC m=+0.095370852 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:15:11 compute-1 podman[230888]: 2026-01-22 00:15:11.614980437 +0000 UTC m=+0.095560068 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:15:11 compute-1 nova_compute[182713]: 2026-01-22 00:15:11.825 182717 DEBUG oslo_concurrency.lockutils [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "interface-074fd360-328c-4903-a368-d3890c4a1075-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:11 compute-1 nova_compute[182713]: 2026-01-22 00:15:11.825 182717 DEBUG oslo_concurrency.lockutils [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "interface-074fd360-328c-4903-a368-d3890c4a1075-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:11 compute-1 nova_compute[182713]: 2026-01-22 00:15:11.826 182717 DEBUG nova.objects.instance [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'flavor' on Instance uuid 074fd360-328c-4903-a368-d3890c4a1075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:12 compute-1 nova_compute[182713]: 2026-01-22 00:15:12.396 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:13 compute-1 nova_compute[182713]: 2026-01-22 00:15:13.205 182717 DEBUG nova.objects.instance [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_requests' on Instance uuid 074fd360-328c-4903-a368-d3890c4a1075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:13 compute-1 nova_compute[182713]: 2026-01-22 00:15:13.247 182717 DEBUG nova.network.neutron [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:15:13 compute-1 nova_compute[182713]: 2026-01-22 00:15:13.271 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:13 compute-1 nova_compute[182713]: 2026-01-22 00:15:13.759 182717 DEBUG nova.policy [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:15:14 compute-1 nova_compute[182713]: 2026-01-22 00:15:14.746 182717 DEBUG nova.network.neutron [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Successfully created port: 4230f4cb-ad57-407e-90c1-b2441b67e135 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:15:14 compute-1 nova_compute[182713]: 2026-01-22 00:15:14.752 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:15 compute-1 nova_compute[182713]: 2026-01-22 00:15:15.906 182717 DEBUG nova.network.neutron [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Successfully updated port: 4230f4cb-ad57-407e-90c1-b2441b67e135 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:15:15 compute-1 nova_compute[182713]: 2026-01-22 00:15:15.919 182717 DEBUG oslo_concurrency.lockutils [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:15:15 compute-1 nova_compute[182713]: 2026-01-22 00:15:15.920 182717 DEBUG oslo_concurrency.lockutils [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:15:15 compute-1 nova_compute[182713]: 2026-01-22 00:15:15.920 182717 DEBUG nova.network.neutron [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:15:16 compute-1 nova_compute[182713]: 2026-01-22 00:15:16.236 182717 DEBUG nova.compute.manager [req-f4762cfb-72d7-47d5-8f6a-64b6ea5a125f req-fca99bdf-8c09-44d3-aef0-ccad84818966 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-changed-4230f4cb-ad57-407e-90c1-b2441b67e135 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:16 compute-1 nova_compute[182713]: 2026-01-22 00:15:16.237 182717 DEBUG nova.compute.manager [req-f4762cfb-72d7-47d5-8f6a-64b6ea5a125f req-fca99bdf-8c09-44d3-aef0-ccad84818966 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Refreshing instance network info cache due to event network-changed-4230f4cb-ad57-407e-90c1-b2441b67e135. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:15:16 compute-1 nova_compute[182713]: 2026-01-22 00:15:16.238 182717 DEBUG oslo_concurrency.lockutils [req-f4762cfb-72d7-47d5-8f6a-64b6ea5a125f req-fca99bdf-8c09-44d3-aef0-ccad84818966 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:15:17 compute-1 nova_compute[182713]: 2026-01-22 00:15:17.400 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:18 compute-1 nova_compute[182713]: 2026-01-22 00:15:18.273 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.110 182717 DEBUG nova.network.neutron [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.322 182717 DEBUG oslo_concurrency.lockutils [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.324 182717 DEBUG oslo_concurrency.lockutils [req-f4762cfb-72d7-47d5-8f6a-64b6ea5a125f req-fca99bdf-8c09-44d3-aef0-ccad84818966 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.324 182717 DEBUG nova.network.neutron [req-f4762cfb-72d7-47d5-8f6a-64b6ea5a125f req-fca99bdf-8c09-44d3-aef0-ccad84818966 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Refreshing network info cache for port 4230f4cb-ad57-407e-90c1-b2441b67e135 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.331 182717 DEBUG nova.virt.libvirt.vif [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:14:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1083808564',display_name='tempest-TestNetworkBasicOps-server-1083808564',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1083808564',id=125,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM6+EaxgHLeoG7CHqeSWXpXKv4K84dI1QQQjZdcrX0T7kqXBTlhE22YQjJTUFxToUxfZEI27WRcAtCoqb6CCdLa4/l//5Lw6nNA8ZjjrkKnh18RWLjWeeCbEVEj1FdVuCQ==',key_name='tempest-TestNetworkBasicOps-2002692629',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:14:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-5bbetpco',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:14:45Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=074fd360-328c-4903-a368-d3890c4a1075,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.331 182717 DEBUG nova.network.os_vif_util [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.333 182717 DEBUG nova.network.os_vif_util [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.334 182717 DEBUG os_vif [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.335 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.336 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.336 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.341 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.342 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4230f4cb-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.342 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4230f4cb-ad, col_values=(('external_ids', {'iface-id': '4230f4cb-ad57-407e-90c1-b2441b67e135', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:53:e7', 'vm-uuid': '074fd360-328c-4903-a368-d3890c4a1075'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:20 compute-1 NetworkManager[54952]: <info>  [1769040920.3467] manager: (tap4230f4cb-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.347 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.350 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.353 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.354 182717 INFO os_vif [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad')
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.355 182717 DEBUG nova.virt.libvirt.vif [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:14:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1083808564',display_name='tempest-TestNetworkBasicOps-server-1083808564',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1083808564',id=125,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM6+EaxgHLeoG7CHqeSWXpXKv4K84dI1QQQjZdcrX0T7kqXBTlhE22YQjJTUFxToUxfZEI27WRcAtCoqb6CCdLa4/l//5Lw6nNA8ZjjrkKnh18RWLjWeeCbEVEj1FdVuCQ==',key_name='tempest-TestNetworkBasicOps-2002692629',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:14:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-5bbetpco',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:14:45Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=074fd360-328c-4903-a368-d3890c4a1075,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.356 182717 DEBUG nova.network.os_vif_util [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.357 182717 DEBUG nova.network.os_vif_util [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.360 182717 DEBUG nova.virt.libvirt.guest [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] attach device xml: <interface type="ethernet">
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:ce:53:e7"/>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <target dev="tap4230f4cb-ad"/>
Jan 22 00:15:20 compute-1 nova_compute[182713]: </interface>
Jan 22 00:15:20 compute-1 nova_compute[182713]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 22 00:15:20 compute-1 kernel: tap4230f4cb-ad: entered promiscuous mode
Jan 22 00:15:20 compute-1 NetworkManager[54952]: <info>  [1769040920.3791] manager: (tap4230f4cb-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/232)
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.383 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:20 compute-1 ovn_controller[94841]: 2026-01-22T00:15:20Z|00500|binding|INFO|Claiming lport 4230f4cb-ad57-407e-90c1-b2441b67e135 for this chassis.
Jan 22 00:15:20 compute-1 ovn_controller[94841]: 2026-01-22T00:15:20Z|00501|binding|INFO|4230f4cb-ad57-407e-90c1-b2441b67e135: Claiming fa:16:3e:ce:53:e7 10.100.0.29
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.390 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.396 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:53:e7 10.100.0.29'], port_security=['fa:16:3e:ce:53:e7 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '074fd360-328c-4903-a368-d3890c4a1075', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbad5d67-48fa-4452-9764-37918af5f722', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '86c286e4-25eb-4f2e-b5ff-30677cfd8882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac57e7f-dd25-45c3-9d9a-56463ac0e2d3, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=4230f4cb-ad57-407e-90c1-b2441b67e135) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.398 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 4230f4cb-ad57-407e-90c1-b2441b67e135 in datapath dbad5d67-48fa-4452-9764-37918af5f722 bound to our chassis
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.401 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dbad5d67-48fa-4452-9764-37918af5f722
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.417 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[267934f8-af00-4dc6-be84-f4dc720c4e56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.418 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdbad5d67-41 in ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.425 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdbad5d67-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.426 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5a81ae43-098f-4f72-a96e-3d10e26350df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.427 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[616d4d9d-adbc-49ad-923d-2d2255ae380b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 systemd-udevd[230936]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.438 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.439 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[b69d8f6d-8265-4a68-b56d-1ec8f0485716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_controller[94841]: 2026-01-22T00:15:20Z|00502|binding|INFO|Setting lport 4230f4cb-ad57-407e-90c1-b2441b67e135 ovn-installed in OVS
Jan 22 00:15:20 compute-1 ovn_controller[94841]: 2026-01-22T00:15:20Z|00503|binding|INFO|Setting lport 4230f4cb-ad57-407e-90c1-b2441b67e135 up in Southbound
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.442 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:20 compute-1 NetworkManager[54952]: <info>  [1769040920.4518] device (tap4230f4cb-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:15:20 compute-1 NetworkManager[54952]: <info>  [1769040920.4526] device (tap4230f4cb-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.467 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7efad561-8442-4912-8580-8066315579d2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.501 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e896adbf-224c-4ead-a1f2-a63b5d259f69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.511 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[384bd921-8144-47dd-97de-b8b86f3ad205]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 NetworkManager[54952]: <info>  [1769040920.5133] manager: (tapdbad5d67-40): new Veth device (/org/freedesktop/NetworkManager/Devices/233)
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.514 182717 DEBUG nova.virt.libvirt.driver [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.514 182717 DEBUG nova.virt.libvirt.driver [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:20 compute-1 systemd-udevd[230940]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.515 182717 DEBUG nova.virt.libvirt.driver [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:3e:83:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.515 182717 DEBUG nova.virt.libvirt.driver [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:ce:53:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.543 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[493cc265-0a0d-4d87-b562-56b76bd9d74a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.546 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b49877-9056-456a-bb36-2436e49a256e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.555 182717 DEBUG nova.virt.libvirt.guest [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <nova:name>tempest-TestNetworkBasicOps-server-1083808564</nova:name>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-22 00:15:20</nova:creationTime>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 22 00:15:20 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 22 00:15:20 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 22 00:15:20 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 22 00:15:20 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:15:20 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <nova:owner>
Jan 22 00:15:20 compute-1 nova_compute[182713]:     <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:15:20 compute-1 nova_compute[182713]:     <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   </nova:owner>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   <nova:ports>
Jan 22 00:15:20 compute-1 nova_compute[182713]:     <nova:port uuid="a06a78d5-548e-4a84-b918-197a54a79f44">
Jan 22 00:15:20 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:15:20 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:15:20 compute-1 nova_compute[182713]:     <nova:port uuid="4230f4cb-ad57-407e-90c1-b2441b67e135">
Jan 22 00:15:20 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Jan 22 00:15:20 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:15:20 compute-1 nova_compute[182713]:   </nova:ports>
Jan 22 00:15:20 compute-1 nova_compute[182713]: </nova:instance>
Jan 22 00:15:20 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 00:15:20 compute-1 NetworkManager[54952]: <info>  [1769040920.5695] device (tapdbad5d67-40): carrier: link connected
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.576 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[428a167c-7019-47bf-816b-281968f43991]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.595 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[53c40a6e-d643-4ed2-9f7e-fdf5db1bfb51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdbad5d67-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:10:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549322, 'reachable_time': 36159, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230964, 'error': None, 'target': 'ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.617 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e53f586b-4465-491c-9cc9-47703bfcee06]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:103f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549322, 'tstamp': 549322}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230965, 'error': None, 'target': 'ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.640 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[25931044-6017-4229-9319-4551daaaa624]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdbad5d67-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:10:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549322, 'reachable_time': 36159, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230966, 'error': None, 'target': 'ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.687 182717 DEBUG oslo_concurrency.lockutils [None req-b91ccd02-28f0-4694-b5d5-b1026bac8ba8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "interface-074fd360-328c-4903-a368-d3890c4a1075-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.861s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.686 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[597472a8-0198-467d-a842-e583a028f795]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.769 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fe853eaa-e7da-4d60-a987-e0a3fe13c889]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.771 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdbad5d67-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.772 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.773 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdbad5d67-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:20 compute-1 NetworkManager[54952]: <info>  [1769040920.7761] manager: (tapdbad5d67-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Jan 22 00:15:20 compute-1 kernel: tapdbad5d67-40: entered promiscuous mode
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.775 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.780 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdbad5d67-40, col_values=(('external_ids', {'iface-id': 'bc0feb73-1fe9-487e-8603-9846dd682590'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:20 compute-1 ovn_controller[94841]: 2026-01-22T00:15:20Z|00504|binding|INFO|Releasing lport bc0feb73-1fe9-487e-8603-9846dd682590 from this chassis (sb_readonly=0)
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.782 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.784 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dbad5d67-48fa-4452-9764-37918af5f722.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dbad5d67-48fa-4452-9764-37918af5f722.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.785 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a408a7-5e59-499b-aa10-c563ab2df667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.785 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-dbad5d67-48fa-4452-9764-37918af5f722
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/dbad5d67-48fa-4452-9764-37918af5f722.pid.haproxy
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID dbad5d67-48fa-4452-9764-37918af5f722
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:15:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:20.786 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722', 'env', 'PROCESS_TAG=haproxy-dbad5d67-48fa-4452-9764-37918af5f722', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dbad5d67-48fa-4452-9764-37918af5f722.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:15:20 compute-1 nova_compute[182713]: 2026-01-22 00:15:20.793 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:21 compute-1 podman[230998]: 2026-01-22 00:15:21.213117327 +0000 UTC m=+0.075760759 container create b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 00:15:21 compute-1 systemd[1]: Started libpod-conmon-b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e.scope.
Jan 22 00:15:21 compute-1 podman[230998]: 2026-01-22 00:15:21.161593893 +0000 UTC m=+0.024237305 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:15:21 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:15:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0926e005cb225d188add7f7a75b0d7a78697ff2107a2eae8b5dccf1ea26fc81/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:15:21 compute-1 podman[230998]: 2026-01-22 00:15:21.314547085 +0000 UTC m=+0.177190547 container init b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:15:21 compute-1 podman[230998]: 2026-01-22 00:15:21.324721337 +0000 UTC m=+0.187364759 container start b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 00:15:21 compute-1 neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722[231014]: [NOTICE]   (231018) : New worker (231020) forked
Jan 22 00:15:21 compute-1 neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722[231014]: [NOTICE]   (231018) : Loading success.
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.377 182717 DEBUG nova.compute.manager [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-vif-plugged-4230f4cb-ad57-407e-90c1-b2441b67e135 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.378 182717 DEBUG oslo_concurrency.lockutils [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.378 182717 DEBUG oslo_concurrency.lockutils [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.379 182717 DEBUG oslo_concurrency.lockutils [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.379 182717 DEBUG nova.compute.manager [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] No waiting events found dispatching network-vif-plugged-4230f4cb-ad57-407e-90c1-b2441b67e135 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.380 182717 WARNING nova.compute.manager [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received unexpected event network-vif-plugged-4230f4cb-ad57-407e-90c1-b2441b67e135 for instance with vm_state active and task_state None.
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.380 182717 DEBUG nova.compute.manager [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-vif-plugged-4230f4cb-ad57-407e-90c1-b2441b67e135 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.381 182717 DEBUG oslo_concurrency.lockutils [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.381 182717 DEBUG oslo_concurrency.lockutils [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.382 182717 DEBUG oslo_concurrency.lockutils [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.382 182717 DEBUG nova.compute.manager [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] No waiting events found dispatching network-vif-plugged-4230f4cb-ad57-407e-90c1-b2441b67e135 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.383 182717 WARNING nova.compute.manager [req-3d8adb6a-90e3-4c4a-9770-9e3bfdaaeb29 req-bc72e59c-b4de-4f91-b353-629778687913 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received unexpected event network-vif-plugged-4230f4cb-ad57-407e-90c1-b2441b67e135 for instance with vm_state active and task_state None.
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.642 182717 DEBUG nova.network.neutron [req-f4762cfb-72d7-47d5-8f6a-64b6ea5a125f req-fca99bdf-8c09-44d3-aef0-ccad84818966 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updated VIF entry in instance network info cache for port 4230f4cb-ad57-407e-90c1-b2441b67e135. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.643 182717 DEBUG nova.network.neutron [req-f4762cfb-72d7-47d5-8f6a-64b6ea5a125f req-fca99bdf-8c09-44d3-aef0-ccad84818966 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.671 182717 DEBUG oslo_concurrency.lockutils [req-f4762cfb-72d7-47d5-8f6a-64b6ea5a125f req-fca99bdf-8c09-44d3-aef0-ccad84818966 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:15:21 compute-1 ovn_controller[94841]: 2026-01-22T00:15:21Z|00505|binding|INFO|Releasing lport bc0feb73-1fe9-487e-8603-9846dd682590 from this chassis (sb_readonly=0)
Jan 22 00:15:21 compute-1 ovn_controller[94841]: 2026-01-22T00:15:21Z|00506|binding|INFO|Releasing lport 6df2a6eb-cd08-41f6-b95a-bc0711ab706f from this chassis (sb_readonly=0)
Jan 22 00:15:21 compute-1 nova_compute[182713]: 2026-01-22 00:15:21.992 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:22 compute-1 nova_compute[182713]: 2026-01-22 00:15:22.507 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:22.507 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:15:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:22.509 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:15:22 compute-1 podman[231029]: 2026-01-22 00:15:22.593629528 +0000 UTC m=+0.079581408 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:15:22 compute-1 ovn_controller[94841]: 2026-01-22T00:15:22Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:53:e7 10.100.0.29
Jan 22 00:15:22 compute-1 ovn_controller[94841]: 2026-01-22T00:15:22Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:53:e7 10.100.0.29
Jan 22 00:15:23 compute-1 nova_compute[182713]: 2026-01-22 00:15:23.321 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:25 compute-1 nova_compute[182713]: 2026-01-22 00:15:25.347 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:25 compute-1 podman[231049]: 2026-01-22 00:15:25.564829917 +0000 UTC m=+0.060656675 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Jan 22 00:15:28 compute-1 nova_compute[182713]: 2026-01-22 00:15:28.324 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:28 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:28.512 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:30 compute-1 nova_compute[182713]: 2026-01-22 00:15:30.351 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.447 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.448 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.465 182717 DEBUG nova.compute.manager [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.601 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.602 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.613 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.613 182717 INFO nova.compute.claims [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.810 182717 DEBUG nova.compute.provider_tree [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.834 182717 DEBUG nova.scheduler.client.report [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.860 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.861 182717 DEBUG nova.compute.manager [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.933 182717 DEBUG nova.compute.manager [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.934 182717 DEBUG nova.network.neutron [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.968 182717 INFO nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:15:31 compute-1 nova_compute[182713]: 2026-01-22 00:15:31.993 182717 DEBUG nova.compute.manager [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.231 182717 DEBUG nova.compute.manager [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.234 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.235 182717 INFO nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Creating image(s)
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.236 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.237 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.238 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.268 182717 DEBUG nova.policy [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.273 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.367 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.369 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.371 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.400 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.489 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.491 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.539 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.540 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.541 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.625 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.627 182717 DEBUG nova.virt.disk.api [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.627 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.723 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.724 182717 DEBUG nova.virt.disk.api [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.725 182717 DEBUG nova.objects.instance [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 6763a0f8-8485-4d4a-8418-5b095f3a20ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.748 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.749 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Ensure instance console log exists: /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.749 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.750 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:32 compute-1 nova_compute[182713]: 2026-01-22 00:15:32.750 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:33 compute-1 nova_compute[182713]: 2026-01-22 00:15:33.327 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:33 compute-1 nova_compute[182713]: 2026-01-22 00:15:33.758 182717 DEBUG nova.network.neutron [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Successfully created port: 48d53d0a-d386-45be-8c0d-a38d9b22332f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:15:34 compute-1 nova_compute[182713]: 2026-01-22 00:15:34.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:35 compute-1 nova_compute[182713]: 2026-01-22 00:15:35.017 182717 DEBUG nova.network.neutron [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Successfully updated port: 48d53d0a-d386-45be-8c0d-a38d9b22332f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:15:35 compute-1 nova_compute[182713]: 2026-01-22 00:15:35.038 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-6763a0f8-8485-4d4a-8418-5b095f3a20ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:15:35 compute-1 nova_compute[182713]: 2026-01-22 00:15:35.038 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-6763a0f8-8485-4d4a-8418-5b095f3a20ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:15:35 compute-1 nova_compute[182713]: 2026-01-22 00:15:35.038 182717 DEBUG nova.network.neutron [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:15:35 compute-1 nova_compute[182713]: 2026-01-22 00:15:35.181 182717 DEBUG nova.compute.manager [req-2515c7c0-3f7f-4db2-b384-b3e6fb12b08b req-22658bd6-b200-4c94-aa7d-1a798514a898 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Received event network-changed-48d53d0a-d386-45be-8c0d-a38d9b22332f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:35 compute-1 nova_compute[182713]: 2026-01-22 00:15:35.182 182717 DEBUG nova.compute.manager [req-2515c7c0-3f7f-4db2-b384-b3e6fb12b08b req-22658bd6-b200-4c94-aa7d-1a798514a898 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Refreshing instance network info cache due to event network-changed-48d53d0a-d386-45be-8c0d-a38d9b22332f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:15:35 compute-1 nova_compute[182713]: 2026-01-22 00:15:35.182 182717 DEBUG oslo_concurrency.lockutils [req-2515c7c0-3f7f-4db2-b384-b3e6fb12b08b req-22658bd6-b200-4c94-aa7d-1a798514a898 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6763a0f8-8485-4d4a-8418-5b095f3a20ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:15:35 compute-1 nova_compute[182713]: 2026-01-22 00:15:35.260 182717 DEBUG nova.network.neutron [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:15:35 compute-1 nova_compute[182713]: 2026-01-22 00:15:35.353 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.541 182717 DEBUG nova.network.neutron [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Updating instance_info_cache with network_info: [{"id": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "address": "fa:16:3e:82:84:b0", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d53d0a-d3", "ovs_interfaceid": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.658 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-6763a0f8-8485-4d4a-8418-5b095f3a20ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.658 182717 DEBUG nova.compute.manager [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Instance network_info: |[{"id": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "address": "fa:16:3e:82:84:b0", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d53d0a-d3", "ovs_interfaceid": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.661 182717 DEBUG oslo_concurrency.lockutils [req-2515c7c0-3f7f-4db2-b384-b3e6fb12b08b req-22658bd6-b200-4c94-aa7d-1a798514a898 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6763a0f8-8485-4d4a-8418-5b095f3a20ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.661 182717 DEBUG nova.network.neutron [req-2515c7c0-3f7f-4db2-b384-b3e6fb12b08b req-22658bd6-b200-4c94-aa7d-1a798514a898 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Refreshing network info cache for port 48d53d0a-d386-45be-8c0d-a38d9b22332f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.664 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Start _get_guest_xml network_info=[{"id": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "address": "fa:16:3e:82:84:b0", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d53d0a-d3", "ovs_interfaceid": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.668 182717 WARNING nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.680 182717 DEBUG nova.virt.libvirt.host [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.680 182717 DEBUG nova.virt.libvirt.host [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.684 182717 DEBUG nova.virt.libvirt.host [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.685 182717 DEBUG nova.virt.libvirt.host [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.686 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.687 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.687 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.687 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.688 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.688 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.688 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.688 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.689 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.689 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.689 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.689 182717 DEBUG nova.virt.hardware [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.693 182717 DEBUG nova.virt.libvirt.vif [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:15:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1485052167',display_name='tempest-TestNetworkBasicOps-server-1485052167',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1485052167',id=130,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+rF74Q+n92wqGfvgmI7dqElp9oBY9n3ay6yBNRhjKVM1q89wQOOSpGhz6I7UWGPqqFjzw4hVpFrSPhKUDzrlCaL5V6NlG7zI66zllUGkpU6xzTEQWK85FJJIBjQeVN4A==',key_name='tempest-TestNetworkBasicOps-287611394',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-4wpsiaxz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:32Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6763a0f8-8485-4d4a-8418-5b095f3a20ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "address": "fa:16:3e:82:84:b0", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d53d0a-d3", "ovs_interfaceid": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.693 182717 DEBUG nova.network.os_vif_util [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "address": "fa:16:3e:82:84:b0", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d53d0a-d3", "ovs_interfaceid": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.694 182717 DEBUG nova.network.os_vif_util [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:84:b0,bridge_name='br-int',has_traffic_filtering=True,id=48d53d0a-d386-45be-8c0d-a38d9b22332f,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d53d0a-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.695 182717 DEBUG nova.objects.instance [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 6763a0f8-8485-4d4a-8418-5b095f3a20ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.710 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:15:36 compute-1 nova_compute[182713]:   <uuid>6763a0f8-8485-4d4a-8418-5b095f3a20ed</uuid>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   <name>instance-00000082</name>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <nova:name>tempest-TestNetworkBasicOps-server-1485052167</nova:name>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:15:36</nova:creationTime>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:15:36 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:15:36 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:15:36 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:15:36 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:15:36 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:15:36 compute-1 nova_compute[182713]:         <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:15:36 compute-1 nova_compute[182713]:         <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:15:36 compute-1 nova_compute[182713]:         <nova:port uuid="48d53d0a-d386-45be-8c0d-a38d9b22332f">
Jan 22 00:15:36 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <system>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <entry name="serial">6763a0f8-8485-4d4a-8418-5b095f3a20ed</entry>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <entry name="uuid">6763a0f8-8485-4d4a-8418-5b095f3a20ed</entry>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     </system>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   <os>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   </os>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   <features>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   </features>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk.config"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:82:84:b0"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <target dev="tap48d53d0a-d3"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/console.log" append="off"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <video>
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     </video>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:15:36 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:15:36 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:15:36 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:15:36 compute-1 nova_compute[182713]: </domain>
Jan 22 00:15:36 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.711 182717 DEBUG nova.compute.manager [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Preparing to wait for external event network-vif-plugged-48d53d0a-d386-45be-8c0d-a38d9b22332f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.711 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.712 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.712 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.713 182717 DEBUG nova.virt.libvirt.vif [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:15:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1485052167',display_name='tempest-TestNetworkBasicOps-server-1485052167',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1485052167',id=130,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+rF74Q+n92wqGfvgmI7dqElp9oBY9n3ay6yBNRhjKVM1q89wQOOSpGhz6I7UWGPqqFjzw4hVpFrSPhKUDzrlCaL5V6NlG7zI66zllUGkpU6xzTEQWK85FJJIBjQeVN4A==',key_name='tempest-TestNetworkBasicOps-287611394',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-4wpsiaxz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:32Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6763a0f8-8485-4d4a-8418-5b095f3a20ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "address": "fa:16:3e:82:84:b0", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d53d0a-d3", "ovs_interfaceid": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.713 182717 DEBUG nova.network.os_vif_util [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "address": "fa:16:3e:82:84:b0", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d53d0a-d3", "ovs_interfaceid": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.713 182717 DEBUG nova.network.os_vif_util [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:84:b0,bridge_name='br-int',has_traffic_filtering=True,id=48d53d0a-d386-45be-8c0d-a38d9b22332f,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d53d0a-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.714 182717 DEBUG os_vif [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:84:b0,bridge_name='br-int',has_traffic_filtering=True,id=48d53d0a-d386-45be-8c0d-a38d9b22332f,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d53d0a-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.714 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.715 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.715 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.718 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.719 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48d53d0a-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.719 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap48d53d0a-d3, col_values=(('external_ids', {'iface-id': '48d53d0a-d386-45be-8c0d-a38d9b22332f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:84:b0', 'vm-uuid': '6763a0f8-8485-4d4a-8418-5b095f3a20ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.721 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:36 compute-1 NetworkManager[54952]: <info>  [1769040936.7237] manager: (tap48d53d0a-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.725 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.734 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.736 182717 INFO os_vif [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:84:b0,bridge_name='br-int',has_traffic_filtering=True,id=48d53d0a-d386-45be-8c0d-a38d9b22332f,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d53d0a-d3')
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.813 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.814 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.814 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:82:84:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:15:36 compute-1 nova_compute[182713]: 2026-01-22 00:15:36.815 182717 INFO nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Using config drive
Jan 22 00:15:37 compute-1 nova_compute[182713]: 2026-01-22 00:15:37.883 182717 INFO nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Creating config drive at /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk.config
Jan 22 00:15:37 compute-1 nova_compute[182713]: 2026-01-22 00:15:37.893 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzbs18_rz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.038 182717 DEBUG oslo_concurrency.processutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzbs18_rz" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:38 compute-1 NetworkManager[54952]: <info>  [1769040938.1600] manager: (tap48d53d0a-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/236)
Jan 22 00:15:38 compute-1 kernel: tap48d53d0a-d3: entered promiscuous mode
Jan 22 00:15:38 compute-1 ovn_controller[94841]: 2026-01-22T00:15:38Z|00507|binding|INFO|Claiming lport 48d53d0a-d386-45be-8c0d-a38d9b22332f for this chassis.
Jan 22 00:15:38 compute-1 ovn_controller[94841]: 2026-01-22T00:15:38Z|00508|binding|INFO|48d53d0a-d386-45be-8c0d-a38d9b22332f: Claiming fa:16:3e:82:84:b0 10.100.0.24
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.202 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.210 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:84:b0 10.100.0.24'], port_security=['fa:16:3e:82:84:b0 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '6763a0f8-8485-4d4a-8418-5b095f3a20ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbad5d67-48fa-4452-9764-37918af5f722', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4c1742f2-7c74-46cd-9079-742776973d57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac57e7f-dd25-45c3-9d9a-56463ac0e2d3, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=48d53d0a-d386-45be-8c0d-a38d9b22332f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.212 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 48d53d0a-d386-45be-8c0d-a38d9b22332f in datapath dbad5d67-48fa-4452-9764-37918af5f722 bound to our chassis
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.216 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dbad5d67-48fa-4452-9764-37918af5f722
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.223 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:38 compute-1 systemd-udevd[231121]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:15:38 compute-1 ovn_controller[94841]: 2026-01-22T00:15:38Z|00509|binding|INFO|Setting lport 48d53d0a-d386-45be-8c0d-a38d9b22332f ovn-installed in OVS
Jan 22 00:15:38 compute-1 ovn_controller[94841]: 2026-01-22T00:15:38Z|00510|binding|INFO|Setting lport 48d53d0a-d386-45be-8c0d-a38d9b22332f up in Southbound
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.231 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:38 compute-1 systemd-machined[153970]: New machine qemu-58-instance-00000082.
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.238 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[aefbfe3e-d52a-4f76-9ea6-d30f1dda804b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:38 compute-1 NetworkManager[54952]: <info>  [1769040938.2456] device (tap48d53d0a-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:15:38 compute-1 NetworkManager[54952]: <info>  [1769040938.2464] device (tap48d53d0a-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:15:38 compute-1 systemd[1]: Started Virtual Machine qemu-58-instance-00000082.
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.275 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[04965adc-5653-4ecb-ba5e-e71e35388daf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.279 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5f8039-ad18-42ed-bb50-6c758a87bf62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:38 compute-1 podman[231098]: 2026-01-22 00:15:38.282502155 +0000 UTC m=+0.128466209 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:15:38 compute-1 podman[231097]: 2026-01-22 00:15:38.307409671 +0000 UTC m=+0.160844825 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.310 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f1a9dc26-99b7-4a5a-b807-3079842550fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.326 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6307fc63-5915-4e98-8259-cce7d963a100]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdbad5d67-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:10:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549322, 'reachable_time': 36159, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231165, 'error': None, 'target': 'ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.330 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.344 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[041ca77a-f7a9-45a5-8c49-a4fac958f142]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapdbad5d67-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549337, 'tstamp': 549337}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231167, 'error': None, 'target': 'ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdbad5d67-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549341, 'tstamp': 549341}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231167, 'error': None, 'target': 'ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.346 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdbad5d67-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.347 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.348 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.349 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdbad5d67-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.349 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.350 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdbad5d67-40, col_values=(('external_ids', {'iface-id': 'bc0feb73-1fe9-487e-8603-9846dd682590'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:15:38.350 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.520 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040938.5197208, 6763a0f8-8485-4d4a-8418-5b095f3a20ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.521 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] VM Started (Lifecycle Event)
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.547 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.552 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040938.5198703, 6763a0f8-8485-4d4a-8418-5b095f3a20ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.553 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] VM Paused (Lifecycle Event)
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.571 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.575 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.594 182717 DEBUG nova.compute.manager [req-8269c1ae-88e6-4495-a614-d96ad9575fb0 req-eb08531a-97be-49d7-a3e0-658d4c27222f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Received event network-vif-plugged-48d53d0a-d386-45be-8c0d-a38d9b22332f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.594 182717 DEBUG oslo_concurrency.lockutils [req-8269c1ae-88e6-4495-a614-d96ad9575fb0 req-eb08531a-97be-49d7-a3e0-658d4c27222f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.595 182717 DEBUG oslo_concurrency.lockutils [req-8269c1ae-88e6-4495-a614-d96ad9575fb0 req-eb08531a-97be-49d7-a3e0-658d4c27222f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.595 182717 DEBUG oslo_concurrency.lockutils [req-8269c1ae-88e6-4495-a614-d96ad9575fb0 req-eb08531a-97be-49d7-a3e0-658d4c27222f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.595 182717 DEBUG nova.compute.manager [req-8269c1ae-88e6-4495-a614-d96ad9575fb0 req-eb08531a-97be-49d7-a3e0-658d4c27222f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Processing event network-vif-plugged-48d53d0a-d386-45be-8c0d-a38d9b22332f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.596 182717 DEBUG nova.compute.manager [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.597 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.600 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040938.6004138, 6763a0f8-8485-4d4a-8418-5b095f3a20ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.600 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] VM Resumed (Lifecycle Event)
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.603 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.606 182717 INFO nova.virt.libvirt.driver [-] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Instance spawned successfully.
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.606 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.625 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.632 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.635 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.635 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.635 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.635 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.636 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.636 182717 DEBUG nova.virt.libvirt.driver [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.670 182717 DEBUG nova.network.neutron [req-2515c7c0-3f7f-4db2-b384-b3e6fb12b08b req-22658bd6-b200-4c94-aa7d-1a798514a898 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Updated VIF entry in instance network info cache for port 48d53d0a-d386-45be-8c0d-a38d9b22332f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.670 182717 DEBUG nova.network.neutron [req-2515c7c0-3f7f-4db2-b384-b3e6fb12b08b req-22658bd6-b200-4c94-aa7d-1a798514a898 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Updating instance_info_cache with network_info: [{"id": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "address": "fa:16:3e:82:84:b0", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d53d0a-d3", "ovs_interfaceid": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.674 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.708 182717 DEBUG oslo_concurrency.lockutils [req-2515c7c0-3f7f-4db2-b384-b3e6fb12b08b req-22658bd6-b200-4c94-aa7d-1a798514a898 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6763a0f8-8485-4d4a-8418-5b095f3a20ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.739 182717 INFO nova.compute.manager [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Took 6.51 seconds to spawn the instance on the hypervisor.
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.739 182717 DEBUG nova.compute.manager [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.852 182717 INFO nova.compute.manager [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Took 7.29 seconds to build instance.
Jan 22 00:15:38 compute-1 nova_compute[182713]: 2026-01-22 00:15:38.884 182717 DEBUG oslo_concurrency.lockutils [None req-852fef58-4b1d-485a-b835-1ad595f9cfa6 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.436s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:39 compute-1 nova_compute[182713]: 2026-01-22 00:15:39.057 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:39 compute-1 ovn_controller[94841]: 2026-01-22T00:15:39Z|00511|binding|INFO|Releasing lport bc0feb73-1fe9-487e-8603-9846dd682590 from this chassis (sb_readonly=0)
Jan 22 00:15:39 compute-1 ovn_controller[94841]: 2026-01-22T00:15:39Z|00512|binding|INFO|Releasing lport 6df2a6eb-cd08-41f6-b95a-bc0711ab706f from this chassis (sb_readonly=0)
Jan 22 00:15:39 compute-1 nova_compute[182713]: 2026-01-22 00:15:39.358 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:40 compute-1 nova_compute[182713]: 2026-01-22 00:15:40.804 182717 DEBUG nova.compute.manager [req-a5c4aca5-c56d-4acf-97f9-c7382fe3a6e4 req-9778c4c7-4f80-400a-9102-a4b5f13e4f00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Received event network-vif-plugged-48d53d0a-d386-45be-8c0d-a38d9b22332f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:40 compute-1 nova_compute[182713]: 2026-01-22 00:15:40.805 182717 DEBUG oslo_concurrency.lockutils [req-a5c4aca5-c56d-4acf-97f9-c7382fe3a6e4 req-9778c4c7-4f80-400a-9102-a4b5f13e4f00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:40 compute-1 nova_compute[182713]: 2026-01-22 00:15:40.805 182717 DEBUG oslo_concurrency.lockutils [req-a5c4aca5-c56d-4acf-97f9-c7382fe3a6e4 req-9778c4c7-4f80-400a-9102-a4b5f13e4f00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:40 compute-1 nova_compute[182713]: 2026-01-22 00:15:40.805 182717 DEBUG oslo_concurrency.lockutils [req-a5c4aca5-c56d-4acf-97f9-c7382fe3a6e4 req-9778c4c7-4f80-400a-9102-a4b5f13e4f00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:40 compute-1 nova_compute[182713]: 2026-01-22 00:15:40.806 182717 DEBUG nova.compute.manager [req-a5c4aca5-c56d-4acf-97f9-c7382fe3a6e4 req-9778c4c7-4f80-400a-9102-a4b5f13e4f00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] No waiting events found dispatching network-vif-plugged-48d53d0a-d386-45be-8c0d-a38d9b22332f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:15:40 compute-1 nova_compute[182713]: 2026-01-22 00:15:40.806 182717 WARNING nova.compute.manager [req-a5c4aca5-c56d-4acf-97f9-c7382fe3a6e4 req-9778c4c7-4f80-400a-9102-a4b5f13e4f00 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Received unexpected event network-vif-plugged-48d53d0a-d386-45be-8c0d-a38d9b22332f for instance with vm_state active and task_state None.
Jan 22 00:15:40 compute-1 nova_compute[182713]: 2026-01-22 00:15:40.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:41 compute-1 nova_compute[182713]: 2026-01-22 00:15:41.723 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:42 compute-1 podman[231176]: 2026-01-22 00:15:42.604571335 +0000 UTC m=+0.076029748 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:15:42 compute-1 podman[231177]: 2026-01-22 00:15:42.61321379 +0000 UTC m=+0.089482341 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:15:42 compute-1 nova_compute[182713]: 2026-01-22 00:15:42.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:42 compute-1 nova_compute[182713]: 2026-01-22 00:15:42.878 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:43 compute-1 nova_compute[182713]: 2026-01-22 00:15:43.331 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:43 compute-1 ovn_controller[94841]: 2026-01-22T00:15:43Z|00513|binding|INFO|Releasing lport bc0feb73-1fe9-487e-8603-9846dd682590 from this chassis (sb_readonly=0)
Jan 22 00:15:43 compute-1 ovn_controller[94841]: 2026-01-22T00:15:43Z|00514|binding|INFO|Releasing lport 6df2a6eb-cd08-41f6-b95a-bc0711ab706f from this chassis (sb_readonly=0)
Jan 22 00:15:43 compute-1 nova_compute[182713]: 2026-01-22 00:15:43.952 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:44 compute-1 nova_compute[182713]: 2026-01-22 00:15:44.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:44 compute-1 nova_compute[182713]: 2026-01-22 00:15:44.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:15:45 compute-1 nova_compute[182713]: 2026-01-22 00:15:45.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:45 compute-1 nova_compute[182713]: 2026-01-22 00:15:45.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:46 compute-1 nova_compute[182713]: 2026-01-22 00:15:46.724 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:48 compute-1 nova_compute[182713]: 2026-01-22 00:15:48.334 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:48 compute-1 nova_compute[182713]: 2026-01-22 00:15:48.635 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:48 compute-1 nova_compute[182713]: 2026-01-22 00:15:48.636 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:48 compute-1 nova_compute[182713]: 2026-01-22 00:15:48.637 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:48 compute-1 nova_compute[182713]: 2026-01-22 00:15:48.637 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.103 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.200 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.202 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.316 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk --force-share --output=json" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.324 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.397 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.400 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.485 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.704 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.706 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5361MB free_disk=73.230224609375GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.706 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:49 compute-1 nova_compute[182713]: 2026-01-22 00:15:49.707 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:50 compute-1 ovn_controller[94841]: 2026-01-22T00:15:50Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:84:b0 10.100.0.24
Jan 22 00:15:50 compute-1 ovn_controller[94841]: 2026-01-22T00:15:50Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:84:b0 10.100.0.24
Jan 22 00:15:50 compute-1 nova_compute[182713]: 2026-01-22 00:15:50.272 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 074fd360-328c-4903-a368-d3890c4a1075 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:15:50 compute-1 nova_compute[182713]: 2026-01-22 00:15:50.273 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 6763a0f8-8485-4d4a-8418-5b095f3a20ed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:15:50 compute-1 nova_compute[182713]: 2026-01-22 00:15:50.273 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:15:50 compute-1 nova_compute[182713]: 2026-01-22 00:15:50.274 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:15:50 compute-1 nova_compute[182713]: 2026-01-22 00:15:50.507 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:15:50 compute-1 nova_compute[182713]: 2026-01-22 00:15:50.550 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:15:50 compute-1 nova_compute[182713]: 2026-01-22 00:15:50.613 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:15:50 compute-1 nova_compute[182713]: 2026-01-22 00:15:50.613 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.906s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:51 compute-1 nova_compute[182713]: 2026-01-22 00:15:51.726 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:52 compute-1 nova_compute[182713]: 2026-01-22 00:15:52.610 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:52 compute-1 nova_compute[182713]: 2026-01-22 00:15:52.610 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:52 compute-1 nova_compute[182713]: 2026-01-22 00:15:52.611 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:15:52 compute-1 nova_compute[182713]: 2026-01-22 00:15:52.611 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:15:53 compute-1 nova_compute[182713]: 2026-01-22 00:15:53.342 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:53 compute-1 podman[231250]: 2026-01-22 00:15:53.605752857 +0000 UTC m=+0.092641689 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:15:53 compute-1 nova_compute[182713]: 2026-01-22 00:15:53.875 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:15:53 compute-1 nova_compute[182713]: 2026-01-22 00:15:53.876 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:15:53 compute-1 nova_compute[182713]: 2026-01-22 00:15:53.876 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:15:53 compute-1 nova_compute[182713]: 2026-01-22 00:15:53.876 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 074fd360-328c-4903-a368-d3890c4a1075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:55 compute-1 nova_compute[182713]: 2026-01-22 00:15:55.456 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:55 compute-1 nova_compute[182713]: 2026-01-22 00:15:55.457 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:55 compute-1 nova_compute[182713]: 2026-01-22 00:15:55.554 182717 DEBUG nova.compute.manager [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:15:55 compute-1 nova_compute[182713]: 2026-01-22 00:15:55.702 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:55 compute-1 nova_compute[182713]: 2026-01-22 00:15:55.702 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:55 compute-1 nova_compute[182713]: 2026-01-22 00:15:55.713 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:15:55 compute-1 nova_compute[182713]: 2026-01-22 00:15:55.713 182717 INFO nova.compute.claims [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:15:55 compute-1 nova_compute[182713]: 2026-01-22 00:15:55.977 182717 DEBUG nova.compute.provider_tree [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:15:55 compute-1 nova_compute[182713]: 2026-01-22 00:15:55.995 182717 DEBUG nova.scheduler.client.report [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:15:56 compute-1 nova_compute[182713]: 2026-01-22 00:15:56.052 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:56 compute-1 nova_compute[182713]: 2026-01-22 00:15:56.053 182717 DEBUG nova.compute.manager [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:15:56 compute-1 nova_compute[182713]: 2026-01-22 00:15:56.292 182717 DEBUG nova.compute.manager [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:15:56 compute-1 nova_compute[182713]: 2026-01-22 00:15:56.293 182717 DEBUG nova.network.neutron [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:15:56 compute-1 nova_compute[182713]: 2026-01-22 00:15:56.572 182717 INFO nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:15:56 compute-1 podman[231271]: 2026-01-22 00:15:56.63402377 +0000 UTC m=+0.114950654 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 00:15:56 compute-1 nova_compute[182713]: 2026-01-22 00:15:56.729 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:56 compute-1 nova_compute[182713]: 2026-01-22 00:15:56.821 182717 DEBUG nova.compute.manager [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:15:56 compute-1 nova_compute[182713]: 2026-01-22 00:15:56.981 182717 DEBUG nova.policy [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.024 182717 DEBUG nova.compute.manager [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.025 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.025 182717 INFO nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Creating image(s)
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.026 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.026 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.026 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.037 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.131 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.133 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.133 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.150 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.234 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.235 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.292 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.293 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.294 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.353 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.354 182717 DEBUG nova.virt.disk.api [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Checking if we can resize image /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.354 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.408 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.409 182717 DEBUG nova.virt.disk.api [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Cannot resize image /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.409 182717 DEBUG nova.objects.instance [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'migration_context' on Instance uuid 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.463 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.463 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Ensure instance console log exists: /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.464 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.464 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:57 compute-1 nova_compute[182713]: 2026-01-22 00:15:57.464 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:58 compute-1 nova_compute[182713]: 2026-01-22 00:15:58.377 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:01 compute-1 nova_compute[182713]: 2026-01-22 00:16:01.042 182717 DEBUG nova.compute.manager [req-2a672a21-fa25-492d-9938-6a830fedf0fa req-4fd6db0b-8d2d-4e1c-8f05-fc79629e8faa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-changed-4230f4cb-ad57-407e-90c1-b2441b67e135 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:01 compute-1 nova_compute[182713]: 2026-01-22 00:16:01.042 182717 DEBUG nova.compute.manager [req-2a672a21-fa25-492d-9938-6a830fedf0fa req-4fd6db0b-8d2d-4e1c-8f05-fc79629e8faa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Refreshing instance network info cache due to event network-changed-4230f4cb-ad57-407e-90c1-b2441b67e135. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:16:01 compute-1 nova_compute[182713]: 2026-01-22 00:16:01.043 182717 DEBUG oslo_concurrency.lockutils [req-2a672a21-fa25-492d-9938-6a830fedf0fa req-4fd6db0b-8d2d-4e1c-8f05-fc79629e8faa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:16:01 compute-1 nova_compute[182713]: 2026-01-22 00:16:01.059 182717 DEBUG nova.network.neutron [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Successfully created port: 53f3c575-ccc8-4fde-a256-9598ddf6cdaf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:16:01 compute-1 anacron[30859]: Job `cron.monthly' started
Jan 22 00:16:01 compute-1 anacron[30859]: Job `cron.monthly' terminated
Jan 22 00:16:01 compute-1 anacron[30859]: Normal exit (3 jobs run)
Jan 22 00:16:01 compute-1 nova_compute[182713]: 2026-01-22 00:16:01.731 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:03.023 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:03.023 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:03.024 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:03 compute-1 nova_compute[182713]: 2026-01-22 00:16:03.380 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:03 compute-1 nova_compute[182713]: 2026-01-22 00:16:03.935 182717 DEBUG oslo_concurrency.lockutils [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:03 compute-1 nova_compute[182713]: 2026-01-22 00:16:03.935 182717 DEBUG oslo_concurrency.lockutils [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:03 compute-1 nova_compute[182713]: 2026-01-22 00:16:03.936 182717 DEBUG oslo_concurrency.lockutils [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:03 compute-1 nova_compute[182713]: 2026-01-22 00:16:03.936 182717 DEBUG oslo_concurrency.lockutils [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:03 compute-1 nova_compute[182713]: 2026-01-22 00:16:03.936 182717 DEBUG oslo_concurrency.lockutils [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:03 compute-1 nova_compute[182713]: 2026-01-22 00:16:03.953 182717 INFO nova.compute.manager [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Terminating instance
Jan 22 00:16:03 compute-1 nova_compute[182713]: 2026-01-22 00:16:03.964 182717 DEBUG nova.compute.manager [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:16:03 compute-1 nova_compute[182713]: 2026-01-22 00:16:03.978 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:03.981 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:16:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:03.982 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:16:03 compute-1 kernel: tap48d53d0a-d3 (unregistering): left promiscuous mode
Jan 22 00:16:04 compute-1 NetworkManager[54952]: <info>  [1769040964.0044] device (tap48d53d0a-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.019 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:04 compute-1 ovn_controller[94841]: 2026-01-22T00:16:04Z|00515|binding|INFO|Releasing lport 48d53d0a-d386-45be-8c0d-a38d9b22332f from this chassis (sb_readonly=0)
Jan 22 00:16:04 compute-1 ovn_controller[94841]: 2026-01-22T00:16:04Z|00516|binding|INFO|Setting lport 48d53d0a-d386-45be-8c0d-a38d9b22332f down in Southbound
Jan 22 00:16:04 compute-1 ovn_controller[94841]: 2026-01-22T00:16:04Z|00517|binding|INFO|Removing iface tap48d53d0a-d3 ovn-installed in OVS
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.022 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.028 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:84:b0 10.100.0.24'], port_security=['fa:16:3e:82:84:b0 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '6763a0f8-8485-4d4a-8418-5b095f3a20ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbad5d67-48fa-4452-9764-37918af5f722', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4c1742f2-7c74-46cd-9079-742776973d57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac57e7f-dd25-45c3-9d9a-56463ac0e2d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=48d53d0a-d386-45be-8c0d-a38d9b22332f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.030 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 48d53d0a-d386-45be-8c0d-a38d9b22332f in datapath dbad5d67-48fa-4452-9764-37918af5f722 unbound from our chassis
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.034 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dbad5d67-48fa-4452-9764-37918af5f722
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.049 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.053 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc9510f-db8b-4244-904d-1d75d1cfff66]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:04 compute-1 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000082.scope: Deactivated successfully.
Jan 22 00:16:04 compute-1 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000082.scope: Consumed 12.870s CPU time.
Jan 22 00:16:04 compute-1 systemd-machined[153970]: Machine qemu-58-instance-00000082 terminated.
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.082 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f5c7f5a2-f2af-460e-88fb-46d868e39e2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.085 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7a420e06-e8fa-43e0-97cf-c3dba77be38e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.114 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[880a8020-3353-4fb3-8e85-581e4173110b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.128 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[70246db4-a6fc-4882-88c5-054534d006f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdbad5d67-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:10:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549322, 'reachable_time': 36159, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231320, 'error': None, 'target': 'ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.148 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c8b2d945-9467-435b-8ef4-0c687bfaac62]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapdbad5d67-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549337, 'tstamp': 549337}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231321, 'error': None, 'target': 'ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdbad5d67-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549341, 'tstamp': 549341}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231321, 'error': None, 'target': 'ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.150 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdbad5d67-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.152 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.159 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.160 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdbad5d67-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.161 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.162 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdbad5d67-40, col_values=(('external_ids', {'iface-id': 'bc0feb73-1fe9-487e-8603-9846dd682590'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:04.162 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.239 182717 INFO nova.virt.libvirt.driver [-] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Instance destroyed successfully.
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.240 182717 DEBUG nova.objects.instance [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 6763a0f8-8485-4d4a-8418-5b095f3a20ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.258 182717 DEBUG nova.virt.libvirt.vif [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:15:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1485052167',display_name='tempest-TestNetworkBasicOps-server-1485052167',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1485052167',id=130,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+rF74Q+n92wqGfvgmI7dqElp9oBY9n3ay6yBNRhjKVM1q89wQOOSpGhz6I7UWGPqqFjzw4hVpFrSPhKUDzrlCaL5V6NlG7zI66zllUGkpU6xzTEQWK85FJJIBjQeVN4A==',key_name='tempest-TestNetworkBasicOps-287611394',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:15:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-4wpsiaxz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:15:38Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6763a0f8-8485-4d4a-8418-5b095f3a20ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "address": "fa:16:3e:82:84:b0", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d53d0a-d3", "ovs_interfaceid": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.258 182717 DEBUG nova.network.os_vif_util [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "address": "fa:16:3e:82:84:b0", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d53d0a-d3", "ovs_interfaceid": "48d53d0a-d386-45be-8c0d-a38d9b22332f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.260 182717 DEBUG nova.network.os_vif_util [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:84:b0,bridge_name='br-int',has_traffic_filtering=True,id=48d53d0a-d386-45be-8c0d-a38d9b22332f,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d53d0a-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.260 182717 DEBUG os_vif [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:84:b0,bridge_name='br-int',has_traffic_filtering=True,id=48d53d0a-d386-45be-8c0d-a38d9b22332f,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d53d0a-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.264 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.264 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48d53d0a-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.267 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.269 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.273 182717 INFO os_vif [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:84:b0,bridge_name='br-int',has_traffic_filtering=True,id=48d53d0a-d386-45be-8c0d-a38d9b22332f,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d53d0a-d3')
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.274 182717 INFO nova.virt.libvirt.driver [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Deleting instance files /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed_del
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.275 182717 INFO nova.virt.libvirt.driver [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Deletion of /var/lib/nova/instances/6763a0f8-8485-4d4a-8418-5b095f3a20ed_del complete
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.362 182717 INFO nova.compute.manager [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.363 182717 DEBUG oslo.service.loopingcall [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.364 182717 DEBUG nova.compute.manager [-] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:16:04 compute-1 nova_compute[182713]: 2026-01-22 00:16:04.364 182717 DEBUG nova.network.neutron [-] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.691 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.750 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.751 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.752 182717 DEBUG oslo_concurrency.lockutils [req-2a672a21-fa25-492d-9938-6a830fedf0fa req-4fd6db0b-8d2d-4e1c-8f05-fc79629e8faa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.752 182717 DEBUG nova.network.neutron [req-2a672a21-fa25-492d-9938-6a830fedf0fa req-4fd6db0b-8d2d-4e1c-8f05-fc79629e8faa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Refreshing network info cache for port 4230f4cb-ad57-407e-90c1-b2441b67e135 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.754 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.755 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.827 182717 DEBUG nova.compute.manager [req-738817b7-0993-4198-be5e-336c182e8590 req-d67aadde-0bfb-49e1-98f4-9b1b32bfa9f4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Received event network-vif-unplugged-48d53d0a-d386-45be-8c0d-a38d9b22332f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.828 182717 DEBUG oslo_concurrency.lockutils [req-738817b7-0993-4198-be5e-336c182e8590 req-d67aadde-0bfb-49e1-98f4-9b1b32bfa9f4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.829 182717 DEBUG oslo_concurrency.lockutils [req-738817b7-0993-4198-be5e-336c182e8590 req-d67aadde-0bfb-49e1-98f4-9b1b32bfa9f4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.829 182717 DEBUG oslo_concurrency.lockutils [req-738817b7-0993-4198-be5e-336c182e8590 req-d67aadde-0bfb-49e1-98f4-9b1b32bfa9f4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.830 182717 DEBUG nova.compute.manager [req-738817b7-0993-4198-be5e-336c182e8590 req-d67aadde-0bfb-49e1-98f4-9b1b32bfa9f4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] No waiting events found dispatching network-vif-unplugged-48d53d0a-d386-45be-8c0d-a38d9b22332f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:05 compute-1 nova_compute[182713]: 2026-01-22 00:16:05.830 182717 DEBUG nova.compute.manager [req-738817b7-0993-4198-be5e-336c182e8590 req-d67aadde-0bfb-49e1-98f4-9b1b32bfa9f4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Received event network-vif-unplugged-48d53d0a-d386-45be-8c0d-a38d9b22332f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.533 182717 DEBUG nova.network.neutron [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Successfully updated port: 53f3c575-ccc8-4fde-a256-9598ddf6cdaf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.576 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "refresh_cache-5ba5bafe-ee5b-48f6-aa2f-653708f71f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.577 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquired lock "refresh_cache-5ba5bafe-ee5b-48f6-aa2f-653708f71f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.577 182717 DEBUG nova.network.neutron [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.615 182717 DEBUG nova.network.neutron [-] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.640 182717 INFO nova.compute.manager [-] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Took 2.28 seconds to deallocate network for instance.
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.726 182717 DEBUG nova.compute.manager [req-ff160996-d54e-4510-9cad-ca205cd5271b req-30b6a111-c568-422e-97af-4c09ab5e4cda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received event network-changed-53f3c575-ccc8-4fde-a256-9598ddf6cdaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.727 182717 DEBUG nova.compute.manager [req-ff160996-d54e-4510-9cad-ca205cd5271b req-30b6a111-c568-422e-97af-4c09ab5e4cda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Refreshing instance network info cache due to event network-changed-53f3c575-ccc8-4fde-a256-9598ddf6cdaf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.727 182717 DEBUG oslo_concurrency.lockutils [req-ff160996-d54e-4510-9cad-ca205cd5271b req-30b6a111-c568-422e-97af-4c09ab5e4cda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-5ba5bafe-ee5b-48f6-aa2f-653708f71f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.782 182717 DEBUG oslo_concurrency.lockutils [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.783 182717 DEBUG oslo_concurrency.lockutils [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:06 compute-1 nova_compute[182713]: 2026-01-22 00:16:06.985 182717 DEBUG nova.compute.provider_tree [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:16:07 compute-1 nova_compute[182713]: 2026-01-22 00:16:07.007 182717 DEBUG nova.scheduler.client.report [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:16:07 compute-1 nova_compute[182713]: 2026-01-22 00:16:07.110 182717 DEBUG oslo_concurrency.lockutils [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:07 compute-1 nova_compute[182713]: 2026-01-22 00:16:07.146 182717 DEBUG nova.network.neutron [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:16:08 compute-1 nova_compute[182713]: 2026-01-22 00:16:08.383 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:08 compute-1 nova_compute[182713]: 2026-01-22 00:16:08.502 182717 INFO nova.scheduler.client.report [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 6763a0f8-8485-4d4a-8418-5b095f3a20ed
Jan 22 00:16:08 compute-1 podman[231341]: 2026-01-22 00:16:08.590379638 +0000 UTC m=+0.074134450 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:16:08 compute-1 podman[231340]: 2026-01-22 00:16:08.648893236 +0000 UTC m=+0.130956135 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:16:09 compute-1 nova_compute[182713]: 2026-01-22 00:16:09.268 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:09 compute-1 nova_compute[182713]: 2026-01-22 00:16:09.427 182717 DEBUG nova.compute.manager [req-491c35bd-b51a-437b-9dcc-e338dfc1828e req-e6671563-3f77-4486-befc-f54df90a7dd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Received event network-vif-plugged-48d53d0a-d386-45be-8c0d-a38d9b22332f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:09 compute-1 nova_compute[182713]: 2026-01-22 00:16:09.428 182717 DEBUG oslo_concurrency.lockutils [req-491c35bd-b51a-437b-9dcc-e338dfc1828e req-e6671563-3f77-4486-befc-f54df90a7dd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:09 compute-1 nova_compute[182713]: 2026-01-22 00:16:09.428 182717 DEBUG oslo_concurrency.lockutils [req-491c35bd-b51a-437b-9dcc-e338dfc1828e req-e6671563-3f77-4486-befc-f54df90a7dd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:09 compute-1 nova_compute[182713]: 2026-01-22 00:16:09.429 182717 DEBUG oslo_concurrency.lockutils [req-491c35bd-b51a-437b-9dcc-e338dfc1828e req-e6671563-3f77-4486-befc-f54df90a7dd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:09 compute-1 nova_compute[182713]: 2026-01-22 00:16:09.429 182717 DEBUG nova.compute.manager [req-491c35bd-b51a-437b-9dcc-e338dfc1828e req-e6671563-3f77-4486-befc-f54df90a7dd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] No waiting events found dispatching network-vif-plugged-48d53d0a-d386-45be-8c0d-a38d9b22332f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:09 compute-1 nova_compute[182713]: 2026-01-22 00:16:09.430 182717 WARNING nova.compute.manager [req-491c35bd-b51a-437b-9dcc-e338dfc1828e req-e6671563-3f77-4486-befc-f54df90a7dd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Received unexpected event network-vif-plugged-48d53d0a-d386-45be-8c0d-a38d9b22332f for instance with vm_state deleted and task_state None.
Jan 22 00:16:09 compute-1 nova_compute[182713]: 2026-01-22 00:16:09.432 182717 DEBUG nova.compute.manager [req-5c36661a-60d8-4ff1-8d32-60c8321f4d51 req-f5353460-8d8c-4773-ade2-06634a828e7f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Received event network-vif-deleted-48d53d0a-d386-45be-8c0d-a38d9b22332f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:09 compute-1 nova_compute[182713]: 2026-01-22 00:16:09.899 182717 DEBUG oslo_concurrency.lockutils [None req-553aace0-ab74-417f-943e-e6ecee28237b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6763a0f8-8485-4d4a-8418-5b095f3a20ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.405 182717 DEBUG nova.network.neutron [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Updating instance_info_cache with network_info: [{"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.426 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Releasing lock "refresh_cache-5ba5bafe-ee5b-48f6-aa2f-653708f71f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.427 182717 DEBUG nova.compute.manager [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Instance network_info: |[{"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.428 182717 DEBUG oslo_concurrency.lockutils [req-ff160996-d54e-4510-9cad-ca205cd5271b req-30b6a111-c568-422e-97af-4c09ab5e4cda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-5ba5bafe-ee5b-48f6-aa2f-653708f71f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.429 182717 DEBUG nova.network.neutron [req-ff160996-d54e-4510-9cad-ca205cd5271b req-30b6a111-c568-422e-97af-4c09ab5e4cda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Refreshing network info cache for port 53f3c575-ccc8-4fde-a256-9598ddf6cdaf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.435 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Start _get_guest_xml network_info=[{"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.443 182717 WARNING nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.454 182717 DEBUG nova.virt.libvirt.host [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.455 182717 DEBUG nova.virt.libvirt.host [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.621 182717 DEBUG nova.virt.libvirt.host [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.622 182717 DEBUG nova.virt.libvirt.host [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.623 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.623 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.623 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.624 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.624 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.624 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.624 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.624 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.625 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.625 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.625 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.625 182717 DEBUG nova.virt.hardware [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.628 182717 DEBUG nova.virt.libvirt.vif [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:15:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-343289399',display_name='tempest-ServerRescueNegativeTestJSON-server-343289399',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-343289399',id=133,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4c6e66779ffe440d9c3270f0328391fb',ramdisk_id='',reservation_id='r-570w62iy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1986679883',owner_user_name='temp
est-ServerRescueNegativeTestJSON-1986679883-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:56Z,user_data=None,user_id='c26ff016fcfc4e08803feb0e96005a8e',uuid=5ba5bafe-ee5b-48f6-aa2f-653708f71f55,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.628 182717 DEBUG nova.network.os_vif_util [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converting VIF {"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.629 182717 DEBUG nova.network.os_vif_util [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:0e:3f,bridge_name='br-int',has_traffic_filtering=True,id=53f3c575-ccc8-4fde-a256-9598ddf6cdaf,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53f3c575-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.630 182717 DEBUG nova.objects.instance [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.632 182717 DEBUG nova.network.neutron [req-2a672a21-fa25-492d-9938-6a830fedf0fa req-4fd6db0b-8d2d-4e1c-8f05-fc79629e8faa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updated VIF entry in instance network info cache for port 4230f4cb-ad57-407e-90c1-b2441b67e135. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.632 182717 DEBUG nova.network.neutron [req-2a672a21-fa25-492d-9938-6a830fedf0fa req-4fd6db0b-8d2d-4e1c-8f05-fc79629e8faa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.669 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:16:11 compute-1 nova_compute[182713]:   <uuid>5ba5bafe-ee5b-48f6-aa2f-653708f71f55</uuid>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   <name>instance-00000085</name>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerRescueNegativeTestJSON-server-343289399</nova:name>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:16:11</nova:creationTime>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:16:11 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:16:11 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:16:11 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:16:11 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:16:11 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:16:11 compute-1 nova_compute[182713]:         <nova:user uuid="c26ff016fcfc4e08803feb0e96005a8e">tempest-ServerRescueNegativeTestJSON-1986679883-project-member</nova:user>
Jan 22 00:16:11 compute-1 nova_compute[182713]:         <nova:project uuid="4c6e66779ffe440d9c3270f0328391fb">tempest-ServerRescueNegativeTestJSON-1986679883</nova:project>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:16:11 compute-1 nova_compute[182713]:         <nova:port uuid="53f3c575-ccc8-4fde-a256-9598ddf6cdaf">
Jan 22 00:16:11 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <system>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <entry name="serial">5ba5bafe-ee5b-48f6-aa2f-653708f71f55</entry>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <entry name="uuid">5ba5bafe-ee5b-48f6-aa2f-653708f71f55</entry>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     </system>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   <os>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   </os>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   <features>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   </features>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.config"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:9f:0e:3f"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <target dev="tap53f3c575-cc"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/console.log" append="off"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <video>
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     </video>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:16:11 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:16:11 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:16:11 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:16:11 compute-1 nova_compute[182713]: </domain>
Jan 22 00:16:11 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.670 182717 DEBUG nova.compute.manager [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Preparing to wait for external event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.671 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.671 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.672 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.673 182717 DEBUG nova.virt.libvirt.vif [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:15:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-343289399',display_name='tempest-ServerRescueNegativeTestJSON-server-343289399',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-343289399',id=133,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4c6e66779ffe440d9c3270f0328391fb',ramdisk_id='',reservation_id='r-570w62iy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1986679883',owner_user_
name='tempest-ServerRescueNegativeTestJSON-1986679883-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:56Z,user_data=None,user_id='c26ff016fcfc4e08803feb0e96005a8e',uuid=5ba5bafe-ee5b-48f6-aa2f-653708f71f55,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.673 182717 DEBUG nova.network.os_vif_util [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converting VIF {"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.674 182717 DEBUG nova.network.os_vif_util [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:0e:3f,bridge_name='br-int',has_traffic_filtering=True,id=53f3c575-ccc8-4fde-a256-9598ddf6cdaf,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53f3c575-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.675 182717 DEBUG os_vif [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:0e:3f,bridge_name='br-int',has_traffic_filtering=True,id=53f3c575-ccc8-4fde-a256-9598ddf6cdaf,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53f3c575-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.676 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.677 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.677 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.678 182717 DEBUG oslo_concurrency.lockutils [req-2a672a21-fa25-492d-9938-6a830fedf0fa req-4fd6db0b-8d2d-4e1c-8f05-fc79629e8faa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.681 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.681 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap53f3c575-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.682 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap53f3c575-cc, col_values=(('external_ids', {'iface-id': '53f3c575-ccc8-4fde-a256-9598ddf6cdaf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:0e:3f', 'vm-uuid': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.684 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:11 compute-1 NetworkManager[54952]: <info>  [1769040971.6851] manager: (tap53f3c575-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.687 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.692 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.693 182717 INFO os_vif [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:0e:3f,bridge_name='br-int',has_traffic_filtering=True,id=53f3c575-ccc8-4fde-a256-9598ddf6cdaf,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53f3c575-cc')
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.812 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.812 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.812 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] No VIF found with MAC fa:16:3e:9f:0e:3f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:16:11 compute-1 nova_compute[182713]: 2026-01-22 00:16:11.813 182717 INFO nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Using config drive
Jan 22 00:16:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:12.985 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:13 compute-1 nova_compute[182713]: 2026-01-22 00:16:13.385 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:13 compute-1 podman[231394]: 2026-01-22 00:16:13.586820704 +0000 UTC m=+0.063098420 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:16:13 compute-1 podman[231393]: 2026-01-22 00:16:13.608456129 +0000 UTC m=+0.090901044 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.551 182717 INFO nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Creating config drive at /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.config
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.560 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3mbqms8i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.698 182717 DEBUG oslo_concurrency.processutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3mbqms8i" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:14 compute-1 kernel: tap53f3c575-cc: entered promiscuous mode
Jan 22 00:16:14 compute-1 NetworkManager[54952]: <info>  [1769040974.7800] manager: (tap53f3c575-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Jan 22 00:16:14 compute-1 ovn_controller[94841]: 2026-01-22T00:16:14Z|00518|binding|INFO|Claiming lport 53f3c575-ccc8-4fde-a256-9598ddf6cdaf for this chassis.
Jan 22 00:16:14 compute-1 ovn_controller[94841]: 2026-01-22T00:16:14Z|00519|binding|INFO|53f3c575-ccc8-4fde-a256-9598ddf6cdaf: Claiming fa:16:3e:9f:0e:3f 10.100.0.12
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.782 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.793 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:0e:3f 10.100.0.12'], port_security=['fa:16:3e:9f:0e:3f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55594f65-206f-4b2a-a4ed-c049861ef480', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4c6e66779ffe440d9c3270f0328391fb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99a05b18-dc2e-46bb-b7ee-4bfce96057f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f515f91a-3ddc-47bf-8aaf-753c19e78de1, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=53f3c575-ccc8-4fde-a256-9598ddf6cdaf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.795 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 53f3c575-ccc8-4fde-a256-9598ddf6cdaf in datapath 55594f65-206f-4b2a-a4ed-c049861ef480 bound to our chassis
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.798 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55594f65-206f-4b2a-a4ed-c049861ef480
Jan 22 00:16:14 compute-1 ovn_controller[94841]: 2026-01-22T00:16:14Z|00520|binding|INFO|Setting lport 53f3c575-ccc8-4fde-a256-9598ddf6cdaf ovn-installed in OVS
Jan 22 00:16:14 compute-1 ovn_controller[94841]: 2026-01-22T00:16:14Z|00521|binding|INFO|Setting lport 53f3c575-ccc8-4fde-a256-9598ddf6cdaf up in Southbound
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.807 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.812 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.816 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[07ea462c-6c5e-44e1-9ff2-0b603e0d7a89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.817 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap55594f65-21 in ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.818 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap55594f65-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.818 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec4de57-896e-4010-b5d2-40876a50c3bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.820 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c6056680-c2f7-40f1-9501-b7b2976345de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 systemd-udevd[231454]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.831 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[34d9d953-d59c-4e13-806a-82baf340c927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 systemd-machined[153970]: New machine qemu-59-instance-00000085.
Jan 22 00:16:14 compute-1 NetworkManager[54952]: <info>  [1769040974.8463] device (tap53f3c575-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:16:14 compute-1 NetworkManager[54952]: <info>  [1769040974.8480] device (tap53f3c575-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:16:14 compute-1 systemd[1]: Started Virtual Machine qemu-59-instance-00000085.
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.856 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ffcb5329-90f9-453c-a09b-17c63b3eb380]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.884 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f422377e-e2f3-4a59-acaf-1673ba716a87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.890 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[73356255-7040-4f7b-8964-ac609fdf0863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 systemd-udevd[231457]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:16:14 compute-1 NetworkManager[54952]: <info>  [1769040974.8923] manager: (tap55594f65-20): new Veth device (/org/freedesktop/NetworkManager/Devices/239)
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.914 182717 DEBUG oslo_concurrency.lockutils [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "interface-074fd360-328c-4903-a368-d3890c4a1075-4230f4cb-ad57-407e-90c1-b2441b67e135" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.915 182717 DEBUG oslo_concurrency.lockutils [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "interface-074fd360-328c-4903-a368-d3890c4a1075-4230f4cb-ad57-407e-90c1-b2441b67e135" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.923 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f2826b41-585a-4dfb-8113-cf0837342eb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.927 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[2ab1573c-c3d2-41eb-9497-07472641b68b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.933 182717 DEBUG nova.objects.instance [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'flavor' on Instance uuid 074fd360-328c-4903-a368-d3890c4a1075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:14 compute-1 NetworkManager[54952]: <info>  [1769040974.9511] device (tap55594f65-20): carrier: link connected
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.958 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[15643893-1127-4d70-8e11-5b2595ff829c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.973 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6a4a969b-aec6-4595-8040-99b6d8436042]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55594f65-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:fe:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554760, 'reachable_time': 19563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231485, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.982 182717 DEBUG nova.virt.libvirt.vif [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:14:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1083808564',display_name='tempest-TestNetworkBasicOps-server-1083808564',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1083808564',id=125,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM6+EaxgHLeoG7CHqeSWXpXKv4K84dI1QQQjZdcrX0T7kqXBTlhE22YQjJTUFxToUxfZEI27WRcAtCoqb6CCdLa4/l//5Lw6nNA8ZjjrkKnh18RWLjWeeCbEVEj1FdVuCQ==',key_name='tempest-TestNetworkBasicOps-2002692629',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:14:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-5bbetpco',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:14:45Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=074fd360-328c-4903-a368-d3890c4a1075,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.982 182717 DEBUG nova.network.os_vif_util [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.982 182717 DEBUG nova.network.os_vif_util [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.985 182717 DEBUG nova.virt.libvirt.guest [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ce:53:e7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4230f4cb-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.987 182717 DEBUG nova.virt.libvirt.guest [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ce:53:e7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4230f4cb-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.989 182717 DEBUG nova.virt.libvirt.driver [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Attempting to detach device tap4230f4cb-ad from instance 074fd360-328c-4903-a368-d3890c4a1075 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.990 182717 DEBUG nova.virt.libvirt.guest [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] detach device xml: <interface type="ethernet">
Jan 22 00:16:14 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:ce:53:e7"/>
Jan 22 00:16:14 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 22 00:16:14 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:16:14 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 22 00:16:14 compute-1 nova_compute[182713]:   <target dev="tap4230f4cb-ad"/>
Jan 22 00:16:14 compute-1 nova_compute[182713]: </interface>
Jan 22 00:16:14 compute-1 nova_compute[182713]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 00:16:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:14.994 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a268efd4-5026-44d1-8649-2c6cfe1ee60a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb9:fea1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554760, 'tstamp': 554760}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231486, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:14 compute-1 nova_compute[182713]: 2026-01-22 00:16:14.999 182717 DEBUG nova.virt.libvirt.guest [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ce:53:e7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4230f4cb-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.002 182717 DEBUG nova.virt.libvirt.guest [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ce:53:e7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4230f4cb-ad"/></interface>not found in domain: <domain type='kvm' id='57'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <name>instance-0000007d</name>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <uuid>074fd360-328c-4903-a368-d3890c4a1075</uuid>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:name>tempest-TestNetworkBasicOps-server-1083808564</nova:name>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-22 00:15:20</nova:creationTime>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:owner>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </nova:owner>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:ports>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:port uuid="a06a78d5-548e-4a84-b918-197a54a79f44">
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:port uuid="4230f4cb-ad57-407e-90c1-b2441b67e135">
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </nova:ports>
Jan 22 00:16:15 compute-1 nova_compute[182713]: </nova:instance>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <resource>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </resource>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <system>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='serial'>074fd360-328c-4903-a368-d3890c4a1075</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='uuid'>074fd360-328c-4903-a368-d3890c4a1075</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </system>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <os>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </os>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <features>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </features>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk' index='2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:         <backingStore/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       </backingStore>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk.config' index='1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <backingStore/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <readonly/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:3e:83:df'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target dev='tapa06a78d5-54'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:ce:53:e7'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target dev='tap4230f4cb-ad'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='net1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/console.log' append='off'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       </target>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/console.log' append='off'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </console>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </graphics>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <video>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </video>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </watchdog>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c752,c871</label>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c752,c871</imagelabel>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:16:15 compute-1 nova_compute[182713]: </domain>
Jan 22 00:16:15 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.002 182717 INFO nova.virt.libvirt.driver [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully detached device tap4230f4cb-ad from instance 074fd360-328c-4903-a368-d3890c4a1075 from the persistent domain config.
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.002 182717 DEBUG nova.virt.libvirt.driver [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] (1/8): Attempting to detach device tap4230f4cb-ad with device alias net1 from instance 074fd360-328c-4903-a368-d3890c4a1075 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.003 182717 DEBUG nova.virt.libvirt.guest [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] detach device xml: <interface type="ethernet">
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <mac address="fa:16:3e:ce:53:e7"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <model type="virtio"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <mtu size="1442"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <target dev="tap4230f4cb-ad"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]: </interface>
Jan 22 00:16:15 compute-1 nova_compute[182713]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.016 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d5f4b216-0dd8-4a16-8bd3-8a7051c72183]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55594f65-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:fe:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554760, 'reachable_time': 19563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231487, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.053 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[cf2c0928-8dbe-4d0a-bd16-bc07f1295bb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:15 compute-1 kernel: tap4230f4cb-ad (unregistering): left promiscuous mode
Jan 22 00:16:15 compute-1 NetworkManager[54952]: <info>  [1769040975.1243] device (tap4230f4cb-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:16:15 compute-1 ovn_controller[94841]: 2026-01-22T00:16:15Z|00522|binding|INFO|Releasing lport 4230f4cb-ad57-407e-90c1-b2441b67e135 from this chassis (sb_readonly=0)
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.133 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:15 compute-1 ovn_controller[94841]: 2026-01-22T00:16:15Z|00523|binding|INFO|Setting lport 4230f4cb-ad57-407e-90c1-b2441b67e135 down in Southbound
Jan 22 00:16:15 compute-1 ovn_controller[94841]: 2026-01-22T00:16:15Z|00524|binding|INFO|Removing iface tap4230f4cb-ad ovn-installed in OVS
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.136 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.146 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.149 182717 DEBUG nova.virt.libvirt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Received event <DeviceRemovedEvent: 1769040975.148841, 074fd360-328c-4903-a368-d3890c4a1075 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.151 182717 DEBUG nova.virt.libvirt.driver [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Start waiting for the detach event from libvirt for device tap4230f4cb-ad with device alias net1 for instance 074fd360-328c-4903-a368-d3890c4a1075 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.152 182717 DEBUG nova.virt.libvirt.guest [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ce:53:e7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4230f4cb-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.157 182717 DEBUG nova.virt.libvirt.guest [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ce:53:e7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4230f4cb-ad"/></interface>not found in domain: <domain type='kvm' id='57'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <name>instance-0000007d</name>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <uuid>074fd360-328c-4903-a368-d3890c4a1075</uuid>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:name>tempest-TestNetworkBasicOps-server-1083808564</nova:name>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-22 00:15:20</nova:creationTime>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:owner>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </nova:owner>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:ports>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:port uuid="a06a78d5-548e-4a84-b918-197a54a79f44">
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:port uuid="4230f4cb-ad57-407e-90c1-b2441b67e135">
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </nova:ports>
Jan 22 00:16:15 compute-1 nova_compute[182713]: </nova:instance>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <resource>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </resource>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <system>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='serial'>074fd360-328c-4903-a368-d3890c4a1075</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='uuid'>074fd360-328c-4903-a368-d3890c4a1075</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </system>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <os>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </os>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <features>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </features>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk' index='2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:         <backingStore/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       </backingStore>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk.config' index='1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <backingStore/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <readonly/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:3e:83:df'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target dev='tapa06a78d5-54'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/console.log' append='off'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       </target>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/console.log' append='off'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </console>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 22 00:16:15 compute-1 kernel: tap55594f65-20: entered promiscuous mode
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.160 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:53:e7 10.100.0.29', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '074fd360-328c-4903-a368-d3890c4a1075', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbad5d67-48fa-4452-9764-37918af5f722', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac57e7f-dd25-45c3-9d9a-56463ac0e2d3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=4230f4cb-ad57-407e-90c1-b2441b67e135) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </graphics>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <video>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </video>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </watchdog>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c752,c871</label>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c752,c871</imagelabel>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:16:15 compute-1 nova_compute[182713]: </domain>
Jan 22 00:16:15 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.157 182717 INFO nova.virt.libvirt.driver [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully detached device tap4230f4cb-ad from instance 074fd360-328c-4903-a368-d3890c4a1075 from the live domain config.
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.158 182717 DEBUG nova.virt.libvirt.vif [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:14:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1083808564',display_name='tempest-TestNetworkBasicOps-server-1083808564',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1083808564',id=125,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM6+EaxgHLeoG7CHqeSWXpXKv4K84dI1QQQjZdcrX0T7kqXBTlhE22YQjJTUFxToUxfZEI27WRcAtCoqb6CCdLa4/l//5Lw6nNA8ZjjrkKnh18RWLjWeeCbEVEj1FdVuCQ==',key_name='tempest-TestNetworkBasicOps-2002692629',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:14:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-5bbetpco',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:14:45Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=074fd360-328c-4903-a368-d3890c4a1075,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.159 182717 DEBUG nova.network.os_vif_util [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.160 182717 DEBUG nova.network.os_vif_util [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.161 182717 DEBUG os_vif [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.156 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[300a622c-7698-4725-95c7-0e5a4330f6da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.164 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.165 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4230f4cb-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.165 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55594f65-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.165 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.166 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55594f65-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.167 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:15 compute-1 NetworkManager[54952]: <info>  [1769040975.1713] manager: (tap55594f65-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.172 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.176 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55594f65-20, col_values=(('external_ids', {'iface-id': 'c516d686-0754-486d-a980-7442f4c88088'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:15 compute-1 ovn_controller[94841]: 2026-01-22T00:16:15Z|00525|binding|INFO|Releasing lport c516d686-0754-486d-a980-7442f4c88088 from this chassis (sb_readonly=0)
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.183 182717 INFO os_vif [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad')
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.184 182717 DEBUG nova.virt.libvirt.guest [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:name>tempest-TestNetworkBasicOps-server-1083808564</nova:name>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-22 00:16:15</nova:creationTime>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:owner>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </nova:owner>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   <nova:ports>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     <nova:port uuid="a06a78d5-548e-4a84-b918-197a54a79f44">
Jan 22 00:16:15 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:16:15 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:16:15 compute-1 nova_compute[182713]:   </nova:ports>
Jan 22 00:16:15 compute-1 nova_compute[182713]: </nova:instance>
Jan 22 00:16:15 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.202 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55594f65-206f-4b2a-a4ed-c049861ef480.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55594f65-206f-4b2a-a4ed-c049861ef480.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.203 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.203 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f86eca70-5547-42fa-83ba-3289f3ea34ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.204 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-55594f65-206f-4b2a-a4ed-c049861ef480
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/55594f65-206f-4b2a-a4ed-c049861ef480.pid.haproxy
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 55594f65-206f-4b2a-a4ed-c049861ef480
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.205 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'env', 'PROCESS_TAG=haproxy-55594f65-206f-4b2a-a4ed-c049861ef480', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/55594f65-206f-4b2a-a4ed-c049861ef480.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.490 182717 DEBUG nova.compute.manager [req-edd2aeea-a8cc-450a-84db-c8cc935c9f3c req-cbda7121-0bcb-4d3f-8c60-0a5d1c029af0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.491 182717 DEBUG oslo_concurrency.lockutils [req-edd2aeea-a8cc-450a-84db-c8cc935c9f3c req-cbda7121-0bcb-4d3f-8c60-0a5d1c029af0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.491 182717 DEBUG oslo_concurrency.lockutils [req-edd2aeea-a8cc-450a-84db-c8cc935c9f3c req-cbda7121-0bcb-4d3f-8c60-0a5d1c029af0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.491 182717 DEBUG oslo_concurrency.lockutils [req-edd2aeea-a8cc-450a-84db-c8cc935c9f3c req-cbda7121-0bcb-4d3f-8c60-0a5d1c029af0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.491 182717 DEBUG nova.compute.manager [req-edd2aeea-a8cc-450a-84db-c8cc935c9f3c req-cbda7121-0bcb-4d3f-8c60-0a5d1c029af0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Processing event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:16:15 compute-1 podman[231521]: 2026-01-22 00:16:15.596550583 +0000 UTC m=+0.056587170 container create 4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:16:15 compute-1 systemd[1]: Started libpod-conmon-4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1.scope.
Jan 22 00:16:15 compute-1 podman[231521]: 2026-01-22 00:16:15.565432217 +0000 UTC m=+0.025468824 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:16:15 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:16:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/448da1fae413fd9431a82ee057dc5d8a5f546f4518c8cd6a6c9da0fe1095df74/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:16:15 compute-1 podman[231521]: 2026-01-22 00:16:15.687989334 +0000 UTC m=+0.148025941 container init 4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:16:15 compute-1 podman[231521]: 2026-01-22 00:16:15.69273875 +0000 UTC m=+0.152775327 container start 4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:16:15 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[231537]: [NOTICE]   (231541) : New worker (231543) forked
Jan 22 00:16:15 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[231537]: [NOTICE]   (231541) : Loading success.
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.754 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 4230f4cb-ad57-407e-90c1-b2441b67e135 in datapath dbad5d67-48fa-4452-9764-37918af5f722 unbound from our chassis
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.757 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dbad5d67-48fa-4452-9764-37918af5f722, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.757 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e981f107-9131-44c0-a8e3-e4d22bd7b6e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.758 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722 namespace which is not needed anymore
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.766 182717 DEBUG nova.compute.manager [req-c5919169-5cde-43df-b21e-2782e3af1b82 req-444cc889-8cff-44f9-ae7a-5d3e98db1630 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-vif-unplugged-4230f4cb-ad57-407e-90c1-b2441b67e135 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.766 182717 DEBUG oslo_concurrency.lockutils [req-c5919169-5cde-43df-b21e-2782e3af1b82 req-444cc889-8cff-44f9-ae7a-5d3e98db1630 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.766 182717 DEBUG oslo_concurrency.lockutils [req-c5919169-5cde-43df-b21e-2782e3af1b82 req-444cc889-8cff-44f9-ae7a-5d3e98db1630 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.767 182717 DEBUG oslo_concurrency.lockutils [req-c5919169-5cde-43df-b21e-2782e3af1b82 req-444cc889-8cff-44f9-ae7a-5d3e98db1630 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.767 182717 DEBUG nova.compute.manager [req-c5919169-5cde-43df-b21e-2782e3af1b82 req-444cc889-8cff-44f9-ae7a-5d3e98db1630 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] No waiting events found dispatching network-vif-unplugged-4230f4cb-ad57-407e-90c1-b2441b67e135 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:15 compute-1 nova_compute[182713]: 2026-01-22 00:16:15.767 182717 WARNING nova.compute.manager [req-c5919169-5cde-43df-b21e-2782e3af1b82 req-444cc889-8cff-44f9-ae7a-5d3e98db1630 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received unexpected event network-vif-unplugged-4230f4cb-ad57-407e-90c1-b2441b67e135 for instance with vm_state active and task_state None.
Jan 22 00:16:15 compute-1 neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722[231014]: [NOTICE]   (231018) : haproxy version is 2.8.14-c23fe91
Jan 22 00:16:15 compute-1 neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722[231014]: [NOTICE]   (231018) : path to executable is /usr/sbin/haproxy
Jan 22 00:16:15 compute-1 neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722[231014]: [WARNING]  (231018) : Exiting Master process...
Jan 22 00:16:15 compute-1 neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722[231014]: [ALERT]    (231018) : Current worker (231020) exited with code 143 (Terminated)
Jan 22 00:16:15 compute-1 neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722[231014]: [WARNING]  (231018) : All workers exited. Exiting... (0)
Jan 22 00:16:15 compute-1 systemd[1]: libpod-b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e.scope: Deactivated successfully.
Jan 22 00:16:15 compute-1 podman[231567]: 2026-01-22 00:16:15.877169028 +0000 UTC m=+0.041844997 container died b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:16:15 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e-userdata-shm.mount: Deactivated successfully.
Jan 22 00:16:15 compute-1 systemd[1]: var-lib-containers-storage-overlay-a0926e005cb225d188add7f7a75b0d7a78697ff2107a2eae8b5dccf1ea26fc81-merged.mount: Deactivated successfully.
Jan 22 00:16:15 compute-1 podman[231567]: 2026-01-22 00:16:15.926095763 +0000 UTC m=+0.090771732 container cleanup b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:16:15 compute-1 systemd[1]: libpod-conmon-b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e.scope: Deactivated successfully.
Jan 22 00:16:15 compute-1 podman[231599]: 2026-01-22 00:16:15.98591318 +0000 UTC m=+0.039984140 container remove b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.990 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8d0a8518-da1f-4d02-91b5-0f69eec6e9ca]: (4, ('Thu Jan 22 12:16:15 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722 (b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e)\nb8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e\nThu Jan 22 12:16:15 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722 (b8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e)\nb8692c2a9194183a093291b7e9f63d355c0575c53cfb159d02b5b45b8a55e82e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.991 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e7eeac-1d7d-4b1a-bf5a-445745d4000d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:15.992 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdbad5d67-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.033 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:16 compute-1 kernel: tapdbad5d67-40: left promiscuous mode
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.051 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:16.053 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[977c9959-31e1-4b6f-8591-c7e1e836eaed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:16.065 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a650e982-dd0a-4e80-8119-f05f89f6f699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:16.065 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[60931bd7-7e3e-48ba-8d1e-00bf040e837f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:16.084 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[61953d87-289d-4fe0-8b9e-884af8f99fcf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549314, 'reachable_time': 20034, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231614, 'error': None, 'target': 'ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:16 compute-1 systemd[1]: run-netns-ovnmeta\x2ddbad5d67\x2d48fa\x2d4452\x2d9764\x2d37918af5f722.mount: Deactivated successfully.
Jan 22 00:16:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:16.090 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dbad5d67-48fa-4452-9764-37918af5f722 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:16:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:16.090 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[8eec9f49-53a6-4838-a2c2-296fe9123bec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.811 182717 DEBUG nova.compute.manager [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.812 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040976.810939, 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.813 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] VM Started (Lifecycle Event)
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.815 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.818 182717 INFO nova.virt.libvirt.driver [-] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Instance spawned successfully.
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.819 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.842 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.843 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.843 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.844 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.844 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.844 182717 DEBUG nova.virt.libvirt.driver [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.849 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.851 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.912 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.912 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040976.8120944, 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.913 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] VM Paused (Lifecycle Event)
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.942 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.947 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769040976.8144743, 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.947 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] VM Resumed (Lifecycle Event)
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.962 182717 DEBUG nova.network.neutron [req-ff160996-d54e-4510-9cad-ca205cd5271b req-30b6a111-c568-422e-97af-4c09ab5e4cda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Updated VIF entry in instance network info cache for port 53f3c575-ccc8-4fde-a256-9598ddf6cdaf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.962 182717 DEBUG nova.network.neutron [req-ff160996-d54e-4510-9cad-ca205cd5271b req-30b6a111-c568-422e-97af-4c09ab5e4cda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Updating instance_info_cache with network_info: [{"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.988 182717 INFO nova.compute.manager [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Took 19.96 seconds to spawn the instance on the hypervisor.
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.989 182717 DEBUG nova.compute.manager [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.991 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:16 compute-1 nova_compute[182713]: 2026-01-22 00:16:16.996 182717 DEBUG oslo_concurrency.lockutils [req-ff160996-d54e-4510-9cad-ca205cd5271b req-30b6a111-c568-422e-97af-4c09ab5e4cda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-5ba5bafe-ee5b-48f6-aa2f-653708f71f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.004 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.045 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.150 182717 INFO nova.compute.manager [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Took 21.52 seconds to build instance.
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.181 182717 DEBUG oslo_concurrency.lockutils [None req-feac5578-dfd0-4b4e-acca-6e5eb0b498ee c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.504 182717 DEBUG oslo_concurrency.lockutils [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.504 182717 DEBUG oslo_concurrency.lockutils [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.505 182717 DEBUG nova.network.neutron [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.781 182717 DEBUG nova.compute.manager [req-97fdd3e1-a664-4a59-ba79-d733c81dcaf2 req-e585c438-b188-41ab-a852-e4ee8a373239 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.781 182717 DEBUG oslo_concurrency.lockutils [req-97fdd3e1-a664-4a59-ba79-d733c81dcaf2 req-e585c438-b188-41ab-a852-e4ee8a373239 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.783 182717 DEBUG oslo_concurrency.lockutils [req-97fdd3e1-a664-4a59-ba79-d733c81dcaf2 req-e585c438-b188-41ab-a852-e4ee8a373239 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.784 182717 DEBUG oslo_concurrency.lockutils [req-97fdd3e1-a664-4a59-ba79-d733c81dcaf2 req-e585c438-b188-41ab-a852-e4ee8a373239 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.784 182717 DEBUG nova.compute.manager [req-97fdd3e1-a664-4a59-ba79-d733c81dcaf2 req-e585c438-b188-41ab-a852-e4ee8a373239 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] No waiting events found dispatching network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.784 182717 WARNING nova.compute.manager [req-97fdd3e1-a664-4a59-ba79-d733c81dcaf2 req-e585c438-b188-41ab-a852-e4ee8a373239 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received unexpected event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf for instance with vm_state active and task_state None.
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.806 182717 DEBUG nova.compute.manager [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-vif-deleted-4230f4cb-ad57-407e-90c1-b2441b67e135 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.806 182717 INFO nova.compute.manager [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Neutron deleted interface 4230f4cb-ad57-407e-90c1-b2441b67e135; detaching it from the instance and deleting it from the info cache
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.807 182717 DEBUG nova.network.neutron [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.860 182717 DEBUG nova.objects.instance [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lazy-loading 'system_metadata' on Instance uuid 074fd360-328c-4903-a368-d3890c4a1075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.916 182717 DEBUG nova.objects.instance [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lazy-loading 'flavor' on Instance uuid 074fd360-328c-4903-a368-d3890c4a1075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.952 182717 DEBUG nova.virt.libvirt.vif [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:14:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1083808564',display_name='tempest-TestNetworkBasicOps-server-1083808564',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1083808564',id=125,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM6+EaxgHLeoG7CHqeSWXpXKv4K84dI1QQQjZdcrX0T7kqXBTlhE22YQjJTUFxToUxfZEI27WRcAtCoqb6CCdLa4/l//5Lw6nNA8ZjjrkKnh18RWLjWeeCbEVEj1FdVuCQ==',key_name='tempest-TestNetworkBasicOps-2002692629',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:14:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-5bbetpco',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:14:45Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=074fd360-328c-4903-a368-d3890c4a1075,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.953 182717 DEBUG nova.network.os_vif_util [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converting VIF {"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.954 182717 DEBUG nova.network.os_vif_util [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.958 182717 DEBUG nova.virt.libvirt.guest [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ce:53:e7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4230f4cb-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.963 182717 DEBUG nova.virt.libvirt.guest [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ce:53:e7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4230f4cb-ad"/></interface>not found in domain: <domain type='kvm' id='57'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <name>instance-0000007d</name>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <uuid>074fd360-328c-4903-a368-d3890c4a1075</uuid>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:name>tempest-TestNetworkBasicOps-server-1083808564</nova:name>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-22 00:16:15</nova:creationTime>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:owner>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </nova:owner>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:ports>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:port uuid="a06a78d5-548e-4a84-b918-197a54a79f44">
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </nova:ports>
Jan 22 00:16:17 compute-1 nova_compute[182713]: </nova:instance>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <resource>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </resource>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <system>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='serial'>074fd360-328c-4903-a368-d3890c4a1075</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='uuid'>074fd360-328c-4903-a368-d3890c4a1075</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </system>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <os>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </os>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <features>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </features>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk' index='2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:         <backingStore/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       </backingStore>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk.config' index='1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <backingStore/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <readonly/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:3e:83:df'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target dev='tapa06a78d5-54'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/console.log' append='off'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       </target>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/console.log' append='off'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </console>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </graphics>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <video>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </video>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </watchdog>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c752,c871</label>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c752,c871</imagelabel>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:16:17 compute-1 nova_compute[182713]: </domain>
Jan 22 00:16:17 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.968 182717 DEBUG nova.virt.libvirt.guest [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ce:53:e7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4230f4cb-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.981 182717 DEBUG nova.virt.libvirt.guest [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ce:53:e7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4230f4cb-ad"/></interface>not found in domain: <domain type='kvm' id='57'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <name>instance-0000007d</name>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <uuid>074fd360-328c-4903-a368-d3890c4a1075</uuid>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:name>tempest-TestNetworkBasicOps-server-1083808564</nova:name>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-22 00:16:15</nova:creationTime>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:owner>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </nova:owner>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:ports>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:port uuid="a06a78d5-548e-4a84-b918-197a54a79f44">
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </nova:ports>
Jan 22 00:16:17 compute-1 nova_compute[182713]: </nova:instance>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <memory unit='KiB'>131072</memory>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <vcpu placement='static'>1</vcpu>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <resource>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <partition>/machine</partition>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </resource>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <sysinfo type='smbios'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <system>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='manufacturer'>RDO</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='serial'>074fd360-328c-4903-a368-d3890c4a1075</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='uuid'>074fd360-328c-4903-a368-d3890c4a1075</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <entry name='family'>Virtual Machine</entry>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </system>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <os>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <boot dev='hd'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <smbios mode='sysinfo'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </os>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <features>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <vmcoreinfo state='on'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </features>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <model fallback='forbid'>Nehalem</model>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <feature policy='require' name='x2apic'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <feature policy='require' name='hypervisor'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <feature policy='require' name='vme'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <clock offset='utc'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <timer name='hpet' present='no'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <on_poweroff>destroy</on_poweroff>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <on_reboot>restart</on_reboot>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <on_crash>destroy</on_crash>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <disk type='file' device='disk'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk' index='2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <backingStore type='file' index='3'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:         <format type='raw'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:         <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:         <backingStore/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       </backingStore>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target dev='vda' bus='virtio'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='virtio-disk0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <disk type='file' device='cdrom'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <driver name='qemu' type='raw' cache='none'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <source file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/disk.config' index='1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <backingStore/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target dev='sda' bus='sata'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <readonly/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='sata0-0-0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pcie.0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='1' port='0x10'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='2' port='0x11'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='3' port='0x12'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.3'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='4' port='0x13'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.4'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='5' port='0x14'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.5'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='6' port='0x15'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.6'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='7' port='0x16'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.7'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='8' port='0x17'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.8'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='9' port='0x18'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.9'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='10' port='0x19'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.10'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='11' port='0x1a'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.11'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='12' port='0x1b'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.12'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='13' port='0x1c'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.13'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='14' port='0x1d'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.14'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='15' port='0x1e'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.15'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='16' port='0x1f'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.16'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='17' port='0x20'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.17'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='18' port='0x21'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.18'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='19' port='0x22'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.19'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='20' port='0x23'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.20'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='21' port='0x24'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.21'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='22' port='0x25'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.22'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='23' port='0x26'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.23'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='24' port='0x27'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.24'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-root-port'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target chassis='25' port='0x28'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.25'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model name='pcie-pci-bridge'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='pci.26'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='usb'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <controller type='sata' index='0'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='ide'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </controller>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <interface type='ethernet'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <mac address='fa:16:3e:3e:83:df'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target dev='tapa06a78d5-54'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model type='virtio'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <mtu size='1442'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='net0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <serial type='pty'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/console.log' append='off'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target type='isa-serial' port='0'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:         <model name='isa-serial'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       </target>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <source path='/dev/pts/0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <log file='/var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075/console.log' append='off'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <target type='serial' port='0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='serial0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </console>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <input type='tablet' bus='usb'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='input0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='usb' bus='0' port='1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <input type='mouse' bus='ps2'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='input1'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <input type='keyboard' bus='ps2'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='input2'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </input>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <listen type='address' address='::0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </graphics>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <audio id='1' type='none'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <video>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='video0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </video>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <watchdog model='itco' action='reset'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='watchdog0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </watchdog>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <memballoon model='virtio'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <stats period='10'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='balloon0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <rng model='virtio'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <backend model='random'>/dev/urandom</backend>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <alias name='rng0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <label>system_u:system_r:svirt_t:s0:c752,c871</label>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c752,c871</imagelabel>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <label>+107:+107</label>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <imagelabel>+107:+107</imagelabel>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </seclabel>
Jan 22 00:16:17 compute-1 nova_compute[182713]: </domain>
Jan 22 00:16:17 compute-1 nova_compute[182713]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.985 182717 WARNING nova.virt.libvirt.driver [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Detaching interface fa:16:3e:ce:53:e7 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap4230f4cb-ad' not found.
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.987 182717 DEBUG nova.virt.libvirt.vif [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:14:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1083808564',display_name='tempest-TestNetworkBasicOps-server-1083808564',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1083808564',id=125,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM6+EaxgHLeoG7CHqeSWXpXKv4K84dI1QQQjZdcrX0T7kqXBTlhE22YQjJTUFxToUxfZEI27WRcAtCoqb6CCdLa4/l//5Lw6nNA8ZjjrkKnh18RWLjWeeCbEVEj1FdVuCQ==',key_name='tempest-TestNetworkBasicOps-2002692629',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:14:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-5bbetpco',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:14:45Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=074fd360-328c-4903-a368-d3890c4a1075,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.987 182717 DEBUG nova.network.os_vif_util [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converting VIF {"id": "4230f4cb-ad57-407e-90c1-b2441b67e135", "address": "fa:16:3e:ce:53:e7", "network": {"id": "dbad5d67-48fa-4452-9764-37918af5f722", "bridge": "br-int", "label": "tempest-network-smoke--1888340030", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4230f4cb-ad", "ovs_interfaceid": "4230f4cb-ad57-407e-90c1-b2441b67e135", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.988 182717 DEBUG nova.network.os_vif_util [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.989 182717 DEBUG os_vif [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.992 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.993 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4230f4cb-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.993 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.997 182717 INFO os_vif [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:53:e7,bridge_name='br-int',has_traffic_filtering=True,id=4230f4cb-ad57-407e-90c1-b2441b67e135,network=Network(dbad5d67-48fa-4452-9764-37918af5f722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4230f4cb-ad')
Jan 22 00:16:17 compute-1 nova_compute[182713]: 2026-01-22 00:16:17.998 182717 DEBUG nova.virt.libvirt.guest [req-44e79423-ec3b-4d7c-bb94-d38ea705fc6d req-b7a0e244-c620-4878-bcb4-8034b983616f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:name>tempest-TestNetworkBasicOps-server-1083808564</nova:name>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:creationTime>2026-01-22 00:16:17</nova:creationTime>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:flavor name="m1.nano">
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:memory>128</nova:memory>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:disk>1</nova:disk>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:swap>0</nova:swap>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </nova:flavor>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:owner>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </nova:owner>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   <nova:ports>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     <nova:port uuid="a06a78d5-548e-4a84-b918-197a54a79f44">
Jan 22 00:16:17 compute-1 nova_compute[182713]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:16:17 compute-1 nova_compute[182713]:     </nova:port>
Jan 22 00:16:17 compute-1 nova_compute[182713]:   </nova:ports>
Jan 22 00:16:17 compute-1 nova_compute[182713]: </nova:instance>
Jan 22 00:16:17 compute-1 nova_compute[182713]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 00:16:18 compute-1 nova_compute[182713]: 2026-01-22 00:16:18.010 182717 DEBUG nova.compute.manager [req-dc991bfd-a244-40c6-895f-5c0015ecc257 req-cfda17f4-f528-432e-9c67-be563de74f11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-vif-plugged-4230f4cb-ad57-407e-90c1-b2441b67e135 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:18 compute-1 nova_compute[182713]: 2026-01-22 00:16:18.010 182717 DEBUG oslo_concurrency.lockutils [req-dc991bfd-a244-40c6-895f-5c0015ecc257 req-cfda17f4-f528-432e-9c67-be563de74f11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:18 compute-1 nova_compute[182713]: 2026-01-22 00:16:18.011 182717 DEBUG oslo_concurrency.lockutils [req-dc991bfd-a244-40c6-895f-5c0015ecc257 req-cfda17f4-f528-432e-9c67-be563de74f11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:18 compute-1 nova_compute[182713]: 2026-01-22 00:16:18.012 182717 DEBUG oslo_concurrency.lockutils [req-dc991bfd-a244-40c6-895f-5c0015ecc257 req-cfda17f4-f528-432e-9c67-be563de74f11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:18 compute-1 nova_compute[182713]: 2026-01-22 00:16:18.012 182717 DEBUG nova.compute.manager [req-dc991bfd-a244-40c6-895f-5c0015ecc257 req-cfda17f4-f528-432e-9c67-be563de74f11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] No waiting events found dispatching network-vif-plugged-4230f4cb-ad57-407e-90c1-b2441b67e135 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:18 compute-1 nova_compute[182713]: 2026-01-22 00:16:18.013 182717 WARNING nova.compute.manager [req-dc991bfd-a244-40c6-895f-5c0015ecc257 req-cfda17f4-f528-432e-9c67-be563de74f11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received unexpected event network-vif-plugged-4230f4cb-ad57-407e-90c1-b2441b67e135 for instance with vm_state active and task_state None.
Jan 22 00:16:18 compute-1 nova_compute[182713]: 2026-01-22 00:16:18.387 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:19 compute-1 nova_compute[182713]: 2026-01-22 00:16:19.237 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040964.2361782, 6763a0f8-8485-4d4a-8418-5b095f3a20ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:16:19 compute-1 nova_compute[182713]: 2026-01-22 00:16:19.238 182717 INFO nova.compute.manager [-] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] VM Stopped (Lifecycle Event)
Jan 22 00:16:19 compute-1 nova_compute[182713]: 2026-01-22 00:16:19.402 182717 DEBUG nova.compute.manager [None req-83636ff8-ea4a-4b88-a3e6-743e9eefd5be - - - - - -] [instance: 6763a0f8-8485-4d4a-8418-5b095f3a20ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:20 compute-1 nova_compute[182713]: 2026-01-22 00:16:20.168 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:21 compute-1 nova_compute[182713]: 2026-01-22 00:16:21.876 182717 INFO nova.compute.manager [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Rescuing
Jan 22 00:16:21 compute-1 nova_compute[182713]: 2026-01-22 00:16:21.877 182717 DEBUG oslo_concurrency.lockutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "refresh_cache-5ba5bafe-ee5b-48f6-aa2f-653708f71f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:16:21 compute-1 nova_compute[182713]: 2026-01-22 00:16:21.877 182717 DEBUG oslo_concurrency.lockutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquired lock "refresh_cache-5ba5bafe-ee5b-48f6-aa2f-653708f71f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:16:21 compute-1 nova_compute[182713]: 2026-01-22 00:16:21.877 182717 DEBUG nova.network.neutron [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:16:22 compute-1 nova_compute[182713]: 2026-01-22 00:16:22.007 182717 INFO nova.network.neutron [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Port 4230f4cb-ad57-407e-90c1-b2441b67e135 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 00:16:22 compute-1 nova_compute[182713]: 2026-01-22 00:16:22.008 182717 DEBUG nova.network.neutron [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:22 compute-1 nova_compute[182713]: 2026-01-22 00:16:22.110 182717 DEBUG oslo_concurrency.lockutils [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.886 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '074fd360-328c-4903-a368-d3890c4a1075', 'name': 'tempest-TestNetworkBasicOps-server-1083808564', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000007d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '34b96b4037d24a0ea19383ca2477b2fd', 'user_id': '833f1e9dce90456ea55a443da6704907', 'hostId': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.892 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000085', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4c6e66779ffe440d9c3270f0328391fb', 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'hostId': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.893 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.934 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.read.latency volume: 198824937 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.935 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.read.latency volume: 28809245 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.971 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.read.latency volume: 124726527 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.972 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.read.latency volume: 379332 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39a82bef-ad20-4818-af9f-e101db1fe8d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 198824937, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-vda', 'timestamp': '2026-01-22T00:16:22.893738', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95508ff6-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': 'd33b545fb102979c8fabd0d1c1e3e7b7d0b030a350c3abcc7d781df201f3ed35'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28809245, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 
'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-sda', 'timestamp': '2026-01-22T00:16:22.893738', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9550a57c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': '9fce8b0096efe795b01af3c34e3ccacde33ce060b250823dbf6dc3d7bc03d013'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 124726527, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-vda', 'timestamp': '2026-01-22T00:16:22.893738', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '955625b0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': '940466394acb8824bd0f5f07e0bc132faab37967b72f8b03c2a47eb58cec7cab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 379332, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-sda', 'timestamp': '2026-01-22T00:16:22.893738', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95563686-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': '48939dba3b68c309e5df074f49084245a31058db85a40d9dc505f7bbf8a26b89'}]}, 'timestamp': '2026-01-22 00:16:22.973214', '_unique_id': '9e06427c0ea2490e82b302cf486f949b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.977 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.982 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:16:22 compute-1 nova_compute[182713]: 2026-01-22 00:16:22.984 182717 DEBUG oslo_concurrency.lockutils [None req-3ddb0b1d-e0b6-4f28-9c7c-469a326f1bab 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "interface-074fd360-328c-4903-a368-d3890c4a1075-4230f4cb-ad57-407e-90c1-b2441b67e135" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 8.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.986 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 074fd360-328c-4903-a368-d3890c4a1075 / tapa06a78d5-54 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.987 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.990 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 / tap53f3c575-cc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.990 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8fdf0be-035e-4157-840e-e1c6dc25e8d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-0000007d-074fd360-328c-4903-a368-d3890c4a1075-tapa06a78d5-54', 'timestamp': '2026-01-22T00:16:22.982922', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'tapa06a78d5-54', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:83:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa06a78d5-54'}, 'message_id': '95588f76-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.690381371, 'message_signature': '8e8ae4f504ae745b0b4c260cd78a4d1d60e0e4c73e940b35592e29874c14d8d6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000085-5ba5bafe-ee5b-48f6-aa2f-653708f71f55-tap53f3c575-cc', 'timestamp': '2026-01-22T00:16:22.982922', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'tap53f3c575-cc', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:0e:3f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53f3c575-cc'}, 'message_id': '9559069a-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.695484938, 'message_signature': '7490b126d5315fa6297802c8237196ef17a2ecc8c3dca0754e8959441e5bbb81'}]}, 'timestamp': '2026-01-22 00:16:22.991213', '_unique_id': 'e75c6cf0ddfb4f84a508b2b1ab512e82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.992 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:22.996 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.009 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.010 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.024 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.025 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f599f77-9652-435e-9f9b-939b4ea8e187', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-vda', 'timestamp': '2026-01-22T00:16:22.996371', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '955be57c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.703656299, 'message_signature': '6bc5869a5130268d32ce782b8057d695866bb5555a0eda77f8bc3b6df40a9232'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-sda', 'timestamp': '2026-01-22T00:16:22.996371', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '955bf8b4-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.703656299, 'message_signature': '37bc397aecbb72c80976cba7b4e651bedf7ebb3bc54e023c6f42bb0e2c95d117'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-vda', 'timestamp': '2026-01-22T00:16:22.996371', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '955e3ca0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.717786794, 'message_signature': 'f6dc013af31689551c100332bec8b565fff090cc032c85d7aeda5d0528889a31'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-sda', 'timestamp': '2026-01-22T00:16:22.996371', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '955e4b50-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.717786794, 'message_signature': '2748608ecb7e2b5b1fd62796c3767339b9eb07c0f72129332877f1fb26d63559'}]}, 'timestamp': '2026-01-22 00:16:23.025701', '_unique_id': '813b505bb47345c2a22be470dafe1dc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.028 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.028 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1083808564>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-343289399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1083808564>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-343289399>]
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.028 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.write.latency volume: 2571362850 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.029 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.029 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.029 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a4e1c79-5a0d-45e6-affd-8ab7823e4fdb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2571362850, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-vda', 'timestamp': '2026-01-22T00:16:23.028696', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '955ecf30-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': 'af2ebcb515ecf3c59543c7e924f186aadb5ce38aaa6379e7b9a523fdf926bfec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-sda', 'timestamp': '2026-01-22T00:16:23.028696', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '955edcfa-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': 'ed1e1d730de7e6665986ff67f54b2a3141f8a1edb5373b01351d8f4bd49709b9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-vda', 'timestamp': '2026-01-22T00:16:23.028696', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '955ee88a-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': '10ff2380fe82fe77786df02da9c194b4eeaca4ba87d556c35dc04a2f21d4d89c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-sda', 'timestamp': '2026-01-22T00:16:23.028696', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '955ef442-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': '5bc3d4da5e567ee7f776472bde955a3bdb111b2402f4eb20813815128103aee5'}]}, 'timestamp': '2026-01-22 00:16:23.030018', '_unique_id': '1fb4c49b9f1c4f35abe716c6b0fda4fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.031 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.032 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/network.incoming.packets volume: 382 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.032 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83ca39d6-42e9-4515-aff9-ed397b093285', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 382, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-0000007d-074fd360-328c-4903-a368-d3890c4a1075-tapa06a78d5-54', 'timestamp': '2026-01-22T00:16:23.032032', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'tapa06a78d5-54', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:83:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa06a78d5-54'}, 'message_id': '955f5018-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.690381371, 'message_signature': 'fe4965f7678c0aa8e549ea41603525d1bec9a0b4c90fbc4f58ee76d15a5983a5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000085-5ba5bafe-ee5b-48f6-aa2f-653708f71f55-tap53f3c575-cc', 'timestamp': '2026-01-22T00:16:23.032032', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'tap53f3c575-cc', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:0e:3f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53f3c575-cc'}, 'message_id': '955f5bbc-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.695484938, 'message_signature': 'c2e338ef6f50c51c44c362b0dc94061c050ee7ecaacb1d0d5bb72f6ad2d0a41c'}]}, 'timestamp': '2026-01-22 00:16:23.032669', '_unique_id': 'b3e3b6b37d4640619a288424052e38cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.033 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.035 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:16:23 compute-1 ovn_controller[94841]: 2026-01-22T00:16:23Z|00526|binding|INFO|Releasing lport c516d686-0754-486d-a980-7442f4c88088 from this chassis (sb_readonly=0)
Jan 22 00:16:23 compute-1 ovn_controller[94841]: 2026-01-22T00:16:23Z|00527|binding|INFO|Releasing lport 6df2a6eb-cd08-41f6-b95a-bc0711ab706f from this chassis (sb_readonly=0)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.051 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/cpu volume: 12720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.078 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/cpu volume: 6010000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c11b2066-619b-48df-93c2-8394a7efe864', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12720000000, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075', 'timestamp': '2026-01-22T00:16:23.035631', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '95624aca-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.758516445, 'message_signature': '97f25b54720ee984a63da2fb35e2ab8b7c58a2fcf630eb735e819c7d6d2957a1'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6010000000, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 
'5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'timestamp': '2026-01-22T00:16:23.035631', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '95667d48-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.786016361, 'message_signature': '38c515f9018d0199ccd0ffc99bd1ab2752f798ff0ae70f0f1b0a2ac123ac8aad'}]}, 'timestamp': '2026-01-22 00:16:23.079415', '_unique_id': '09cf85fc258b49e1adecf6a613a35e44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.080 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.081 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.081 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b57bdb6-3a4b-451b-a66b-2c960c8c63ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-0000007d-074fd360-328c-4903-a368-d3890c4a1075-tapa06a78d5-54', 'timestamp': '2026-01-22T00:16:23.081050', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'tapa06a78d5-54', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:83:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa06a78d5-54'}, 'message_id': '9566c8ac-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.690381371, 'message_signature': 'a57ac0eb636c366aeae8a04713e53fa6954812462434dff633fedf30ed8a3ed2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000085-5ba5bafe-ee5b-48f6-aa2f-653708f71f55-tap53f3c575-cc', 'timestamp': '2026-01-22T00:16:23.081050', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'tap53f3c575-cc', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:0e:3f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53f3c575-cc'}, 'message_id': '9566d176-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.695484938, 'message_signature': 'b62bd43812f7338a69b28687277d3d161748c0320cb85a21bb3564925ed9216f'}]}, 'timestamp': '2026-01-22 00:16:23.081510', '_unique_id': '2086403b7fa74d2b84b2b953eb53b75f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.082 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47f43f8d-45b0-497b-bf0b-ace7998c1b4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-0000007d-074fd360-328c-4903-a368-d3890c4a1075-tapa06a78d5-54', 'timestamp': '2026-01-22T00:16:23.082698', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'tapa06a78d5-54', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:83:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa06a78d5-54'}, 'message_id': '956708c6-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.690381371, 'message_signature': '86ffad9ecbf6d5ea37dc754f0f913ea85215a0812a1be0ae9c35d2e052f4c98d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000085-5ba5bafe-ee5b-48f6-aa2f-653708f71f55-tap53f3c575-cc', 'timestamp': '2026-01-22T00:16:23.082698', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'tap53f3c575-cc', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:0e:3f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53f3c575-cc'}, 'message_id': '95671370-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.695484938, 'message_signature': 'e12aad619dd7b1d317a83df91485af4669be5f1b4b7476ff6565aa25adbbccac'}]}, 'timestamp': '2026-01-22 00:16:23.083207', '_unique_id': '0d012fca210645ceb161b6c5fd1172b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.083 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.084 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/network.outgoing.packets volume: 427 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.084 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a067e28-6818-48d4-9d75-732b42dd1e12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 427, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-0000007d-074fd360-328c-4903-a368-d3890c4a1075-tapa06a78d5-54', 'timestamp': '2026-01-22T00:16:23.084371', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'tapa06a78d5-54', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:83:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa06a78d5-54'}, 'message_id': '956749f8-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.690381371, 'message_signature': '30b010c181b72015f5aa0e99bad3351133865b139bc22246d9abc3039326fd0b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000085-5ba5bafe-ee5b-48f6-aa2f-653708f71f55-tap53f3c575-cc', 'timestamp': '2026-01-22T00:16:23.084371', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'tap53f3c575-cc', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:0e:3f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53f3c575-cc'}, 'message_id': '956753e4-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.695484938, 'message_signature': 'd846a7b390a56f34efb1685544b2b61faf966947bea30e244135dc58c3029b5a'}]}, 'timestamp': '2026-01-22 00:16:23.084898', '_unique_id': 'b470e13fd12744309d759f7a9794fc69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.085 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.086 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/network.outgoing.bytes volume: 65458 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.086 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 nova_compute[182713]: 2026-01-22 00:16:23.086 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19cec83a-c62a-4c16-8f42-20eddc6e6419', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 65458, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-0000007d-074fd360-328c-4903-a368-d3890c4a1075-tapa06a78d5-54', 'timestamp': '2026-01-22T00:16:23.086029', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'tapa06a78d5-54', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:83:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa06a78d5-54'}, 'message_id': '95678abc-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.690381371, 'message_signature': 'b9a6e735c807efd1fd111521aebdbd28d3ae743ec250955f8a5ec2760f18664e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000085-5ba5bafe-ee5b-48f6-aa2f-653708f71f55-tap53f3c575-cc', 'timestamp': '2026-01-22T00:16:23.086029', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'tap53f3c575-cc', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:0e:3f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53f3c575-cc'}, 'message_id': '9567944e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.695484938, 'message_signature': 'e0a98e3a815abae018d03e7b87b2ab07ea7e93035a5d694f2f1b7c6c3f0484c7'}]}, 'timestamp': '2026-01-22 00:16:23.086532', '_unique_id': 'dde6aebe08b14cda852e0fdc204abd88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.087 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/memory.usage volume: 46.6796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 5ba5bafe-ee5b-48f6-aa2f-653708f71f55: ceilometer.compute.pollsters.NoVolumeException
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab0fe5c5-275f-44b8-9fd8-6ba7b5560419', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.6796875, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075', 'timestamp': '2026-01-22T00:16:23.087749', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '9567cec8-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.758516445, 'message_signature': 'd723b6213b7ee4519f8907e8583b44e9a6a3ba72fad0b759f505547740888e22'}]}, 'timestamp': '2026-01-22 00:16:23.088210', '_unique_id': 'a99933874eff4ae7a7b97f474b7bcafd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.088 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.089 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.089 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.089 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1083808564>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-343289399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1083808564>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-343289399>]
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.089 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.089 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4069d43f-23e7-4afe-8677-d101001df7a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-0000007d-074fd360-328c-4903-a368-d3890c4a1075-tapa06a78d5-54', 'timestamp': '2026-01-22T00:16:23.089648', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'tapa06a78d5-54', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:83:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa06a78d5-54'}, 'message_id': '956818c4-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.690381371, 'message_signature': '009353a6ba9f8f38d460034775f70ae25a21f0c76aa165d3348c16fd1b52580f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000085-5ba5bafe-ee5b-48f6-aa2f-653708f71f55-tap53f3c575-cc', 'timestamp': '2026-01-22T00:16:23.089648', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'tap53f3c575-cc', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:0e:3f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53f3c575-cc'}, 'message_id': '95682364-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.695484938, 'message_signature': '5e4c0c5a9719ff76e8734e2cc2e4e96067ad000741234a75a6f88668099cddc6'}]}, 'timestamp': '2026-01-22 00:16:23.090161', '_unique_id': '063fa00b14354aecbe97717571144a08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.090 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.091 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.091 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '705d0292-194c-451c-a87a-8ee0d253653c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-0000007d-074fd360-328c-4903-a368-d3890c4a1075-tapa06a78d5-54', 'timestamp': '2026-01-22T00:16:23.091288', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'tapa06a78d5-54', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:83:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa06a78d5-54'}, 'message_id': '9568580c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.690381371, 'message_signature': '48157e0a53f27571735537a10f7e575b1bce09204f6ea425314d54eabcc6942f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000085-5ba5bafe-ee5b-48f6-aa2f-653708f71f55-tap53f3c575-cc', 'timestamp': '2026-01-22T00:16:23.091288', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'tap53f3c575-cc', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:0e:3f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53f3c575-cc'}, 'message_id': '95686036-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.695484938, 'message_signature': '11948757fcc3fbb2712b0e612ee53d10939ae780eaafe28f6a58e7896696c78a'}]}, 'timestamp': '2026-01-22 00:16:23.091717', '_unique_id': '48f82cfaedcb4512b06396c746bd788b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.092 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.093 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.093 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.093 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '397f6b1b-7b91-400a-b491-a598d41b7269', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-vda', 'timestamp': '2026-01-22T00:16:23.092799', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '956893b2-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.703656299, 'message_signature': '6b6ec4515cbe4e96de61eea9cd1a4553ea8148988de730a13bc27ebc9f167c79'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 
'074fd360-328c-4903-a368-d3890c4a1075-sda', 'timestamp': '2026-01-22T00:16:23.092799', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95689b6e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.703656299, 'message_signature': 'a39d5694fb581834b7b15ac7e346c56e0a56f62ff9392020723d61e156dd7d21'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-vda', 'timestamp': '2026-01-22T00:16:23.092799', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9568a3de-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.717786794, 'message_signature': 'f88b19a483895f8391fe7fa9362f919f39e14105bee6e6c2329f2fa467d4b567'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-sda', 'timestamp': '2026-01-22T00:16:23.092799', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9568ab68-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.717786794, 'message_signature': '876f1f72ce411b110925c550a7cb3828535a317c897e149f390ac7706cac39da'}]}, 'timestamp': '2026-01-22 00:16:23.093632', '_unique_id': 'defe88f35c1442039f880688e89e35ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.094 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1083808564>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-343289399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1083808564>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-343289399>]
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.095 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.usage volume: 30146560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.095 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.095 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.095 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '314bbde0-f71a-4e0d-a688-4c43efc6f8dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30146560, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-vda', 'timestamp': '2026-01-22T00:16:23.095132', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9568eede-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.703656299, 'message_signature': '22940994ca396c0ac3b740075f9240761007e1e508b631bd8b6493ad561f7a7c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 
'074fd360-328c-4903-a368-d3890c4a1075-sda', 'timestamp': '2026-01-22T00:16:23.095132', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9568f71c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.703656299, 'message_signature': '2004c628cba17ab0b0f911db0a5768034378845c953f9dda6976b0cd7a248bd0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-vda', 'timestamp': '2026-01-22T00:16:23.095132', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9568feec-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.717786794, 'message_signature': 'd20e22d6b59482921d0c384544615052a1ba49379c9a909db472533d938f350c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-sda', 'timestamp': '2026-01-22T00:16:23.095132', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '956906ee-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.717786794, 'message_signature': '7b65148a14bde774235393769959d1436eb6cf3734814f46b47cfeb3f0b7885d'}]}, 'timestamp': '2026-01-22 00:16:23.095976', '_unique_id': '2feb6635b7df447c99e3ebdb4d652d72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.096 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.097 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.097 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.read.bytes volume: 29915648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.097 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.097 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.097 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8c2efcc-9344-4bc1-bc9c-27ba5994da8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29915648, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-vda', 'timestamp': '2026-01-22T00:16:23.097094', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95693b0a-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': '6d6ce65086381008cf9ea694b50462fcbec6697f8769608b63a45c2783d0f20e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 
'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-sda', 'timestamp': '2026-01-22T00:16:23.097094', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '956942da-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': '6e1180f12631636cc1d4792fe0f529038778e757bb581cf0c1295e19079afe9e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-vda', 'timestamp': '2026-01-22T00:16:23.097094', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95694cf8-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': '22d714da9b623954ace7da7a13081f7f96d00b43388f72a04f71b0338c7739ac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-sda', 'timestamp': '2026-01-22T00:16:23.097094', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '956955c2-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': '6febe15be4e752ab94755f0ba2cad4172c19340cc1034f7842aa54deb9705a6f'}]}, 'timestamp': '2026-01-22 00:16:23.098034', '_unique_id': 'ebcddab3c8b84f2ba6ab6a97b2899da6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.098 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.099 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/network.incoming.bytes volume: 74278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.099 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a79e9d00-da57-4a18-8ba4-000e6ed7a0a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 74278, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-0000007d-074fd360-328c-4903-a368-d3890c4a1075-tapa06a78d5-54', 'timestamp': '2026-01-22T00:16:23.099509', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'tapa06a78d5-54', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:83:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa06a78d5-54'}, 'message_id': '95699aa0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.690381371, 'message_signature': '36860fe8a38523ba007e4dc5561ebcebceb32b7bd2fa26e7ebe9086b0f4e4006'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000085-5ba5bafe-ee5b-48f6-aa2f-653708f71f55-tap53f3c575-cc', 'timestamp': '2026-01-22T00:16:23.099509', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'tap53f3c575-cc', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:0e:3f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53f3c575-cc'}, 'message_id': '9569a6da-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.695484938, 'message_signature': 'b443626f7a23d4f378ed0225f91b124c99eb4fc431e61b01a3aa308c68e38a82'}]}, 'timestamp': '2026-01-22 00:16:23.100135', '_unique_id': 'e96bd88b08064c0290fa9bff384a3a64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.100 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.101 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.101 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.write.requests volume: 355 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.102 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.102 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.102 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41af5561-fb1f-4c2c-8cd8-d7cb995a14ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 355, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-vda', 'timestamp': '2026-01-22T00:16:23.101670', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9569efd2-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': 'aa04051a640f786d05b1d8d40ba107835164876734ac7f3ac280e67473b1f835'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': 
None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-sda', 'timestamp': '2026-01-22T00:16:23.101670', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9569fc3e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': '093bfa22e20de6b8e6d0f1c7b8e9cd12e13d848f0f9b83d4ba727f6814b85068'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-vda', 'timestamp': '2026-01-22T00:16:23.101670', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '956a07b0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': 'b2766c2c7b1892e79dee00da110835e75e7519780c18bcd70e8d7b27a99d83a4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-sda', 'timestamp': '2026-01-22T00:16:23.101670', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '956a1462-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': 'c6b1f926393a2578150b3c56ac15cf0dee5efdc2f365cdb0fcc54526cfd70242'}]}, 'timestamp': '2026-01-22 00:16:23.102963', '_unique_id': 'e5018d69a97b491fb60e4b04a717825a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.103 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.104 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.104 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b9ff42f-e93a-4aab-b9e1-c41204af1409', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-0000007d-074fd360-328c-4903-a368-d3890c4a1075-tapa06a78d5-54', 'timestamp': '2026-01-22T00:16:23.104547', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'tapa06a78d5-54', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3e:83:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa06a78d5-54'}, 'message_id': '956a605c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.690381371, 'message_signature': '8050b534ef09575c1e08d3488f8e69033eb33b87de7d8c96431632ffb697f603'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000085-5ba5bafe-ee5b-48f6-aa2f-653708f71f55-tap53f3c575-cc', 'timestamp': '2026-01-22T00:16:23.104547', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'tap53f3c575-cc', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:0e:3f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53f3c575-cc'}, 'message_id': '956a6d2c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.695484938, 'message_signature': '5895db0993eb28ef0599ab85f7d3297455bfa9f5eb4b16cd89a85e92edf7b7cc'}]}, 'timestamp': '2026-01-22 00:16:23.105215', '_unique_id': 'b8ee71d19ba1449f888d50d982d38d49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.105 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.106 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.write.bytes volume: 73281536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.107 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.107 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.107 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'abf8f766-14b1-4c5d-9310-4cbdb85f94fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73281536, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-vda', 'timestamp': '2026-01-22T00:16:23.106701', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '956ab322-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': '9acb8d5d48dc1a28c426984396af5f18d9084684c6b0584eab54c2495b76578d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 
'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-sda', 'timestamp': '2026-01-22T00:16:23.106701', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '956abe80-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': '301d8975df641c4e664ce0c11048badeb92593b2e001e24788414e0328c45e38'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-vda', 'timestamp': '2026-01-22T00:16:23.106701', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '956ac70e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': '9dbbc7a8590309ee81c2286c938483d27b909a83da2035798555e014bd589698'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-sda', 'timestamp': '2026-01-22T00:16:23.106701', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '956ace70-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': '847cb2816bbf1c7dc448a6e6b49b6d21fb12b1f39e4ec10ec5df782929dbc30c'}]}, 'timestamp': '2026-01-22 00:16:23.107635', '_unique_id': '9c09eafce5704abfbaa75401b2198531'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.108 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.109 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1083808564>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-343289399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1083808564>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-343289399>]
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.109 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.109 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.read.requests volume: 1077 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.109 12 DEBUG ceilometer.compute.pollsters [-] 074fd360-328c-4903-a368-d3890c4a1075/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.109 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.110 12 DEBUG ceilometer.compute.pollsters [-] 5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4acb05e-24ea-443d-85c8-17666119ec33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1077, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-vda', 'timestamp': '2026-01-22T00:16:23.109272', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '956b17b8-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': 'e90dc25f723327d272bcd9b34009c04c10abbc561518e3866574d1bd08d95703'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': 
None, 'resource_id': '074fd360-328c-4903-a368-d3890c4a1075-sda', 'timestamp': '2026-01-22T00:16:23.109272', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1083808564', 'name': 'instance-0000007d', 'instance_id': '074fd360-328c-4903-a368-d3890c4a1075', 'instance_type': 'm1.nano', 'host': '0b8bd15b4429a8b3db4045e831c64597b4205d6950c39c7abf926666', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '956b22da-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.602239022, 'message_signature': '1faa8b0a9c967b20e2bf3cb271b525464cd461b60775cbf2335271fcaacffc1f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-vda', 'timestamp': '2026-01-22T00:16:23.109272', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '956b2e24-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': '37bf07d0e036eae74a8eff86d7db3c114062008a42a3c303d555a8e83ee52ba6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55-sda', 'timestamp': '2026-01-22T00:16:23.109272', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-343289399', 'name': 'instance-00000085', 'instance_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'instance_type': 'm1.nano', 'host': '36cb94be136c59f96bdf8037627fe2a95db3a078b538a9314d820ad0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '956b3900-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5555.643772778, 'message_signature': '1480e2bc71e202514c126c4419eb0ee027368d75669bcf0ce694bd0292b7581a'}]}, 'timestamp': '2026-01-22 00:16:23.110400', '_unique_id': '53aba5cf5e1944129c27f8fc149b1ddd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:16:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:16:23.111 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:16:23 compute-1 nova_compute[182713]: 2026-01-22 00:16:23.390 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:24 compute-1 podman[231622]: 2026-01-22 00:16:24.60242266 +0000 UTC m=+0.093243676 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.172 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.322 182717 DEBUG nova.compute.manager [req-f2155701-5969-47ff-a7e6-6b443529991d req-66d10072-34fb-47b8-8ee8-75aa8ae18dc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-changed-a06a78d5-548e-4a84-b918-197a54a79f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.323 182717 DEBUG nova.compute.manager [req-f2155701-5969-47ff-a7e6-6b443529991d req-66d10072-34fb-47b8-8ee8-75aa8ae18dc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Refreshing instance network info cache due to event network-changed-a06a78d5-548e-4a84-b918-197a54a79f44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.323 182717 DEBUG oslo_concurrency.lockutils [req-f2155701-5969-47ff-a7e6-6b443529991d req-66d10072-34fb-47b8-8ee8-75aa8ae18dc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.323 182717 DEBUG oslo_concurrency.lockutils [req-f2155701-5969-47ff-a7e6-6b443529991d req-66d10072-34fb-47b8-8ee8-75aa8ae18dc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.324 182717 DEBUG nova.network.neutron [req-f2155701-5969-47ff-a7e6-6b443529991d req-66d10072-34fb-47b8-8ee8-75aa8ae18dc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Refreshing network info cache for port a06a78d5-548e-4a84-b918-197a54a79f44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.748 182717 DEBUG nova.network.neutron [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Updating instance_info_cache with network_info: [{"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.815 182717 DEBUG oslo_concurrency.lockutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Releasing lock "refresh_cache-5ba5bafe-ee5b-48f6-aa2f-653708f71f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.828 182717 DEBUG oslo_concurrency.lockutils [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.828 182717 DEBUG oslo_concurrency.lockutils [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.829 182717 DEBUG oslo_concurrency.lockutils [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.829 182717 DEBUG oslo_concurrency.lockutils [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.829 182717 DEBUG oslo_concurrency.lockutils [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.865 182717 INFO nova.compute.manager [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Terminating instance
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.928 182717 DEBUG nova.compute.manager [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:16:25 compute-1 kernel: tapa06a78d5-54 (unregistering): left promiscuous mode
Jan 22 00:16:25 compute-1 NetworkManager[54952]: <info>  [1769040985.9621] device (tapa06a78d5-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:16:25 compute-1 ovn_controller[94841]: 2026-01-22T00:16:25Z|00528|binding|INFO|Releasing lport a06a78d5-548e-4a84-b918-197a54a79f44 from this chassis (sb_readonly=0)
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.976 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:25 compute-1 ovn_controller[94841]: 2026-01-22T00:16:25Z|00529|binding|INFO|Setting lport a06a78d5-548e-4a84-b918-197a54a79f44 down in Southbound
Jan 22 00:16:25 compute-1 ovn_controller[94841]: 2026-01-22T00:16:25Z|00530|binding|INFO|Removing iface tapa06a78d5-54 ovn-installed in OVS
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.981 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:25 compute-1 nova_compute[182713]: 2026-01-22 00:16:25.996 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.007 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:83:df 10.100.0.3'], port_security=['fa:16:3e:3e:83:df 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '074fd360-328c-4903-a368-d3890c4a1075', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89b3c74e-a4f2-4889-901d-aba21eee4bda', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b431dee-6ff2-4ce1-b240-ed1059a68730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=011f79f2-8e1f-476b-a77e-56d133ce3969, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=a06a78d5-548e-4a84-b918-197a54a79f44) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.009 104184 INFO neutron.agent.ovn.metadata.agent [-] Port a06a78d5-548e-4a84-b918-197a54a79f44 in datapath 89b3c74e-a4f2-4889-901d-aba21eee4bda unbound from our chassis
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.010 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89b3c74e-a4f2-4889-901d-aba21eee4bda, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.012 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d78a8227-9b25-4eb3-872e-9a40f6b776a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.012 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda namespace which is not needed anymore
Jan 22 00:16:26 compute-1 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007d.scope: Deactivated successfully.
Jan 22 00:16:26 compute-1 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007d.scope: Consumed 18.093s CPU time.
Jan 22 00:16:26 compute-1 systemd-machined[153970]: Machine qemu-57-instance-0000007d terminated.
Jan 22 00:16:26 compute-1 neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda[230757]: [NOTICE]   (230761) : haproxy version is 2.8.14-c23fe91
Jan 22 00:16:26 compute-1 neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda[230757]: [NOTICE]   (230761) : path to executable is /usr/sbin/haproxy
Jan 22 00:16:26 compute-1 neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda[230757]: [WARNING]  (230761) : Exiting Master process...
Jan 22 00:16:26 compute-1 neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda[230757]: [ALERT]    (230761) : Current worker (230763) exited with code 143 (Terminated)
Jan 22 00:16:26 compute-1 neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda[230757]: [WARNING]  (230761) : All workers exited. Exiting... (0)
Jan 22 00:16:26 compute-1 systemd[1]: libpod-6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716.scope: Deactivated successfully.
Jan 22 00:16:26 compute-1 podman[231664]: 2026-01-22 00:16:26.155192735 +0000 UTC m=+0.054403474 container died 6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.187 182717 INFO nova.virt.libvirt.driver [-] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Instance destroyed successfully.
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.188 182717 DEBUG nova.objects.instance [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 074fd360-328c-4903-a368-d3890c4a1075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:26 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716-userdata-shm.mount: Deactivated successfully.
Jan 22 00:16:26 compute-1 systemd[1]: var-lib-containers-storage-overlay-1e2ac5dc7176e09e8f0962c01089c8533476a1e2c9cc9c7f0f58336d68c1b569-merged.mount: Deactivated successfully.
Jan 22 00:16:26 compute-1 podman[231664]: 2026-01-22 00:16:26.203020325 +0000 UTC m=+0.102231034 container cleanup 6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:16:26 compute-1 systemd[1]: libpod-conmon-6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716.scope: Deactivated successfully.
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.225 182717 DEBUG nova.virt.libvirt.vif [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:14:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1083808564',display_name='tempest-TestNetworkBasicOps-server-1083808564',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1083808564',id=125,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM6+EaxgHLeoG7CHqeSWXpXKv4K84dI1QQQjZdcrX0T7kqXBTlhE22YQjJTUFxToUxfZEI27WRcAtCoqb6CCdLa4/l//5Lw6nNA8ZjjrkKnh18RWLjWeeCbEVEj1FdVuCQ==',key_name='tempest-TestNetworkBasicOps-2002692629',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:14:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-5bbetpco',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:14:45Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=074fd360-328c-4903-a368-d3890c4a1075,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.225 182717 DEBUG nova.network.os_vif_util [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.226 182717 DEBUG nova.network.os_vif_util [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:83:df,bridge_name='br-int',has_traffic_filtering=True,id=a06a78d5-548e-4a84-b918-197a54a79f44,network=Network(89b3c74e-a4f2-4889-901d-aba21eee4bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa06a78d5-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.226 182717 DEBUG os_vif [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:83:df,bridge_name='br-int',has_traffic_filtering=True,id=a06a78d5-548e-4a84-b918-197a54a79f44,network=Network(89b3c74e-a4f2-4889-901d-aba21eee4bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa06a78d5-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.228 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.228 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa06a78d5-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.230 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.231 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.234 182717 INFO os_vif [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:83:df,bridge_name='br-int',has_traffic_filtering=True,id=a06a78d5-548e-4a84-b918-197a54a79f44,network=Network(89b3c74e-a4f2-4889-901d-aba21eee4bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa06a78d5-54')
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.234 182717 INFO nova.virt.libvirt.driver [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Deleting instance files /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075_del
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.235 182717 INFO nova.virt.libvirt.driver [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Deletion of /var/lib/nova/instances/074fd360-328c-4903-a368-d3890c4a1075_del complete
Jan 22 00:16:26 compute-1 podman[231713]: 2026-01-22 00:16:26.272127229 +0000 UTC m=+0.044545660 container remove 6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.284 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d903db70-be46-4858-92a1-ab2e7f0277b0]: (4, ('Thu Jan 22 12:16:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda (6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716)\n6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716\nThu Jan 22 12:16:26 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda (6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716)\n6646af73bc62b968de3d0ae7f8d9abc852f2eaaf3bf95786838997a35d8c8716\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.285 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9f985e90-9a57-4f6e-bd29-32b742f8e31a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.286 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89b3c74e-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.287 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:26 compute-1 kernel: tap89b3c74e-a0: left promiscuous mode
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.301 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.305 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[88235a5f-e42c-4fd2-b395-38c9f57a71f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.325 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7223d3c8-e42e-4d58-806e-afd4ad49e7df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.326 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c8847a9e-0487-406a-8b88-5b0266e7eb65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.346 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[942bee82-77bb-4941-b865-de9fb543e88f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545636, 'reachable_time': 29699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231725, 'error': None, 'target': 'ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:26 compute-1 systemd[1]: run-netns-ovnmeta\x2d89b3c74e\x2da4f2\x2d4889\x2d901d\x2daba21eee4bda.mount: Deactivated successfully.
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.351 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89b3c74e-a4f2-4889-901d-aba21eee4bda deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:16:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:26.352 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[399dd66e-13f3-4538-b0df-c22c94d61c28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.663 182717 INFO nova.compute.manager [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Took 0.73 seconds to destroy the instance on the hypervisor.
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.664 182717 DEBUG oslo.service.loopingcall [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.664 182717 DEBUG nova.compute.manager [-] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.664 182717 DEBUG nova.network.neutron [-] [instance: 074fd360-328c-4903-a368-d3890c4a1075] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.856 182717 DEBUG nova.compute.manager [req-72476105-f3ca-4329-b60c-c0ee8701f123 req-84321536-59dc-4b3a-8e80-e7cc5bccc7bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-vif-unplugged-a06a78d5-548e-4a84-b918-197a54a79f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.857 182717 DEBUG oslo_concurrency.lockutils [req-72476105-f3ca-4329-b60c-c0ee8701f123 req-84321536-59dc-4b3a-8e80-e7cc5bccc7bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.857 182717 DEBUG oslo_concurrency.lockutils [req-72476105-f3ca-4329-b60c-c0ee8701f123 req-84321536-59dc-4b3a-8e80-e7cc5bccc7bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.857 182717 DEBUG oslo_concurrency.lockutils [req-72476105-f3ca-4329-b60c-c0ee8701f123 req-84321536-59dc-4b3a-8e80-e7cc5bccc7bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.857 182717 DEBUG nova.compute.manager [req-72476105-f3ca-4329-b60c-c0ee8701f123 req-84321536-59dc-4b3a-8e80-e7cc5bccc7bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] No waiting events found dispatching network-vif-unplugged-a06a78d5-548e-4a84-b918-197a54a79f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.858 182717 DEBUG nova.compute.manager [req-72476105-f3ca-4329-b60c-c0ee8701f123 req-84321536-59dc-4b3a-8e80-e7cc5bccc7bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-vif-unplugged-a06a78d5-548e-4a84-b918-197a54a79f44 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:16:26 compute-1 nova_compute[182713]: 2026-01-22 00:16:26.907 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:16:27 compute-1 podman[231728]: 2026-01-22 00:16:27.590426867 +0000 UTC m=+0.075357537 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 00:16:28 compute-1 nova_compute[182713]: 2026-01-22 00:16:28.392 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.331 182717 DEBUG nova.compute.manager [req-aa6eb7b6-77f5-4bab-ad1a-dd06291a2b67 req-a5c013b2-d0a7-4888-84a2-3594b9048ef4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-vif-plugged-a06a78d5-548e-4a84-b918-197a54a79f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.332 182717 DEBUG oslo_concurrency.lockutils [req-aa6eb7b6-77f5-4bab-ad1a-dd06291a2b67 req-a5c013b2-d0a7-4888-84a2-3594b9048ef4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "074fd360-328c-4903-a368-d3890c4a1075-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.332 182717 DEBUG oslo_concurrency.lockutils [req-aa6eb7b6-77f5-4bab-ad1a-dd06291a2b67 req-a5c013b2-d0a7-4888-84a2-3594b9048ef4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.333 182717 DEBUG oslo_concurrency.lockutils [req-aa6eb7b6-77f5-4bab-ad1a-dd06291a2b67 req-a5c013b2-d0a7-4888-84a2-3594b9048ef4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.333 182717 DEBUG nova.compute.manager [req-aa6eb7b6-77f5-4bab-ad1a-dd06291a2b67 req-a5c013b2-d0a7-4888-84a2-3594b9048ef4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] No waiting events found dispatching network-vif-plugged-a06a78d5-548e-4a84-b918-197a54a79f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.334 182717 WARNING nova.compute.manager [req-aa6eb7b6-77f5-4bab-ad1a-dd06291a2b67 req-a5c013b2-d0a7-4888-84a2-3594b9048ef4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received unexpected event network-vif-plugged-a06a78d5-548e-4a84-b918-197a54a79f44 for instance with vm_state active and task_state deleting.
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.455 182717 DEBUG nova.network.neutron [-] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.574 182717 DEBUG nova.compute.manager [req-932186d5-3bbd-44b9-b40c-9993952f8a4c req-ce0a3699-07d1-4b5d-950e-4aec5cfd1093 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Received event network-vif-deleted-a06a78d5-548e-4a84-b918-197a54a79f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.575 182717 INFO nova.compute.manager [req-932186d5-3bbd-44b9-b40c-9993952f8a4c req-ce0a3699-07d1-4b5d-950e-4aec5cfd1093 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Neutron deleted interface a06a78d5-548e-4a84-b918-197a54a79f44; detaching it from the instance and deleting it from the info cache
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.575 182717 DEBUG nova.network.neutron [req-932186d5-3bbd-44b9-b40c-9993952f8a4c req-ce0a3699-07d1-4b5d-950e-4aec5cfd1093 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.615 182717 DEBUG nova.network.neutron [req-f2155701-5969-47ff-a7e6-6b443529991d req-66d10072-34fb-47b8-8ee8-75aa8ae18dc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updated VIF entry in instance network info cache for port a06a78d5-548e-4a84-b918-197a54a79f44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.616 182717 DEBUG nova.network.neutron [req-f2155701-5969-47ff-a7e6-6b443529991d req-66d10072-34fb-47b8-8ee8-75aa8ae18dc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Updating instance_info_cache with network_info: [{"id": "a06a78d5-548e-4a84-b918-197a54a79f44", "address": "fa:16:3e:3e:83:df", "network": {"id": "89b3c74e-a4f2-4889-901d-aba21eee4bda", "bridge": "br-int", "label": "tempest-network-smoke--2024090277", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa06a78d5-54", "ovs_interfaceid": "a06a78d5-548e-4a84-b918-197a54a79f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:29 compute-1 ovn_controller[94841]: 2026-01-22T00:16:29Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9f:0e:3f 10.100.0.12
Jan 22 00:16:29 compute-1 ovn_controller[94841]: 2026-01-22T00:16:29Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:0e:3f 10.100.0.12
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.829 182717 INFO nova.compute.manager [-] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Took 3.16 seconds to deallocate network for instance.
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.858 182717 DEBUG nova.compute.manager [req-932186d5-3bbd-44b9-b40c-9993952f8a4c req-ce0a3699-07d1-4b5d-950e-4aec5cfd1093 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Detach interface failed, port_id=a06a78d5-548e-4a84-b918-197a54a79f44, reason: Instance 074fd360-328c-4903-a368-d3890c4a1075 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:16:29 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.874 182717 DEBUG oslo_concurrency.lockutils [req-f2155701-5969-47ff-a7e6-6b443529991d req-66d10072-34fb-47b8-8ee8-75aa8ae18dc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-074fd360-328c-4903-a368-d3890c4a1075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:16:30 compute-1 nova_compute[182713]: 2026-01-22 00:16:29.999 182717 DEBUG oslo_concurrency.lockutils [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:30 compute-1 nova_compute[182713]: 2026-01-22 00:16:30.000 182717 DEBUG oslo_concurrency.lockutils [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:30 compute-1 nova_compute[182713]: 2026-01-22 00:16:30.146 182717 DEBUG nova.compute.provider_tree [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:16:30 compute-1 nova_compute[182713]: 2026-01-22 00:16:30.178 182717 DEBUG nova.scheduler.client.report [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:16:30 compute-1 nova_compute[182713]: 2026-01-22 00:16:30.213 182717 DEBUG oslo_concurrency.lockutils [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:30 compute-1 nova_compute[182713]: 2026-01-22 00:16:30.288 182717 INFO nova.scheduler.client.report [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 074fd360-328c-4903-a368-d3890c4a1075
Jan 22 00:16:30 compute-1 nova_compute[182713]: 2026-01-22 00:16:30.392 182717 DEBUG oslo_concurrency.lockutils [None req-b14e5bbb-d288-4f51-b3a1-df3756c2cdc2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "074fd360-328c-4903-a368-d3890c4a1075" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:31 compute-1 nova_compute[182713]: 2026-01-22 00:16:31.073 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:31 compute-1 nova_compute[182713]: 2026-01-22 00:16:31.230 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:33 compute-1 nova_compute[182713]: 2026-01-22 00:16:33.394 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:36 compute-1 nova_compute[182713]: 2026-01-22 00:16:36.233 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:36 compute-1 nova_compute[182713]: 2026-01-22 00:16:36.952 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:16:37 compute-1 ovn_controller[94841]: 2026-01-22T00:16:37Z|00531|binding|INFO|Releasing lport c516d686-0754-486d-a980-7442f4c88088 from this chassis (sb_readonly=0)
Jan 22 00:16:37 compute-1 nova_compute[182713]: 2026-01-22 00:16:37.132 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:37 compute-1 ovn_controller[94841]: 2026-01-22T00:16:37Z|00532|binding|INFO|Releasing lport c516d686-0754-486d-a980-7442f4c88088 from this chassis (sb_readonly=0)
Jan 22 00:16:37 compute-1 nova_compute[182713]: 2026-01-22 00:16:37.236 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:38 compute-1 nova_compute[182713]: 2026-01-22 00:16:38.396 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:39 compute-1 kernel: tap53f3c575-cc (unregistering): left promiscuous mode
Jan 22 00:16:39 compute-1 NetworkManager[54952]: <info>  [1769040999.1913] device (tap53f3c575-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:16:39 compute-1 nova_compute[182713]: 2026-01-22 00:16:39.197 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:39 compute-1 ovn_controller[94841]: 2026-01-22T00:16:39Z|00533|binding|INFO|Releasing lport 53f3c575-ccc8-4fde-a256-9598ddf6cdaf from this chassis (sb_readonly=0)
Jan 22 00:16:39 compute-1 ovn_controller[94841]: 2026-01-22T00:16:39Z|00534|binding|INFO|Setting lport 53f3c575-ccc8-4fde-a256-9598ddf6cdaf down in Southbound
Jan 22 00:16:39 compute-1 ovn_controller[94841]: 2026-01-22T00:16:39Z|00535|binding|INFO|Removing iface tap53f3c575-cc ovn-installed in OVS
Jan 22 00:16:39 compute-1 nova_compute[182713]: 2026-01-22 00:16:39.228 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:39 compute-1 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000085.scope: Deactivated successfully.
Jan 22 00:16:39 compute-1 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000085.scope: Consumed 14.343s CPU time.
Jan 22 00:16:39 compute-1 systemd-machined[153970]: Machine qemu-59-instance-00000085 terminated.
Jan 22 00:16:39 compute-1 podman[231765]: 2026-01-22 00:16:39.283434603 +0000 UTC m=+0.069886439 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:16:39 compute-1 podman[231762]: 2026-01-22 00:16:39.33964232 +0000 UTC m=+0.117525983 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.413 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:0e:3f 10.100.0.12'], port_security=['fa:16:3e:9f:0e:3f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55594f65-206f-4b2a-a4ed-c049861ef480', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4c6e66779ffe440d9c3270f0328391fb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99a05b18-dc2e-46bb-b7ee-4bfce96057f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f515f91a-3ddc-47bf-8aaf-753c19e78de1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=53f3c575-ccc8-4fde-a256-9598ddf6cdaf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.414 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 53f3c575-ccc8-4fde-a256-9598ddf6cdaf in datapath 55594f65-206f-4b2a-a4ed-c049861ef480 unbound from our chassis
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.416 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55594f65-206f-4b2a-a4ed-c049861ef480, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.417 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[df659cc8-c130-48f3-b2d6-a12b75aac3c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.417 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 namespace which is not needed anymore
Jan 22 00:16:39 compute-1 kernel: tap53f3c575-cc: entered promiscuous mode
Jan 22 00:16:39 compute-1 kernel: tap53f3c575-cc (unregistering): left promiscuous mode
Jan 22 00:16:39 compute-1 NetworkManager[54952]: <info>  [1769040999.4438] manager: (tap53f3c575-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/241)
Jan 22 00:16:39 compute-1 nova_compute[182713]: 2026-01-22 00:16:39.449 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:39 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[231537]: [NOTICE]   (231541) : haproxy version is 2.8.14-c23fe91
Jan 22 00:16:39 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[231537]: [NOTICE]   (231541) : path to executable is /usr/sbin/haproxy
Jan 22 00:16:39 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[231537]: [WARNING]  (231541) : Exiting Master process...
Jan 22 00:16:39 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[231537]: [ALERT]    (231541) : Current worker (231543) exited with code 143 (Terminated)
Jan 22 00:16:39 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[231537]: [WARNING]  (231541) : All workers exited. Exiting... (0)
Jan 22 00:16:39 compute-1 systemd[1]: libpod-4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1.scope: Deactivated successfully.
Jan 22 00:16:39 compute-1 podman[231846]: 2026-01-22 00:16:39.590213652 +0000 UTC m=+0.051301348 container died 4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:16:39 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1-userdata-shm.mount: Deactivated successfully.
Jan 22 00:16:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-448da1fae413fd9431a82ee057dc5d8a5f546f4518c8cd6a6c9da0fe1095df74-merged.mount: Deactivated successfully.
Jan 22 00:16:39 compute-1 podman[231846]: 2026-01-22 00:16:39.624256209 +0000 UTC m=+0.085343895 container cleanup 4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 00:16:39 compute-1 systemd[1]: libpod-conmon-4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1.scope: Deactivated successfully.
Jan 22 00:16:39 compute-1 podman[231877]: 2026-01-22 00:16:39.716891496 +0000 UTC m=+0.051552266 container remove 4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.722 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c7107c-9c55-48fa-8183-7fe3555f9d29]: (4, ('Thu Jan 22 12:16:39 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 (4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1)\n4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1\nThu Jan 22 12:16:39 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 (4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1)\n4c061ce4c87bd8982286d649c8e61dd380707f7e5749988dc5059a395fe011a1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.723 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c89847df-7646-4cc3-9dfe-ecddfd33cd6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.724 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55594f65-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:39 compute-1 kernel: tap55594f65-20: left promiscuous mode
Jan 22 00:16:39 compute-1 nova_compute[182713]: 2026-01-22 00:16:39.725 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:39 compute-1 nova_compute[182713]: 2026-01-22 00:16:39.742 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:39 compute-1 nova_compute[182713]: 2026-01-22 00:16:39.743 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.747 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[860ba29c-0239-4402-a69a-5bcfb8363969]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.765 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f522c08b-fc2e-4f50-b1f1-70eaa60072f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.766 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8185c92f-2168-4fc9-bc88-14fd58abf87f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.791 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a1251a03-1cc7-4c9f-bdce-7a28e495628a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554753, 'reachable_time': 36103, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231896, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:39 compute-1 systemd[1]: run-netns-ovnmeta\x2d55594f65\x2d206f\x2d4b2a\x2da4ed\x2dc049861ef480.mount: Deactivated successfully.
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.796 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:16:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:39.796 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[023bf7b6-ae3b-4fc4-b582-c9124853f632]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:39 compute-1 nova_compute[182713]: 2026-01-22 00:16:39.969 182717 INFO nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Instance shutdown successfully after 13 seconds.
Jan 22 00:16:39 compute-1 nova_compute[182713]: 2026-01-22 00:16:39.979 182717 INFO nova.virt.libvirt.driver [-] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Instance destroyed successfully.
Jan 22 00:16:39 compute-1 nova_compute[182713]: 2026-01-22 00:16:39.980 182717 DEBUG nova.objects.instance [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'numa_topology' on Instance uuid 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.000 182717 INFO nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Attempting rescue
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.001 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.008 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.008 182717 INFO nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Creating image(s)
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.010 182717 DEBUG oslo_concurrency.lockutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.010 182717 DEBUG oslo_concurrency.lockutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.011 182717 DEBUG oslo_concurrency.lockutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.012 182717 DEBUG nova.objects.instance [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.074 182717 DEBUG oslo_concurrency.lockutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.075 182717 DEBUG oslo_concurrency.lockutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.098 182717 DEBUG oslo_concurrency.processutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.187 182717 DEBUG oslo_concurrency.processutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.189 182717 DEBUG oslo_concurrency.processutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.224 182717 DEBUG oslo_concurrency.processutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.rescue" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.225 182717 DEBUG oslo_concurrency.lockutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.226 182717 DEBUG nova.objects.instance [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'migration_context' on Instance uuid 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.244 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.245 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Start _get_guest_xml network_info=[{"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "vif_mac": "fa:16:3e:9f:0e:3f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.245 182717 DEBUG nova.objects.instance [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'resources' on Instance uuid 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.265 182717 WARNING nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.276 182717 DEBUG nova.virt.libvirt.host [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.277 182717 DEBUG nova.virt.libvirt.host [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.283 182717 DEBUG nova.virt.libvirt.host [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.284 182717 DEBUG nova.virt.libvirt.host [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.285 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.285 182717 DEBUG nova.virt.hardware [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.285 182717 DEBUG nova.virt.hardware [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.286 182717 DEBUG nova.virt.hardware [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.286 182717 DEBUG nova.virt.hardware [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.286 182717 DEBUG nova.virt.hardware [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.286 182717 DEBUG nova.virt.hardware [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.286 182717 DEBUG nova.virt.hardware [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.287 182717 DEBUG nova.virt.hardware [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.287 182717 DEBUG nova.virt.hardware [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.287 182717 DEBUG nova.virt.hardware [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.287 182717 DEBUG nova.virt.hardware [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.287 182717 DEBUG nova.objects.instance [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.353 182717 DEBUG nova.virt.libvirt.vif [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:15:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-343289399',display_name='tempest-ServerRescueNegativeTestJSON-server-343289399',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-343289399',id=133,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:16:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4c6e66779ffe440d9c3270f0328391fb',ramdisk_id='',reservation_id='r-570w62iy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1986679883',owner_user_name='tempest-ServerRescueNegativeTestJSON-1986679883-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:16:17Z,user_data=None,user_id='c26ff016fcfc4e08803feb0e96005a8e',uuid=5ba5bafe-ee5b-48f6-aa2f-653708f71f55,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "vif_mac": "fa:16:3e:9f:0e:3f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.354 182717 DEBUG nova.network.os_vif_util [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converting VIF {"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "vif_mac": "fa:16:3e:9f:0e:3f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.355 182717 DEBUG nova.network.os_vif_util [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:0e:3f,bridge_name='br-int',has_traffic_filtering=True,id=53f3c575-ccc8-4fde-a256-9598ddf6cdaf,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53f3c575-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.357 182717 DEBUG nova.objects.instance [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.379 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:16:40 compute-1 nova_compute[182713]:   <uuid>5ba5bafe-ee5b-48f6-aa2f-653708f71f55</uuid>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   <name>instance-00000085</name>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <nova:name>tempest-ServerRescueNegativeTestJSON-server-343289399</nova:name>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:16:40</nova:creationTime>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:16:40 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:16:40 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:16:40 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:16:40 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:16:40 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:16:40 compute-1 nova_compute[182713]:         <nova:user uuid="c26ff016fcfc4e08803feb0e96005a8e">tempest-ServerRescueNegativeTestJSON-1986679883-project-member</nova:user>
Jan 22 00:16:40 compute-1 nova_compute[182713]:         <nova:project uuid="4c6e66779ffe440d9c3270f0328391fb">tempest-ServerRescueNegativeTestJSON-1986679883</nova:project>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:16:40 compute-1 nova_compute[182713]:         <nova:port uuid="53f3c575-ccc8-4fde-a256-9598ddf6cdaf">
Jan 22 00:16:40 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <system>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <entry name="serial">5ba5bafe-ee5b-48f6-aa2f-653708f71f55</entry>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <entry name="uuid">5ba5bafe-ee5b-48f6-aa2f-653708f71f55</entry>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     </system>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   <os>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   </os>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   <features>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   </features>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.rescue"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <target dev="vdb" bus="virtio"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.config.rescue"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:9f:0e:3f"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <target dev="tap53f3c575-cc"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/console.log" append="off"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <video>
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     </video>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:16:40 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:16:40 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:16:40 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:16:40 compute-1 nova_compute[182713]: </domain>
Jan 22 00:16:40 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.388 182717 INFO nova.virt.libvirt.driver [-] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Instance destroyed successfully.
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.446 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.446 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.447 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.447 182717 DEBUG nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] No VIF found with MAC fa:16:3e:9f:0e:3f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.448 182717 INFO nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Using config drive
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.465 182717 DEBUG nova.objects.instance [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'ec2_ids' on Instance uuid 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:40 compute-1 nova_compute[182713]: 2026-01-22 00:16:40.511 182717 DEBUG nova.objects.instance [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'keypairs' on Instance uuid 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:41 compute-1 nova_compute[182713]: 2026-01-22 00:16:41.187 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040986.186485, 074fd360-328c-4903-a368-d3890c4a1075 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:16:41 compute-1 nova_compute[182713]: 2026-01-22 00:16:41.188 182717 INFO nova.compute.manager [-] [instance: 074fd360-328c-4903-a368-d3890c4a1075] VM Stopped (Lifecycle Event)
Jan 22 00:16:41 compute-1 nova_compute[182713]: 2026-01-22 00:16:41.226 182717 DEBUG nova.compute.manager [None req-c5653338-51d6-4a2c-a88c-7a0145f4f90e - - - - - -] [instance: 074fd360-328c-4903-a368-d3890c4a1075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:41 compute-1 nova_compute[182713]: 2026-01-22 00:16:41.235 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.387 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.388 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:16:41 compute-1 nova_compute[182713]: 2026-01-22 00:16:41.393 182717 INFO nova.virt.libvirt.driver [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Creating config drive at /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.config.rescue
Jan 22 00:16:41 compute-1 nova_compute[182713]: 2026-01-22 00:16:41.403 182717 DEBUG oslo_concurrency.processutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp63dc8aft execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:41 compute-1 nova_compute[182713]: 2026-01-22 00:16:41.432 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:41 compute-1 nova_compute[182713]: 2026-01-22 00:16:41.548 182717 DEBUG oslo_concurrency.processutils [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp63dc8aft" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:41 compute-1 kernel: tap53f3c575-cc: entered promiscuous mode
Jan 22 00:16:41 compute-1 systemd-udevd[231780]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:16:41 compute-1 NetworkManager[54952]: <info>  [1769041001.6482] manager: (tap53f3c575-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/242)
Jan 22 00:16:41 compute-1 ovn_controller[94841]: 2026-01-22T00:16:41Z|00536|binding|INFO|Claiming lport 53f3c575-ccc8-4fde-a256-9598ddf6cdaf for this chassis.
Jan 22 00:16:41 compute-1 ovn_controller[94841]: 2026-01-22T00:16:41Z|00537|binding|INFO|53f3c575-ccc8-4fde-a256-9598ddf6cdaf: Claiming fa:16:3e:9f:0e:3f 10.100.0.12
Jan 22 00:16:41 compute-1 nova_compute[182713]: 2026-01-22 00:16:41.647 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:41 compute-1 NetworkManager[54952]: <info>  [1769041001.6576] device (tap53f3c575-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:16:41 compute-1 NetworkManager[54952]: <info>  [1769041001.6593] device (tap53f3c575-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.663 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:0e:3f 10.100.0.12'], port_security=['fa:16:3e:9f:0e:3f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55594f65-206f-4b2a-a4ed-c049861ef480', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4c6e66779ffe440d9c3270f0328391fb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99a05b18-dc2e-46bb-b7ee-4bfce96057f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f515f91a-3ddc-47bf-8aaf-753c19e78de1, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=53f3c575-ccc8-4fde-a256-9598ddf6cdaf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.666 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 53f3c575-ccc8-4fde-a256-9598ddf6cdaf in datapath 55594f65-206f-4b2a-a4ed-c049861ef480 bound to our chassis
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.668 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55594f65-206f-4b2a-a4ed-c049861ef480
Jan 22 00:16:41 compute-1 ovn_controller[94841]: 2026-01-22T00:16:41Z|00538|binding|INFO|Setting lport 53f3c575-ccc8-4fde-a256-9598ddf6cdaf ovn-installed in OVS
Jan 22 00:16:41 compute-1 ovn_controller[94841]: 2026-01-22T00:16:41Z|00539|binding|INFO|Setting lport 53f3c575-ccc8-4fde-a256-9598ddf6cdaf up in Southbound
Jan 22 00:16:41 compute-1 nova_compute[182713]: 2026-01-22 00:16:41.674 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:41 compute-1 nova_compute[182713]: 2026-01-22 00:16:41.676 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.684 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a5688928-4c1a-46a6-b1df-e6bf56579c34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.685 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap55594f65-21 in ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.689 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap55594f65-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.689 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[76f0e7ba-48d6-4e58-b4f2-648c0658d36a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.690 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6c4d837b-608d-45b6-aa04-2c57e6c63488]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.709 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[96ed56f4-1953-4192-b47a-6ba8218006cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 systemd-machined[153970]: New machine qemu-60-instance-00000085.
Jan 22 00:16:41 compute-1 systemd[1]: Started Virtual Machine qemu-60-instance-00000085.
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.746 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad85296-5e28-4fc6-9db8-c8f4a4d32861]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.795 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ba7318c0-7883-4825-bff0-0ed176322fea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 NetworkManager[54952]: <info>  [1769041001.8072] manager: (tap55594f65-20): new Veth device (/org/freedesktop/NetworkManager/Devices/243)
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.807 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bee7fb56-7074-4c09-8088-5d6db7c6f6eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.854 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[89b269ff-54a5-4b8a-b8da-8fb81cb9261d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.859 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[fe84da04-4d34-459f-b21a-89c5e6c9c7e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 NetworkManager[54952]: <info>  [1769041001.8905] device (tap55594f65-20): carrier: link connected
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.900 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[92ec3d9d-a16e-4918-97fb-54c32a1ba868]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.930 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bda14403-6d3a-4631-b09d-f6d8028d9417]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55594f65-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:fe:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 153], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557454, 'reachable_time': 17967, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231955, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.953 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[34d94e50-837e-4bd7-bdc6-e82edbfcd19e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb9:fea1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 557454, 'tstamp': 557454}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231956, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:41.976 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[028e3ab9-0639-47b7-baf7-108ffcd297bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55594f65-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:fe:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 153], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557454, 'reachable_time': 17967, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231957, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:42.020 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[33ce3518-2772-4b70-b1d6-b3781b33aff6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.069 182717 DEBUG nova.compute.manager [req-a2725f87-f8c2-42bd-bff3-4392dad09777 req-f8fd3de8-63e7-4d9d-b455-38bfd7ff478e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received event network-vif-unplugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.070 182717 DEBUG oslo_concurrency.lockutils [req-a2725f87-f8c2-42bd-bff3-4392dad09777 req-f8fd3de8-63e7-4d9d-b455-38bfd7ff478e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.071 182717 DEBUG oslo_concurrency.lockutils [req-a2725f87-f8c2-42bd-bff3-4392dad09777 req-f8fd3de8-63e7-4d9d-b455-38bfd7ff478e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.071 182717 DEBUG oslo_concurrency.lockutils [req-a2725f87-f8c2-42bd-bff3-4392dad09777 req-f8fd3de8-63e7-4d9d-b455-38bfd7ff478e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.072 182717 DEBUG nova.compute.manager [req-a2725f87-f8c2-42bd-bff3-4392dad09777 req-f8fd3de8-63e7-4d9d-b455-38bfd7ff478e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] No waiting events found dispatching network-vif-unplugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.072 182717 WARNING nova.compute.manager [req-a2725f87-f8c2-42bd-bff3-4392dad09777 req-f8fd3de8-63e7-4d9d-b455-38bfd7ff478e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received unexpected event network-vif-unplugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf for instance with vm_state active and task_state rescuing.
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:42.127 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[25d4c926-1382-4096-ba07-584f0de3d832]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:42.130 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55594f65-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:42.130 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:42.132 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55594f65-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:42 compute-1 NetworkManager[54952]: <info>  [1769041002.1366] manager: (tap55594f65-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Jan 22 00:16:42 compute-1 kernel: tap55594f65-20: entered promiscuous mode
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.135 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:42.139 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55594f65-20, col_values=(('external_ids', {'iface-id': 'c516d686-0754-486d-a980-7442f4c88088'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:42 compute-1 ovn_controller[94841]: 2026-01-22T00:16:42Z|00540|binding|INFO|Releasing lport c516d686-0754-486d-a980-7442f4c88088 from this chassis (sb_readonly=0)
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.165 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:42.168 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55594f65-206f-4b2a-a4ed-c049861ef480.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55594f65-206f-4b2a-a4ed-c049861ef480.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:42.169 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bfbed415-fc7a-4664-b6a0-471d6bb485d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:42.170 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-55594f65-206f-4b2a-a4ed-c049861ef480
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/55594f65-206f-4b2a-a4ed-c049861ef480.pid.haproxy
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 55594f65-206f-4b2a-a4ed-c049861ef480
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:16:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:42.174 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'env', 'PROCESS_TAG=haproxy-55594f65-206f-4b2a-a4ed-c049861ef480', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/55594f65-206f-4b2a-a4ed-c049861ef480.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.472 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Removed pending event for 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.472 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041002.469955, 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.473 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] VM Resumed (Lifecycle Event)
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.490 182717 DEBUG nova.compute.manager [None req-cad6ea55-40fb-4283-a434-81d6ea41727f c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.501 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.505 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.552 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.553 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041002.471492, 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.554 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] VM Started (Lifecycle Event)
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.589 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.594 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:16:42 compute-1 podman[231995]: 2026-01-22 00:16:42.616685931 +0000 UTC m=+0.075995427 container create e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:16:42 compute-1 systemd[1]: Started libpod-conmon-e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6.scope.
Jan 22 00:16:42 compute-1 podman[231995]: 2026-01-22 00:16:42.57207884 +0000 UTC m=+0.031388366 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:16:42 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:16:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cdfe0c70a0038bfea30c904cbabed42ecf2da73b2fd0d3f34be98aaf8d3dd29/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:16:42 compute-1 podman[231995]: 2026-01-22 00:16:42.730031144 +0000 UTC m=+0.189340680 container init e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 00:16:42 compute-1 podman[231995]: 2026-01-22 00:16:42.741114805 +0000 UTC m=+0.200424291 container start e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:16:42 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232009]: [NOTICE]   (232013) : New worker (232015) forked
Jan 22 00:16:42 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232009]: [NOTICE]   (232013) : Loading success.
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:42 compute-1 nova_compute[182713]: 2026-01-22 00:16:42.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:43 compute-1 nova_compute[182713]: 2026-01-22 00:16:43.397 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.241 182717 DEBUG nova.compute.manager [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.242 182717 DEBUG oslo_concurrency.lockutils [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.243 182717 DEBUG oslo_concurrency.lockutils [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.243 182717 DEBUG oslo_concurrency.lockutils [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.244 182717 DEBUG nova.compute.manager [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] No waiting events found dispatching network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.244 182717 WARNING nova.compute.manager [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received unexpected event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf for instance with vm_state rescued and task_state None.
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.245 182717 DEBUG nova.compute.manager [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.245 182717 DEBUG oslo_concurrency.lockutils [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.246 182717 DEBUG oslo_concurrency.lockutils [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.246 182717 DEBUG oslo_concurrency.lockutils [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.247 182717 DEBUG nova.compute.manager [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] No waiting events found dispatching network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:44 compute-1 nova_compute[182713]: 2026-01-22 00:16:44.247 182717 WARNING nova.compute.manager [req-4cd7571d-36d5-4838-9a54-b4938a3aa137 req-19a4824b-65e0-4b89-901c-153244e13874 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received unexpected event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf for instance with vm_state rescued and task_state None.
Jan 22 00:16:44 compute-1 podman[232024]: 2026-01-22 00:16:44.56948612 +0000 UTC m=+0.064212614 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 22 00:16:44 compute-1 podman[232025]: 2026-01-22 00:16:44.584167382 +0000 UTC m=+0.069864959 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:16:45 compute-1 nova_compute[182713]: 2026-01-22 00:16:45.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:45 compute-1 nova_compute[182713]: 2026-01-22 00:16:45.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:16:46 compute-1 nova_compute[182713]: 2026-01-22 00:16:46.238 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:46 compute-1 nova_compute[182713]: 2026-01-22 00:16:46.509 182717 DEBUG nova.compute.manager [req-08270255-0a09-4b44-b56b-c368cdc35405 req-6cfd30c3-99db-463d-b876-712f55ee8a05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:46 compute-1 nova_compute[182713]: 2026-01-22 00:16:46.510 182717 DEBUG oslo_concurrency.lockutils [req-08270255-0a09-4b44-b56b-c368cdc35405 req-6cfd30c3-99db-463d-b876-712f55ee8a05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:46 compute-1 nova_compute[182713]: 2026-01-22 00:16:46.510 182717 DEBUG oslo_concurrency.lockutils [req-08270255-0a09-4b44-b56b-c368cdc35405 req-6cfd30c3-99db-463d-b876-712f55ee8a05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:46 compute-1 nova_compute[182713]: 2026-01-22 00:16:46.511 182717 DEBUG oslo_concurrency.lockutils [req-08270255-0a09-4b44-b56b-c368cdc35405 req-6cfd30c3-99db-463d-b876-712f55ee8a05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:46 compute-1 nova_compute[182713]: 2026-01-22 00:16:46.512 182717 DEBUG nova.compute.manager [req-08270255-0a09-4b44-b56b-c368cdc35405 req-6cfd30c3-99db-463d-b876-712f55ee8a05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] No waiting events found dispatching network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:46 compute-1 nova_compute[182713]: 2026-01-22 00:16:46.512 182717 WARNING nova.compute.manager [req-08270255-0a09-4b44-b56b-c368cdc35405 req-6cfd30c3-99db-463d-b876-712f55ee8a05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received unexpected event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf for instance with vm_state rescued and task_state None.
Jan 22 00:16:46 compute-1 nova_compute[182713]: 2026-01-22 00:16:46.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:46 compute-1 nova_compute[182713]: 2026-01-22 00:16:46.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:47 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:47.391 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:47 compute-1 nova_compute[182713]: 2026-01-22 00:16:47.621 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:47 compute-1 nova_compute[182713]: 2026-01-22 00:16:47.621 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:47 compute-1 nova_compute[182713]: 2026-01-22 00:16:47.622 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:47 compute-1 nova_compute[182713]: 2026-01-22 00:16:47.622 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:16:47 compute-1 nova_compute[182713]: 2026-01-22 00:16:47.805 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:47 compute-1 nova_compute[182713]: 2026-01-22 00:16:47.884 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.rescue --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:47 compute-1 nova_compute[182713]: 2026-01-22 00:16:47.885 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:47 compute-1 nova_compute[182713]: 2026-01-22 00:16:47.955 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk.rescue --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:47 compute-1 nova_compute[182713]: 2026-01-22 00:16:47.956 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.023 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.024 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.114 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.336 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.337 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5565MB free_disk=73.23107528686523GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.338 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.338 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.401 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.623 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.624 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.624 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:16:48 compute-1 nova_compute[182713]: 2026-01-22 00:16:48.686 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:16:49 compute-1 nova_compute[182713]: 2026-01-22 00:16:49.685 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:16:50 compute-1 nova_compute[182713]: 2026-01-22 00:16:50.597 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:16:50 compute-1 nova_compute[182713]: 2026-01-22 00:16:50.597 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:51 compute-1 nova_compute[182713]: 2026-01-22 00:16:51.271 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:51 compute-1 nova_compute[182713]: 2026-01-22 00:16:51.593 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:51 compute-1 nova_compute[182713]: 2026-01-22 00:16:51.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:52 compute-1 nova_compute[182713]: 2026-01-22 00:16:52.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:52 compute-1 nova_compute[182713]: 2026-01-22 00:16:52.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:16:53 compute-1 nova_compute[182713]: 2026-01-22 00:16:53.318 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:16:53 compute-1 nova_compute[182713]: 2026-01-22 00:16:53.319 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:53 compute-1 nova_compute[182713]: 2026-01-22 00:16:53.451 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:55 compute-1 ovn_controller[94841]: 2026-01-22T00:16:55Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:0e:3f 10.100.0.12
Jan 22 00:16:55 compute-1 podman[232092]: 2026-01-22 00:16:55.60373579 +0000 UTC m=+0.095532357 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.201 182717 DEBUG oslo_concurrency.lockutils [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.202 182717 DEBUG oslo_concurrency.lockutils [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.202 182717 DEBUG oslo_concurrency.lockutils [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.202 182717 DEBUG oslo_concurrency.lockutils [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.202 182717 DEBUG oslo_concurrency.lockutils [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.273 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.421 182717 INFO nova.compute.manager [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Terminating instance
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.439 182717 DEBUG nova.compute.manager [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:16:56 compute-1 kernel: tap53f3c575-cc (unregistering): left promiscuous mode
Jan 22 00:16:56 compute-1 NetworkManager[54952]: <info>  [1769041016.4695] device (tap53f3c575-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:16:56 compute-1 ovn_controller[94841]: 2026-01-22T00:16:56Z|00541|binding|INFO|Releasing lport 53f3c575-ccc8-4fde-a256-9598ddf6cdaf from this chassis (sb_readonly=0)
Jan 22 00:16:56 compute-1 ovn_controller[94841]: 2026-01-22T00:16:56Z|00542|binding|INFO|Setting lport 53f3c575-ccc8-4fde-a256-9598ddf6cdaf down in Southbound
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.478 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:56 compute-1 ovn_controller[94841]: 2026-01-22T00:16:56Z|00543|binding|INFO|Removing iface tap53f3c575-cc ovn-installed in OVS
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.482 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.499 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.506 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:0e:3f 10.100.0.12'], port_security=['fa:16:3e:9f:0e:3f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5ba5bafe-ee5b-48f6-aa2f-653708f71f55', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55594f65-206f-4b2a-a4ed-c049861ef480', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4c6e66779ffe440d9c3270f0328391fb', 'neutron:revision_number': '6', 'neutron:security_group_ids': '99a05b18-dc2e-46bb-b7ee-4bfce96057f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f515f91a-3ddc-47bf-8aaf-753c19e78de1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=53f3c575-ccc8-4fde-a256-9598ddf6cdaf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.507 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 53f3c575-ccc8-4fde-a256-9598ddf6cdaf in datapath 55594f65-206f-4b2a-a4ed-c049861ef480 unbound from our chassis
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.509 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55594f65-206f-4b2a-a4ed-c049861ef480, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.511 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[13d990f4-1f3b-4d34-b8a0-e5d961966696]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.512 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 namespace which is not needed anymore
Jan 22 00:16:56 compute-1 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000085.scope: Deactivated successfully.
Jan 22 00:16:56 compute-1 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000085.scope: Consumed 12.802s CPU time.
Jan 22 00:16:56 compute-1 systemd-machined[153970]: Machine qemu-60-instance-00000085 terminated.
Jan 22 00:16:56 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232009]: [NOTICE]   (232013) : haproxy version is 2.8.14-c23fe91
Jan 22 00:16:56 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232009]: [NOTICE]   (232013) : path to executable is /usr/sbin/haproxy
Jan 22 00:16:56 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232009]: [WARNING]  (232013) : Exiting Master process...
Jan 22 00:16:56 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232009]: [WARNING]  (232013) : Exiting Master process...
Jan 22 00:16:56 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232009]: [ALERT]    (232013) : Current worker (232015) exited with code 143 (Terminated)
Jan 22 00:16:56 compute-1 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232009]: [WARNING]  (232013) : All workers exited. Exiting... (0)
Jan 22 00:16:56 compute-1 systemd[1]: libpod-e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6.scope: Deactivated successfully.
Jan 22 00:16:56 compute-1 podman[232136]: 2026-01-22 00:16:56.683671812 +0000 UTC m=+0.058480159 container died e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:16:56 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6-userdata-shm.mount: Deactivated successfully.
Jan 22 00:16:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-1cdfe0c70a0038bfea30c904cbabed42ecf2da73b2fd0d3f34be98aaf8d3dd29-merged.mount: Deactivated successfully.
Jan 22 00:16:56 compute-1 podman[232136]: 2026-01-22 00:16:56.729080618 +0000 UTC m=+0.103888945 container cleanup e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.746 182717 INFO nova.virt.libvirt.driver [-] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Instance destroyed successfully.
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.748 182717 DEBUG nova.objects.instance [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'resources' on Instance uuid 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:56 compute-1 systemd[1]: libpod-conmon-e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6.scope: Deactivated successfully.
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.796 182717 DEBUG nova.virt.libvirt.vif [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:15:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-343289399',display_name='tempest-ServerRescueNegativeTestJSON-server-343289399',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-343289399',id=133,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:16:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4c6e66779ffe440d9c3270f0328391fb',ramdisk_id='',reservation_id='r-570w62iy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1986679883',owner_user_name='tempest-ServerRescueNegativeTestJSON-1986679883-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:16:42Z,user_data=None,user_id='c26ff016fcfc4e08803feb0e96005a8e',uuid=5ba5bafe-ee5b-48f6-aa2f-653708f71f55,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:16:56 compute-1 podman[232186]: 2026-01-22 00:16:56.796956464 +0000 UTC m=+0.042621821 container remove e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.797 182717 DEBUG nova.network.os_vif_util [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converting VIF {"id": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "address": "fa:16:3e:9f:0e:3f", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53f3c575-cc", "ovs_interfaceid": "53f3c575-ccc8-4fde-a256-9598ddf6cdaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.799 182717 DEBUG nova.network.os_vif_util [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:0e:3f,bridge_name='br-int',has_traffic_filtering=True,id=53f3c575-ccc8-4fde-a256-9598ddf6cdaf,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53f3c575-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.799 182717 DEBUG os_vif [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:0e:3f,bridge_name='br-int',has_traffic_filtering=True,id=53f3c575-ccc8-4fde-a256-9598ddf6cdaf,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53f3c575-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.801 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.802 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap53f3c575-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.801 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ea873e95-84ff-45af-81a5-1cdf3169dd3d]: (4, ('Thu Jan 22 12:16:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 (e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6)\ne27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6\nThu Jan 22 12:16:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 (e27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6)\ne27ed4040be34efc1bc9083c381209e97d3c2bdcd1bca7d3e59b63b41e743ae6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.803 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.803 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b3800e76-8075-4dd7-b11c-ca19c576952d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.806 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.807 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55594f65-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.809 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:56 compute-1 kernel: tap55594f65-20: left promiscuous mode
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.813 182717 INFO os_vif [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:0e:3f,bridge_name='br-int',has_traffic_filtering=True,id=53f3c575-ccc8-4fde-a256-9598ddf6cdaf,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53f3c575-cc')
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.813 182717 INFO nova.virt.libvirt.driver [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Deleting instance files /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55_del
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.814 182717 INFO nova.virt.libvirt.driver [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Deletion of /var/lib/nova/instances/5ba5bafe-ee5b-48f6-aa2f-653708f71f55_del complete
Jan 22 00:16:56 compute-1 nova_compute[182713]: 2026-01-22 00:16:56.821 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.827 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bd2c21dc-f642-42ce-91f3-cb5e671f25f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.841 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb2203f-02d4-46c5-a661-8652c5a90b98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.842 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[eba002b3-2713-4fda-9b6e-1a4d5372668b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.856 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c91ee3-51c6-4365-a1b5-5356b2c4371e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557443, 'reachable_time': 16815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232203, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:56 compute-1 systemd[1]: run-netns-ovnmeta\x2d55594f65\x2d206f\x2d4b2a\x2da4ed\x2dc049861ef480.mount: Deactivated successfully.
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.860 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:16:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:16:56.860 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[0e816b3c-6787-4378-9900-5e93a8f5492a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:16:57 compute-1 nova_compute[182713]: 2026-01-22 00:16:57.170 182717 INFO nova.compute.manager [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Took 0.73 seconds to destroy the instance on the hypervisor.
Jan 22 00:16:57 compute-1 nova_compute[182713]: 2026-01-22 00:16:57.171 182717 DEBUG oslo.service.loopingcall [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:16:57 compute-1 nova_compute[182713]: 2026-01-22 00:16:57.171 182717 DEBUG nova.compute.manager [-] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:16:57 compute-1 nova_compute[182713]: 2026-01-22 00:16:57.172 182717 DEBUG nova.network.neutron [-] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:16:57 compute-1 nova_compute[182713]: 2026-01-22 00:16:57.727 182717 DEBUG nova.compute.manager [req-30a01430-c8bc-4106-a651-ae9fadc0329c req-a65d801a-049e-4460-8192-7cf2cb7be2a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received event network-vif-unplugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:16:57 compute-1 nova_compute[182713]: 2026-01-22 00:16:57.728 182717 DEBUG oslo_concurrency.lockutils [req-30a01430-c8bc-4106-a651-ae9fadc0329c req-a65d801a-049e-4460-8192-7cf2cb7be2a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:57 compute-1 nova_compute[182713]: 2026-01-22 00:16:57.729 182717 DEBUG oslo_concurrency.lockutils [req-30a01430-c8bc-4106-a651-ae9fadc0329c req-a65d801a-049e-4460-8192-7cf2cb7be2a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:57 compute-1 nova_compute[182713]: 2026-01-22 00:16:57.729 182717 DEBUG oslo_concurrency.lockutils [req-30a01430-c8bc-4106-a651-ae9fadc0329c req-a65d801a-049e-4460-8192-7cf2cb7be2a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:57 compute-1 nova_compute[182713]: 2026-01-22 00:16:57.730 182717 DEBUG nova.compute.manager [req-30a01430-c8bc-4106-a651-ae9fadc0329c req-a65d801a-049e-4460-8192-7cf2cb7be2a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] No waiting events found dispatching network-vif-unplugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:16:57 compute-1 nova_compute[182713]: 2026-01-22 00:16:57.730 182717 DEBUG nova.compute.manager [req-30a01430-c8bc-4106-a651-ae9fadc0329c req-a65d801a-049e-4460-8192-7cf2cb7be2a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received event network-vif-unplugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:16:58 compute-1 nova_compute[182713]: 2026-01-22 00:16:58.453 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:58 compute-1 podman[232204]: 2026-01-22 00:16:58.615379982 +0000 UTC m=+0.100264602 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, architecture=x86_64, config_id=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 00:17:00 compute-1 nova_compute[182713]: 2026-01-22 00:17:00.005 182717 DEBUG nova.compute.manager [req-5e582034-99a0-4d6e-a1da-25b30e35f35b req-45140faa-b0ab-48cd-9265-099c2073bf59 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:00 compute-1 nova_compute[182713]: 2026-01-22 00:17:00.006 182717 DEBUG oslo_concurrency.lockutils [req-5e582034-99a0-4d6e-a1da-25b30e35f35b req-45140faa-b0ab-48cd-9265-099c2073bf59 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:00 compute-1 nova_compute[182713]: 2026-01-22 00:17:00.006 182717 DEBUG oslo_concurrency.lockutils [req-5e582034-99a0-4d6e-a1da-25b30e35f35b req-45140faa-b0ab-48cd-9265-099c2073bf59 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:00 compute-1 nova_compute[182713]: 2026-01-22 00:17:00.006 182717 DEBUG oslo_concurrency.lockutils [req-5e582034-99a0-4d6e-a1da-25b30e35f35b req-45140faa-b0ab-48cd-9265-099c2073bf59 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:00 compute-1 nova_compute[182713]: 2026-01-22 00:17:00.007 182717 DEBUG nova.compute.manager [req-5e582034-99a0-4d6e-a1da-25b30e35f35b req-45140faa-b0ab-48cd-9265-099c2073bf59 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] No waiting events found dispatching network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:17:00 compute-1 nova_compute[182713]: 2026-01-22 00:17:00.007 182717 WARNING nova.compute.manager [req-5e582034-99a0-4d6e-a1da-25b30e35f35b req-45140faa-b0ab-48cd-9265-099c2073bf59 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received unexpected event network-vif-plugged-53f3c575-ccc8-4fde-a256-9598ddf6cdaf for instance with vm_state rescued and task_state deleting.
Jan 22 00:17:01 compute-1 nova_compute[182713]: 2026-01-22 00:17:01.155 182717 DEBUG nova.network.neutron [-] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:01 compute-1 nova_compute[182713]: 2026-01-22 00:17:01.334 182717 INFO nova.compute.manager [-] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Took 4.16 seconds to deallocate network for instance.
Jan 22 00:17:01 compute-1 nova_compute[182713]: 2026-01-22 00:17:01.676 182717 DEBUG oslo_concurrency.lockutils [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:01 compute-1 nova_compute[182713]: 2026-01-22 00:17:01.677 182717 DEBUG oslo_concurrency.lockutils [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:01 compute-1 nova_compute[182713]: 2026-01-22 00:17:01.800 182717 DEBUG nova.compute.provider_tree [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:17:01 compute-1 nova_compute[182713]: 2026-01-22 00:17:01.804 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:01 compute-1 nova_compute[182713]: 2026-01-22 00:17:01.957 182717 DEBUG nova.scheduler.client.report [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:17:02 compute-1 nova_compute[182713]: 2026-01-22 00:17:02.001 182717 DEBUG oslo_concurrency.lockutils [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:02 compute-1 nova_compute[182713]: 2026-01-22 00:17:02.192 182717 INFO nova.scheduler.client.report [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Deleted allocations for instance 5ba5bafe-ee5b-48f6-aa2f-653708f71f55
Jan 22 00:17:02 compute-1 nova_compute[182713]: 2026-01-22 00:17:02.261 182717 DEBUG nova.compute.manager [req-f0f284a5-617f-4c06-a786-a5b1586b7406 req-1206f180-bda0-4cf6-893f-1ac264886abd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Received event network-vif-deleted-53f3c575-ccc8-4fde-a256-9598ddf6cdaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:02 compute-1 nova_compute[182713]: 2026-01-22 00:17:02.352 182717 DEBUG oslo_concurrency.lockutils [None req-72427610-b2b9-4171-a719-0c99b5380cb4 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "5ba5bafe-ee5b-48f6-aa2f-653708f71f55" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:03.024 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:03.025 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:03.025 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:03 compute-1 nova_compute[182713]: 2026-01-22 00:17:03.455 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:06 compute-1 nova_compute[182713]: 2026-01-22 00:17:06.806 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:08 compute-1 nova_compute[182713]: 2026-01-22 00:17:08.503 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:09 compute-1 podman[232226]: 2026-01-22 00:17:09.602123181 +0000 UTC m=+0.079819414 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:17:09 compute-1 podman[232225]: 2026-01-22 00:17:09.637924502 +0000 UTC m=+0.121319070 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.856 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.856 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.857 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.857 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.857 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.857 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.892 182717 DEBUG nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.893 182717 WARNING nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.893 182717 WARNING nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.893 182717 WARNING nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Unknown base file: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.893 182717 INFO nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Removable base files: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.894 182717 INFO nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.894 182717 INFO nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.894 182717 INFO nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.894 182717 DEBUG nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.895 182717 DEBUG nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 22 00:17:10 compute-1 nova_compute[182713]: 2026-01-22 00:17:10.895 182717 DEBUG nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 22 00:17:11 compute-1 nova_compute[182713]: 2026-01-22 00:17:11.743 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041016.7415082, 5ba5bafe-ee5b-48f6-aa2f-653708f71f55 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:11 compute-1 nova_compute[182713]: 2026-01-22 00:17:11.743 182717 INFO nova.compute.manager [-] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] VM Stopped (Lifecycle Event)
Jan 22 00:17:11 compute-1 nova_compute[182713]: 2026-01-22 00:17:11.775 182717 DEBUG nova.compute.manager [None req-de13fd30-e581-4436-97fd-a9825edad0e4 - - - - - -] [instance: 5ba5bafe-ee5b-48f6-aa2f-653708f71f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:11 compute-1 nova_compute[182713]: 2026-01-22 00:17:11.811 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:12 compute-1 nova_compute[182713]: 2026-01-22 00:17:12.918 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:13 compute-1 nova_compute[182713]: 2026-01-22 00:17:13.505 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:15 compute-1 podman[232274]: 2026-01-22 00:17:15.58809304 +0000 UTC m=+0.081153656 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:17:15 compute-1 podman[232273]: 2026-01-22 00:17:15.588431721 +0000 UTC m=+0.076900235 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:17:16 compute-1 nova_compute[182713]: 2026-01-22 00:17:16.814 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:18 compute-1 nova_compute[182713]: 2026-01-22 00:17:18.508 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:21 compute-1 nova_compute[182713]: 2026-01-22 00:17:21.817 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:23 compute-1 nova_compute[182713]: 2026-01-22 00:17:23.550 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:25 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:25.628 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:17:25 compute-1 nova_compute[182713]: 2026-01-22 00:17:25.628 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:25 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:25.629 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:17:26 compute-1 podman[232317]: 2026-01-22 00:17:26.589343627 +0000 UTC m=+0.082285441 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:17:26 compute-1 nova_compute[182713]: 2026-01-22 00:17:26.866 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:28 compute-1 nova_compute[182713]: 2026-01-22 00:17:28.552 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:29 compute-1 podman[232337]: 2026-01-22 00:17:29.636025158 +0000 UTC m=+0.122448645 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=openstack_network_exporter, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the 
minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git)
Jan 22 00:17:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:30.632 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:31 compute-1 nova_compute[182713]: 2026-01-22 00:17:31.867 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.129 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.130 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.156 182717 DEBUG nova.compute.manager [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.294 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.295 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.304 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.304 182717 INFO nova.compute.claims [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.444 182717 DEBUG nova.compute.provider_tree [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.469 182717 DEBUG nova.scheduler.client.report [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.533 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.534 182717 DEBUG nova.compute.manager [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.644 182717 DEBUG nova.compute.manager [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.645 182717 DEBUG nova.network.neutron [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.674 182717 INFO nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.698 182717 DEBUG nova.compute.manager [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.876 182717 DEBUG nova.compute.manager [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.879 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.879 182717 INFO nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Creating image(s)
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.881 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "/var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.881 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.883 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.910 182717 DEBUG nova.policy [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:17:32 compute-1 nova_compute[182713]: 2026-01-22 00:17:32.915 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.012 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.013 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.014 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.029 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.111 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.113 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.168 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.170 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.171 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.229 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.231 182717 DEBUG nova.virt.disk.api [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Checking if we can resize image /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.232 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.328 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.330 182717 DEBUG nova.virt.disk.api [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Cannot resize image /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.330 182717 DEBUG nova.objects.instance [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'migration_context' on Instance uuid 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.354 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.355 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Ensure instance console log exists: /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.356 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.356 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.357 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.594 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.812 182717 DEBUG nova.network.neutron [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Successfully created port: dfd91ce9-b3dc-46e0-8793-952181553915 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:17:33 compute-1 nova_compute[182713]: 2026-01-22 00:17:33.883 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:17:35 compute-1 nova_compute[182713]: 2026-01-22 00:17:35.218 182717 DEBUG nova.network.neutron [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Successfully updated port: dfd91ce9-b3dc-46e0-8793-952181553915 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:17:35 compute-1 nova_compute[182713]: 2026-01-22 00:17:35.241 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:35 compute-1 nova_compute[182713]: 2026-01-22 00:17:35.242 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquired lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:35 compute-1 nova_compute[182713]: 2026-01-22 00:17:35.242 182717 DEBUG nova.network.neutron [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:17:35 compute-1 nova_compute[182713]: 2026-01-22 00:17:35.476 182717 DEBUG nova.compute.manager [req-06094d8f-94a8-467d-b855-b2de1c152b83 req-f57a70f1-2921-4b24-9b29-a95ae64d9ee8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Received event network-changed-dfd91ce9-b3dc-46e0-8793-952181553915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:35 compute-1 nova_compute[182713]: 2026-01-22 00:17:35.477 182717 DEBUG nova.compute.manager [req-06094d8f-94a8-467d-b855-b2de1c152b83 req-f57a70f1-2921-4b24-9b29-a95ae64d9ee8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Refreshing instance network info cache due to event network-changed-dfd91ce9-b3dc-46e0-8793-952181553915. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:17:35 compute-1 nova_compute[182713]: 2026-01-22 00:17:35.477 182717 DEBUG oslo_concurrency.lockutils [req-06094d8f-94a8-467d-b855-b2de1c152b83 req-f57a70f1-2921-4b24-9b29-a95ae64d9ee8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:35 compute-1 nova_compute[182713]: 2026-01-22 00:17:35.565 182717 DEBUG nova.network.neutron [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:17:36 compute-1 nova_compute[182713]: 2026-01-22 00:17:36.913 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:36 compute-1 nova_compute[182713]: 2026-01-22 00:17:36.964 182717 DEBUG nova.network.neutron [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Updating instance_info_cache with network_info: [{"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:36 compute-1 nova_compute[182713]: 2026-01-22 00:17:36.989 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Releasing lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:36 compute-1 nova_compute[182713]: 2026-01-22 00:17:36.989 182717 DEBUG nova.compute.manager [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Instance network_info: |[{"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:17:36 compute-1 nova_compute[182713]: 2026-01-22 00:17:36.989 182717 DEBUG oslo_concurrency.lockutils [req-06094d8f-94a8-467d-b855-b2de1c152b83 req-f57a70f1-2921-4b24-9b29-a95ae64d9ee8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:36 compute-1 nova_compute[182713]: 2026-01-22 00:17:36.990 182717 DEBUG nova.network.neutron [req-06094d8f-94a8-467d-b855-b2de1c152b83 req-f57a70f1-2921-4b24-9b29-a95ae64d9ee8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Refreshing network info cache for port dfd91ce9-b3dc-46e0-8793-952181553915 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:17:36 compute-1 nova_compute[182713]: 2026-01-22 00:17:36.993 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Start _get_guest_xml network_info=[{"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:17:36 compute-1 nova_compute[182713]: 2026-01-22 00:17:36.998 182717 WARNING nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.006 182717 DEBUG nova.virt.libvirt.host [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.006 182717 DEBUG nova.virt.libvirt.host [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.010 182717 DEBUG nova.virt.libvirt.host [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.012 182717 DEBUG nova.virt.libvirt.host [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.015 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.015 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.016 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.017 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.017 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.018 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.018 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.019 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.020 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.020 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.021 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.022 182717 DEBUG nova.virt.hardware [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.029 182717 DEBUG nova.virt.libvirt.vif [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-12187943',display_name='tempest-₡-12187943',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--12187943',id=136,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-swmh30g2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,
updated_at=2026-01-22T00:17:32Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=7e1a8ec5-e1c0-4c11-aae2-15d84872d95c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.030 182717 DEBUG nova.network.os_vif_util [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.031 182717 DEBUG nova.network.os_vif_util [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:78:ae,bridge_name='br-int',has_traffic_filtering=True,id=dfd91ce9-b3dc-46e0-8793-952181553915,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfd91ce9-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.034 182717 DEBUG nova.objects.instance [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.057 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:17:37 compute-1 nova_compute[182713]:   <uuid>7e1a8ec5-e1c0-4c11-aae2-15d84872d95c</uuid>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   <name>instance-00000088</name>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <nova:name>tempest-₡-12187943</nova:name>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:17:36</nova:creationTime>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:17:37 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:17:37 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:17:37 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:17:37 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:17:37 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:17:37 compute-1 nova_compute[182713]:         <nova:user uuid="5eb4e81f0cef4003ae49faa67b3f17c3">tempest-ServersTestJSON-374007797-project-member</nova:user>
Jan 22 00:17:37 compute-1 nova_compute[182713]:         <nova:project uuid="3e408650207b498c8d115fd0c4f776dc">tempest-ServersTestJSON-374007797</nova:project>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:17:37 compute-1 nova_compute[182713]:         <nova:port uuid="dfd91ce9-b3dc-46e0-8793-952181553915">
Jan 22 00:17:37 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <system>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <entry name="serial">7e1a8ec5-e1c0-4c11-aae2-15d84872d95c</entry>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <entry name="uuid">7e1a8ec5-e1c0-4c11-aae2-15d84872d95c</entry>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     </system>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   <os>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   </os>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   <features>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   </features>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.config"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:0c:78:ae"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <target dev="tapdfd91ce9-b3"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/console.log" append="off"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <video>
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     </video>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:17:37 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:17:37 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:17:37 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:17:37 compute-1 nova_compute[182713]: </domain>
Jan 22 00:17:37 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.059 182717 DEBUG nova.compute.manager [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Preparing to wait for external event network-vif-plugged-dfd91ce9-b3dc-46e0-8793-952181553915 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.059 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.059 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.060 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.061 182717 DEBUG nova.virt.libvirt.vif [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-12187943',display_name='tempest-₡-12187943',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--12187943',id=136,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-swmh30g2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_c
erts=None,updated_at=2026-01-22T00:17:32Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=7e1a8ec5-e1c0-4c11-aae2-15d84872d95c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.061 182717 DEBUG nova.network.os_vif_util [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.062 182717 DEBUG nova.network.os_vif_util [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:78:ae,bridge_name='br-int',has_traffic_filtering=True,id=dfd91ce9-b3dc-46e0-8793-952181553915,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfd91ce9-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.062 182717 DEBUG os_vif [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:78:ae,bridge_name='br-int',has_traffic_filtering=True,id=dfd91ce9-b3dc-46e0-8793-952181553915,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfd91ce9-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.063 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.063 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.064 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.067 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.067 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdfd91ce9-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.068 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdfd91ce9-b3, col_values=(('external_ids', {'iface-id': 'dfd91ce9-b3dc-46e0-8793-952181553915', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0c:78:ae', 'vm-uuid': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.070 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:37 compute-1 NetworkManager[54952]: <info>  [1769041057.0720] manager: (tapdfd91ce9-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.072 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.079 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.082 182717 INFO os_vif [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:78:ae,bridge_name='br-int',has_traffic_filtering=True,id=dfd91ce9-b3dc-46e0-8793-952181553915,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfd91ce9-b3')
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.148 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.149 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.149 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No VIF found with MAC fa:16:3e:0c:78:ae, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:17:37 compute-1 nova_compute[182713]: 2026-01-22 00:17:37.150 182717 INFO nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Using config drive
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.137 182717 INFO nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Creating config drive at /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.config
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.143 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl00sah4s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.288 182717 DEBUG oslo_concurrency.processutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl00sah4s" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:38 compute-1 kernel: tapdfd91ce9-b3: entered promiscuous mode
Jan 22 00:17:38 compute-1 NetworkManager[54952]: <info>  [1769041058.3735] manager: (tapdfd91ce9-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/246)
Jan 22 00:17:38 compute-1 ovn_controller[94841]: 2026-01-22T00:17:38Z|00544|binding|INFO|Claiming lport dfd91ce9-b3dc-46e0-8793-952181553915 for this chassis.
Jan 22 00:17:38 compute-1 ovn_controller[94841]: 2026-01-22T00:17:38Z|00545|binding|INFO|dfd91ce9-b3dc-46e0-8793-952181553915: Claiming fa:16:3e:0c:78:ae 10.100.0.5
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.376 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.381 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.386 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.402 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:78:ae 10.100.0.5'], port_security=['fa:16:3e:0c:78:ae 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=dfd91ce9-b3dc-46e0-8793-952181553915) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.404 104184 INFO neutron.agent.ovn.metadata.agent [-] Port dfd91ce9-b3dc-46e0-8793-952181553915 in datapath aabf11c6-ef94-408a-8148-6c6400566606 bound to our chassis
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.406 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aabf11c6-ef94-408a-8148-6c6400566606
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.425 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6c924f-f6d1-47e0-82be-a1f3b4011513]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.427 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaabf11c6-e1 in ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:17:38 compute-1 systemd-udevd[232394]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:17:38 compute-1 systemd-machined[153970]: New machine qemu-61-instance-00000088.
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.436 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaabf11c6-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.436 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6eae3fef-18ea-4fb0-9ffc-bd69609aa549]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.439 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a84abfe3-aeea-4522-8a61-e7b684389f12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 NetworkManager[54952]: <info>  [1769041058.4479] device (tapdfd91ce9-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:17:38 compute-1 NetworkManager[54952]: <info>  [1769041058.4493] device (tapdfd91ce9-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.453 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[e50bf6fe-5859-4949-ae92-cc96a5eb9f03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.467 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:38 compute-1 systemd[1]: Started Virtual Machine qemu-61-instance-00000088.
Jan 22 00:17:38 compute-1 ovn_controller[94841]: 2026-01-22T00:17:38Z|00546|binding|INFO|Setting lport dfd91ce9-b3dc-46e0-8793-952181553915 ovn-installed in OVS
Jan 22 00:17:38 compute-1 ovn_controller[94841]: 2026-01-22T00:17:38Z|00547|binding|INFO|Setting lport dfd91ce9-b3dc-46e0-8793-952181553915 up in Southbound
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.473 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.475 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c0c210a7-8f2f-435a-a936-fb339d76662b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.516 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[390ed32f-64f1-4270-9406-a7247911c9ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.524 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[42b810aa-b84f-426f-bf02-8ea9d31136ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 NetworkManager[54952]: <info>  [1769041058.5262] manager: (tapaabf11c6-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/247)
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.568 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[89a39c92-b0f5-40a3-a024-5ca5e8e46005]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.572 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7c3f31e6-2a6d-4aa3-9fa5-dd15a03c3c38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.596 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:38 compute-1 NetworkManager[54952]: <info>  [1769041058.6028] device (tapaabf11c6-e0): carrier: link connected
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.607 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[6c866ba7-541c-4090-8768-33b230c9c7f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.623 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[791e06ab-01e3-4c63-8fbb-21568796ce15]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563125, 'reachable_time': 42828, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232427, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.636 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1c07ec27-74a7-4d87-a3f2-0888f54e53c7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:1b62'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 563125, 'tstamp': 563125}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232428, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.650 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e3db023d-ed08-4417-9906-35cc873cc2ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563125, 'reachable_time': 42828, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232429, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.681 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2a9756b8-5e2c-46d4-a1a3-a86984a403e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.724 182717 DEBUG nova.compute.manager [req-12f70d28-b2b8-45d2-85df-62eac768916b req-aa90893f-2f86-429f-85ea-5aff1ea0e7be 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Received event network-vif-plugged-dfd91ce9-b3dc-46e0-8793-952181553915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.725 182717 DEBUG oslo_concurrency.lockutils [req-12f70d28-b2b8-45d2-85df-62eac768916b req-aa90893f-2f86-429f-85ea-5aff1ea0e7be 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.726 182717 DEBUG oslo_concurrency.lockutils [req-12f70d28-b2b8-45d2-85df-62eac768916b req-aa90893f-2f86-429f-85ea-5aff1ea0e7be 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.727 182717 DEBUG oslo_concurrency.lockutils [req-12f70d28-b2b8-45d2-85df-62eac768916b req-aa90893f-2f86-429f-85ea-5aff1ea0e7be 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.727 182717 DEBUG nova.compute.manager [req-12f70d28-b2b8-45d2-85df-62eac768916b req-aa90893f-2f86-429f-85ea-5aff1ea0e7be 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Processing event network-vif-plugged-dfd91ce9-b3dc-46e0-8793-952181553915 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.738 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bea96938-9b38-4bcc-83c4-0bc9764bfbb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.740 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.740 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.741 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaabf11c6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.742 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:38 compute-1 NetworkManager[54952]: <info>  [1769041058.7434] manager: (tapaabf11c6-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Jan 22 00:17:38 compute-1 kernel: tapaabf11c6-e0: entered promiscuous mode
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.746 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaabf11c6-e0, col_values=(('external_ids', {'iface-id': '1ae0dbff-a7cd-4db8-afc3-1d102fdd130f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:38 compute-1 ovn_controller[94841]: 2026-01-22T00:17:38Z|00548|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.747 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.764 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.765 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.766 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[11bc3383-95be-4b29-80bd-015160de3e57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.766 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-aabf11c6-ef94-408a-8148-6c6400566606
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID aabf11c6-ef94-408a-8148-6c6400566606
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:17:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:38.767 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'env', 'PROCESS_TAG=haproxy-aabf11c6-ef94-408a-8148-6c6400566606', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aabf11c6-ef94-408a-8148-6c6400566606.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.801 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041058.801158, 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.802 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] VM Started (Lifecycle Event)
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.804 182717 DEBUG nova.compute.manager [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.807 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.810 182717 INFO nova.virt.libvirt.driver [-] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Instance spawned successfully.
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.811 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.827 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.830 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.842 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.842 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.843 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.843 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.843 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.844 182717 DEBUG nova.virt.libvirt.driver [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.868 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.869 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041058.8039515, 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.869 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] VM Paused (Lifecycle Event)
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.918 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.921 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041058.80695, 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.922 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] VM Resumed (Lifecycle Event)
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.955 182717 INFO nova.compute.manager [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Took 6.08 seconds to spawn the instance on the hypervisor.
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.955 182717 DEBUG nova.compute.manager [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.963 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:38 compute-1 nova_compute[182713]: 2026-01-22 00:17:38.971 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:17:39 compute-1 nova_compute[182713]: 2026-01-22 00:17:39.009 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:17:39 compute-1 nova_compute[182713]: 2026-01-22 00:17:39.035 182717 DEBUG nova.network.neutron [req-06094d8f-94a8-467d-b855-b2de1c152b83 req-f57a70f1-2921-4b24-9b29-a95ae64d9ee8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Updated VIF entry in instance network info cache for port dfd91ce9-b3dc-46e0-8793-952181553915. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:17:39 compute-1 nova_compute[182713]: 2026-01-22 00:17:39.036 182717 DEBUG nova.network.neutron [req-06094d8f-94a8-467d-b855-b2de1c152b83 req-f57a70f1-2921-4b24-9b29-a95ae64d9ee8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Updating instance_info_cache with network_info: [{"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:39 compute-1 nova_compute[182713]: 2026-01-22 00:17:39.051 182717 DEBUG oslo_concurrency.lockutils [req-06094d8f-94a8-467d-b855-b2de1c152b83 req-f57a70f1-2921-4b24-9b29-a95ae64d9ee8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:39 compute-1 nova_compute[182713]: 2026-01-22 00:17:39.061 182717 INFO nova.compute.manager [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Took 6.84 seconds to build instance.
Jan 22 00:17:39 compute-1 nova_compute[182713]: 2026-01-22 00:17:39.076 182717 DEBUG oslo_concurrency.lockutils [None req-b825dc05-d088-464d-b9c1-9e04f5387195 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.946s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:39 compute-1 podman[232465]: 2026-01-22 00:17:39.124096944 +0000 UTC m=+0.062275804 container create 43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:17:39 compute-1 systemd[1]: Started libpod-conmon-43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05.scope.
Jan 22 00:17:39 compute-1 podman[232465]: 2026-01-22 00:17:39.090148692 +0000 UTC m=+0.028327612 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:17:39 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:17:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a47b43f1c558439c22fd39f33b71a82348c9f13167eeb247c328420325617600/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:17:39 compute-1 podman[232465]: 2026-01-22 00:17:39.227187684 +0000 UTC m=+0.165366534 container init 43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:17:39 compute-1 podman[232465]: 2026-01-22 00:17:39.231939299 +0000 UTC m=+0.170118129 container start 43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:17:39 compute-1 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[232481]: [NOTICE]   (232485) : New worker (232487) forked
Jan 22 00:17:39 compute-1 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[232481]: [NOTICE]   (232485) : Loading success.
Jan 22 00:17:40 compute-1 podman[232497]: 2026-01-22 00:17:40.605134054 +0000 UTC m=+0.094053201 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:17:40 compute-1 podman[232496]: 2026-01-22 00:17:40.614969767 +0000 UTC m=+0.109351972 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:17:40 compute-1 nova_compute[182713]: 2026-01-22 00:17:40.888 182717 DEBUG nova.compute.manager [req-fe7b79ff-4b7d-4e67-8e60-931e422a60c8 req-854e8297-22c6-4a88-a857-eda205018fa7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Received event network-vif-plugged-dfd91ce9-b3dc-46e0-8793-952181553915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:40 compute-1 nova_compute[182713]: 2026-01-22 00:17:40.888 182717 DEBUG oslo_concurrency.lockutils [req-fe7b79ff-4b7d-4e67-8e60-931e422a60c8 req-854e8297-22c6-4a88-a857-eda205018fa7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:40 compute-1 nova_compute[182713]: 2026-01-22 00:17:40.889 182717 DEBUG oslo_concurrency.lockutils [req-fe7b79ff-4b7d-4e67-8e60-931e422a60c8 req-854e8297-22c6-4a88-a857-eda205018fa7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:40 compute-1 nova_compute[182713]: 2026-01-22 00:17:40.890 182717 DEBUG oslo_concurrency.lockutils [req-fe7b79ff-4b7d-4e67-8e60-931e422a60c8 req-854e8297-22c6-4a88-a857-eda205018fa7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:40 compute-1 nova_compute[182713]: 2026-01-22 00:17:40.890 182717 DEBUG nova.compute.manager [req-fe7b79ff-4b7d-4e67-8e60-931e422a60c8 req-854e8297-22c6-4a88-a857-eda205018fa7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] No waiting events found dispatching network-vif-plugged-dfd91ce9-b3dc-46e0-8793-952181553915 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:17:40 compute-1 nova_compute[182713]: 2026-01-22 00:17:40.891 182717 WARNING nova.compute.manager [req-fe7b79ff-4b7d-4e67-8e60-931e422a60c8 req-854e8297-22c6-4a88-a857-eda205018fa7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Received unexpected event network-vif-plugged-dfd91ce9-b3dc-46e0-8793-952181553915 for instance with vm_state active and task_state None.
Jan 22 00:17:42 compute-1 nova_compute[182713]: 2026-01-22 00:17:42.071 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:43 compute-1 nova_compute[182713]: 2026-01-22 00:17:43.599 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:43 compute-1 nova_compute[182713]: 2026-01-22 00:17:43.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:44 compute-1 nova_compute[182713]: 2026-01-22 00:17:44.874 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:44 compute-1 nova_compute[182713]: 2026-01-22 00:17:44.875 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:45 compute-1 nova_compute[182713]: 2026-01-22 00:17:45.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:45 compute-1 nova_compute[182713]: 2026-01-22 00:17:45.875 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:45 compute-1 nova_compute[182713]: 2026-01-22 00:17:45.875 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:17:46 compute-1 podman[232544]: 2026-01-22 00:17:46.569677126 +0000 UTC m=+0.064258526 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:17:46 compute-1 podman[232545]: 2026-01-22 00:17:46.589776354 +0000 UTC m=+0.074018227 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:17:47 compute-1 nova_compute[182713]: 2026-01-22 00:17:47.089 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.458 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.459 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.519 182717 DEBUG nova.compute.manager [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.635 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.659 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.659 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.669 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.669 182717 INFO nova.compute.claims [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.848 182717 DEBUG nova.compute.provider_tree [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.864 182717 DEBUG nova.scheduler.client.report [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.889 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.890 182717 DEBUG nova.compute.manager [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:48 compute-1 nova_compute[182713]: 2026-01-22 00:17:48.893 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.021 182717 DEBUG nova.compute.manager [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.022 182717 DEBUG nova.network.neutron [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.043 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.079 182717 INFO nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.103 182717 DEBUG nova.compute.manager [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.143 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.144 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.235 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.309 182717 DEBUG nova.policy [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.341 182717 DEBUG nova.compute.manager [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.343 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.343 182717 INFO nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Creating image(s)
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.344 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "/var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.344 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.345 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.363 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.460 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.462 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.463 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.481 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.540 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.541 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.591 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.593 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5536MB free_disk=73.25958251953125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.594 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.594 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.624 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk 1073741824" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.625 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.625 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.680 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.680 182717 DEBUG nova.virt.disk.api [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Checking if we can resize image /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.681 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.697 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.697 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.698 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.698 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.732 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.733 182717 DEBUG nova.virt.disk.api [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Cannot resize image /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.733 182717 DEBUG nova.objects.instance [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'migration_context' on Instance uuid 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.753 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.754 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Ensure instance console log exists: /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.755 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.756 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.756 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.764 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.788 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.820 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:17:49 compute-1 nova_compute[182713]: 2026-01-22 00:17:49.821 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:50 compute-1 ovn_controller[94841]: 2026-01-22T00:17:50Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0c:78:ae 10.100.0.5
Jan 22 00:17:50 compute-1 ovn_controller[94841]: 2026-01-22T00:17:50Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0c:78:ae 10.100.0.5
Jan 22 00:17:51 compute-1 nova_compute[182713]: 2026-01-22 00:17:51.625 182717 DEBUG nova.network.neutron [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Successfully created port: b3245964-feca-4b8b-b219-3a8b97cebae7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:17:52 compute-1 nova_compute[182713]: 2026-01-22 00:17:52.091 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:53 compute-1 nova_compute[182713]: 2026-01-22 00:17:53.639 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.258 182717 DEBUG nova.network.neutron [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Successfully updated port: b3245964-feca-4b8b-b219-3a8b97cebae7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.280 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.280 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquired lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.281 182717 DEBUG nova.network.neutron [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.566 182717 DEBUG nova.network.neutron [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.685 182717 DEBUG nova.compute.manager [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Received event network-changed-b3245964-feca-4b8b-b219-3a8b97cebae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.686 182717 DEBUG nova.compute.manager [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Refreshing instance network info cache due to event network-changed-b3245964-feca-4b8b-b219-3a8b97cebae7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.686 182717 DEBUG oslo_concurrency.lockutils [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.821 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.822 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:17:54 compute-1 nova_compute[182713]: 2026-01-22 00:17:54.887 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 00:17:55 compute-1 nova_compute[182713]: 2026-01-22 00:17:55.065 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:55 compute-1 nova_compute[182713]: 2026-01-22 00:17:55.066 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:55 compute-1 nova_compute[182713]: 2026-01-22 00:17:55.066 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:17:55 compute-1 nova_compute[182713]: 2026-01-22 00:17:55.066 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.147 182717 DEBUG nova.network.neutron [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Updating instance_info_cache with network_info: [{"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.225 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Releasing lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.226 182717 DEBUG nova.compute.manager [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Instance network_info: |[{"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.226 182717 DEBUG oslo_concurrency.lockutils [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.227 182717 DEBUG nova.network.neutron [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Refreshing network info cache for port b3245964-feca-4b8b-b219-3a8b97cebae7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.233 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Start _get_guest_xml network_info=[{"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.242 182717 WARNING nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.248 182717 DEBUG nova.virt.libvirt.host [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.249 182717 DEBUG nova.virt.libvirt.host [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.254 182717 DEBUG nova.virt.libvirt.host [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.255 182717 DEBUG nova.virt.libvirt.host [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.257 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.257 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.258 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.259 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.259 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.260 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.260 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.261 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.261 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.261 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.262 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.262 182717 DEBUG nova.virt.hardware [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.269 182717 DEBUG nova.virt.libvirt.vif [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=138,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEC2kFjZ2LLODTqt2mxgHy7rJjxFe8ZrHUXm20i9JwMLGCcZBnlNfhY7KIi4qZ5U0omwJeXWnBlNjC6qoRQOBCrKoweqP5wnKVI5nSTyIGfNTjQz5x3kG940iTmU7L06SA==',key_name='tempest-TestSecurityGroupsBasicOps-720505212',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-c2o0kvzh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:49Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=2ddf383f-fadd-4739-9a90-3db8bcb7cb2a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.269 182717 DEBUG nova.network.os_vif_util [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.271 182717 DEBUG nova.network.os_vif_util [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:49:00,bridge_name='br-int',has_traffic_filtering=True,id=b3245964-feca-4b8b-b219-3a8b97cebae7,network=Network(3a4c4578-e100-40e8-b037-1c4af043c44d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3245964-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.272 182717 DEBUG nova.objects.instance [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.311 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:17:56 compute-1 nova_compute[182713]:   <uuid>2ddf383f-fadd-4739-9a90-3db8bcb7cb2a</uuid>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   <name>instance-0000008a</name>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969</nova:name>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:17:56</nova:creationTime>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:17:56 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:17:56 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:17:56 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:17:56 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:17:56 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:17:56 compute-1 nova_compute[182713]:         <nova:user uuid="a60ce2b7b7ae47b484de12add551b287">tempest-TestSecurityGroupsBasicOps-1492736128-project-member</nova:user>
Jan 22 00:17:56 compute-1 nova_compute[182713]:         <nova:project uuid="02bcfc5f1f1044a3856e73a5938ff011">tempest-TestSecurityGroupsBasicOps-1492736128</nova:project>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:17:56 compute-1 nova_compute[182713]:         <nova:port uuid="b3245964-feca-4b8b-b219-3a8b97cebae7">
Jan 22 00:17:56 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <system>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <entry name="serial">2ddf383f-fadd-4739-9a90-3db8bcb7cb2a</entry>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <entry name="uuid">2ddf383f-fadd-4739-9a90-3db8bcb7cb2a</entry>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     </system>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   <os>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   </os>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   <features>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   </features>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.config"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:2c:49:00"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <target dev="tapb3245964-fe"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/console.log" append="off"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <video>
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     </video>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:17:56 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:17:56 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:17:56 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:17:56 compute-1 nova_compute[182713]: </domain>
Jan 22 00:17:56 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.313 182717 DEBUG nova.compute.manager [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Preparing to wait for external event network-vif-plugged-b3245964-feca-4b8b-b219-3a8b97cebae7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.314 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.314 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.315 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.316 182717 DEBUG nova.virt.libvirt.vif [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=138,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEC2kFjZ2LLODTqt2mxgHy7rJjxFe8ZrHUXm20i9JwMLGCcZBnlNfhY7KIi4qZ5U0omwJeXWnBlNjC6qoRQOBCrKoweqP5wnKVI5nSTyIGfNTjQz5x3kG940iTmU7L06SA==',key_name='tempest-TestSecurityGroupsBasicOps-720505212',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-c2o0kvzh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:49Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=2ddf383f-fadd-4739-9a90-3db8bcb7cb2a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.317 182717 DEBUG nova.network.os_vif_util [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.318 182717 DEBUG nova.network.os_vif_util [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:49:00,bridge_name='br-int',has_traffic_filtering=True,id=b3245964-feca-4b8b-b219-3a8b97cebae7,network=Network(3a4c4578-e100-40e8-b037-1c4af043c44d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3245964-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.319 182717 DEBUG os_vif [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:49:00,bridge_name='br-int',has_traffic_filtering=True,id=b3245964-feca-4b8b-b219-3a8b97cebae7,network=Network(3a4c4578-e100-40e8-b037-1c4af043c44d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3245964-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.320 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.320 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.321 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.330 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.331 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3245964-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.332 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb3245964-fe, col_values=(('external_ids', {'iface-id': 'b3245964-feca-4b8b-b219-3a8b97cebae7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2c:49:00', 'vm-uuid': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.365 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:56 compute-1 NetworkManager[54952]: <info>  [1769041076.3682] manager: (tapb3245964-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.369 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.377 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.378 182717 INFO os_vif [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:49:00,bridge_name='br-int',has_traffic_filtering=True,id=b3245964-feca-4b8b-b219-3a8b97cebae7,network=Network(3a4c4578-e100-40e8-b037-1c4af043c44d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3245964-fe')
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.631 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.632 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.632 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No VIF found with MAC fa:16:3e:2c:49:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:17:56 compute-1 nova_compute[182713]: 2026-01-22 00:17:56.633 182717 INFO nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Using config drive
Jan 22 00:17:57 compute-1 nova_compute[182713]: 2026-01-22 00:17:57.133 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Updating instance_info_cache with network_info: [{"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:57 compute-1 nova_compute[182713]: 2026-01-22 00:17:57.167 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:57 compute-1 nova_compute[182713]: 2026-01-22 00:17:57.168 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:17:57 compute-1 nova_compute[182713]: 2026-01-22 00:17:57.169 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:57 compute-1 nova_compute[182713]: 2026-01-22 00:17:57.169 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:17:57 compute-1 podman[232628]: 2026-01-22 00:17:57.615041875 +0000 UTC m=+0.085641533 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:17:57 compute-1 nova_compute[182713]: 2026-01-22 00:17:57.655 182717 INFO nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Creating config drive at /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.config
Jan 22 00:17:57 compute-1 nova_compute[182713]: 2026-01-22 00:17:57.661 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjmu9b6u9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:57 compute-1 nova_compute[182713]: 2026-01-22 00:17:57.790 182717 DEBUG oslo_concurrency.processutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjmu9b6u9" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:57 compute-1 kernel: tapb3245964-fe: entered promiscuous mode
Jan 22 00:17:57 compute-1 NetworkManager[54952]: <info>  [1769041077.8741] manager: (tapb3245964-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/250)
Jan 22 00:17:57 compute-1 ovn_controller[94841]: 2026-01-22T00:17:57Z|00549|binding|INFO|Claiming lport b3245964-feca-4b8b-b219-3a8b97cebae7 for this chassis.
Jan 22 00:17:57 compute-1 ovn_controller[94841]: 2026-01-22T00:17:57Z|00550|binding|INFO|b3245964-feca-4b8b-b219-3a8b97cebae7: Claiming fa:16:3e:2c:49:00 10.100.0.11
Jan 22 00:17:57 compute-1 nova_compute[182713]: 2026-01-22 00:17:57.876 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:57.890 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:49:00 10.100.0.11'], port_security=['fa:16:3e:2c:49:00 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a4c4578-e100-40e8-b037-1c4af043c44d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '2', 'neutron:security_group_ids': '729d6e1e-1a32-4037-a612-7401da3be40f cd748916-10a2-4939-a18d-bdf4af30e611', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=06481ad4-d1ea-4682-b4d3-a69e150c2ff0, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=b3245964-feca-4b8b-b219-3a8b97cebae7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:17:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:57.892 104184 INFO neutron.agent.ovn.metadata.agent [-] Port b3245964-feca-4b8b-b219-3a8b97cebae7 in datapath 3a4c4578-e100-40e8-b037-1c4af043c44d bound to our chassis
Jan 22 00:17:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:57.895 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3a4c4578-e100-40e8-b037-1c4af043c44d
Jan 22 00:17:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:57.910 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[cc04a01b-da42-483e-a6f6-f8c9d081d481]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:57.911 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3a4c4578-e1 in ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:17:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:57.914 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3a4c4578-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:17:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:57.914 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c1327e78-ce91-48c6-b415-0e525ff016dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:57.915 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[39578e20-6bad-45b9-8ec6-4baca5270de5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-1 systemd-udevd[232667]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:17:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:57.930 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[017b1f2d-b709-4a93-9267-3c886e2f0ef4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-1 NetworkManager[54952]: <info>  [1769041077.9447] device (tapb3245964-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:17:57 compute-1 NetworkManager[54952]: <info>  [1769041077.9466] device (tapb3245964-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:17:57 compute-1 systemd-machined[153970]: New machine qemu-62-instance-0000008a.
Jan 22 00:17:57 compute-1 nova_compute[182713]: 2026-01-22 00:17:57.949 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:57 compute-1 ovn_controller[94841]: 2026-01-22T00:17:57Z|00551|binding|INFO|Setting lport b3245964-feca-4b8b-b219-3a8b97cebae7 ovn-installed in OVS
Jan 22 00:17:57 compute-1 ovn_controller[94841]: 2026-01-22T00:17:57Z|00552|binding|INFO|Setting lport b3245964-feca-4b8b-b219-3a8b97cebae7 up in Southbound
Jan 22 00:17:57 compute-1 nova_compute[182713]: 2026-01-22 00:17:57.955 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:57.961 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c24082a4-e35c-447e-93f9-2a0cb8f9fa76]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-1 systemd[1]: Started Virtual Machine qemu-62-instance-0000008a.
Jan 22 00:17:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:57.997 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[39967f52-1ae7-4431-af2f-9e20d1a9e402]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:58 compute-1 systemd-udevd[232671]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.003 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c415b2d4-e64b-4297-a92b-68d03dac2c87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:58 compute-1 NetworkManager[54952]: <info>  [1769041078.0051] manager: (tap3a4c4578-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/251)
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.038 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[cbee79d7-9725-4af0-b7f9-810b2cc50770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.041 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[2e687272-a357-4d17-afbb-8780ed04cac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:58 compute-1 NetworkManager[54952]: <info>  [1769041078.0676] device (tap3a4c4578-e0): carrier: link connected
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.072 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[70a1188d-6208-4a5d-944b-3b254237696f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.090 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c0e361-7c50-446f-93bc-cdaa972ee5a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a4c4578-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:cc:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565072, 'reachable_time': 44261, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232699, 'error': None, 'target': 'ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.108 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0ab6c9-7d63-40d7-89b1-c99e078e31d2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:cc83'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565072, 'tstamp': 565072}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232700, 'error': None, 'target': 'ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.127 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ae294c2c-de9f-470f-97c5-647a53bc06be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a4c4578-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:cc:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565072, 'reachable_time': 44261, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232701, 'error': None, 'target': 'ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.163 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4120ae7b-8234-4a5c-807d-d6f6edd88476]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.227 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c05077b4-bf47-46f2-8432-334a50e6d3e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.229 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a4c4578-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.229 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.230 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a4c4578-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:58 compute-1 NetworkManager[54952]: <info>  [1769041078.2332] manager: (tap3a4c4578-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Jan 22 00:17:58 compute-1 kernel: tap3a4c4578-e0: entered promiscuous mode
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.237 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3a4c4578-e0, col_values=(('external_ids', {'iface-id': 'ecd06410-36ac-42c3-b9e8-b57793dc7305'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:58 compute-1 ovn_controller[94841]: 2026-01-22T00:17:58Z|00553|binding|INFO|Releasing lport ecd06410-36ac-42c3-b9e8-b57793dc7305 from this chassis (sb_readonly=0)
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.232 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.261 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.262 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a4c4578-e100-40e8-b037-1c4af043c44d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a4c4578-e100-40e8-b037-1c4af043c44d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.263 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b050c3e5-659d-40f9-820f-31919308b026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.264 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-3a4c4578-e100-40e8-b037-1c4af043c44d
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/3a4c4578-e100-40e8-b037-1c4af043c44d.pid.haproxy
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 3a4c4578-e100-40e8-b037-1c4af043c44d
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:17:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:17:58.265 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d', 'env', 'PROCESS_TAG=haproxy-3a4c4578-e100-40e8-b037-1c4af043c44d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3a4c4578-e100-40e8-b037-1c4af043c44d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.348 182717 DEBUG nova.network.neutron [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Updated VIF entry in instance network info cache for port b3245964-feca-4b8b-b219-3a8b97cebae7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.349 182717 DEBUG nova.network.neutron [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Updating instance_info_cache with network_info: [{"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.375 182717 DEBUG oslo_concurrency.lockutils [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.644 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041078.6380868, 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.646 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] VM Started (Lifecycle Event)
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.648 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.695 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.701 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041078.638346, 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.702 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] VM Paused (Lifecycle Event)
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.723 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.728 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:17:58 compute-1 nova_compute[182713]: 2026-01-22 00:17:58.752 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:17:58 compute-1 podman[232740]: 2026-01-22 00:17:58.756608302 +0000 UTC m=+0.074830762 container create 8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:17:58 compute-1 systemd[1]: Started libpod-conmon-8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161.scope.
Jan 22 00:17:58 compute-1 podman[232740]: 2026-01-22 00:17:58.714528848 +0000 UTC m=+0.032751328 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:17:58 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:17:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0fbc13cae5849fcd16f421e1b0771b55b510122306df92bf1f8ecfb7eb2d0c4e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:17:58 compute-1 podman[232740]: 2026-01-22 00:17:58.852130538 +0000 UTC m=+0.170353038 container init 8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 00:17:58 compute-1 podman[232740]: 2026-01-22 00:17:58.8587519 +0000 UTC m=+0.176974360 container start 8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 00:17:58 compute-1 neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d[232756]: [NOTICE]   (232760) : New worker (232762) forked
Jan 22 00:17:58 compute-1 neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d[232756]: [NOTICE]   (232760) : Loading success.
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.221 182717 DEBUG nova.compute.manager [req-64e38ee6-90a2-4510-bae1-d618b466d682 req-00f3bdce-6faf-44f4-aa3e-248fded2e004 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Received event network-vif-plugged-b3245964-feca-4b8b-b219-3a8b97cebae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.222 182717 DEBUG oslo_concurrency.lockutils [req-64e38ee6-90a2-4510-bae1-d618b466d682 req-00f3bdce-6faf-44f4-aa3e-248fded2e004 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.222 182717 DEBUG oslo_concurrency.lockutils [req-64e38ee6-90a2-4510-bae1-d618b466d682 req-00f3bdce-6faf-44f4-aa3e-248fded2e004 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.223 182717 DEBUG oslo_concurrency.lockutils [req-64e38ee6-90a2-4510-bae1-d618b466d682 req-00f3bdce-6faf-44f4-aa3e-248fded2e004 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.223 182717 DEBUG nova.compute.manager [req-64e38ee6-90a2-4510-bae1-d618b466d682 req-00f3bdce-6faf-44f4-aa3e-248fded2e004 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Processing event network-vif-plugged-b3245964-feca-4b8b-b219-3a8b97cebae7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.223 182717 DEBUG nova.compute.manager [req-64e38ee6-90a2-4510-bae1-d618b466d682 req-00f3bdce-6faf-44f4-aa3e-248fded2e004 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Received event network-vif-plugged-b3245964-feca-4b8b-b219-3a8b97cebae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.224 182717 DEBUG oslo_concurrency.lockutils [req-64e38ee6-90a2-4510-bae1-d618b466d682 req-00f3bdce-6faf-44f4-aa3e-248fded2e004 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.224 182717 DEBUG oslo_concurrency.lockutils [req-64e38ee6-90a2-4510-bae1-d618b466d682 req-00f3bdce-6faf-44f4-aa3e-248fded2e004 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.224 182717 DEBUG oslo_concurrency.lockutils [req-64e38ee6-90a2-4510-bae1-d618b466d682 req-00f3bdce-6faf-44f4-aa3e-248fded2e004 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.225 182717 DEBUG nova.compute.manager [req-64e38ee6-90a2-4510-bae1-d618b466d682 req-00f3bdce-6faf-44f4-aa3e-248fded2e004 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] No waiting events found dispatching network-vif-plugged-b3245964-feca-4b8b-b219-3a8b97cebae7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.225 182717 WARNING nova.compute.manager [req-64e38ee6-90a2-4510-bae1-d618b466d682 req-00f3bdce-6faf-44f4-aa3e-248fded2e004 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Received unexpected event network-vif-plugged-b3245964-feca-4b8b-b219-3a8b97cebae7 for instance with vm_state building and task_state spawning.
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.226 182717 DEBUG nova.compute.manager [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.231 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041079.2304335, 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.231 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] VM Resumed (Lifecycle Event)
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.233 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.237 182717 INFO nova.virt.libvirt.driver [-] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Instance spawned successfully.
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.237 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.257 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.266 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.270 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.271 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.272 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.272 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.273 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.274 182717 DEBUG nova.virt.libvirt.driver [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.306 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.366 182717 INFO nova.compute.manager [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Took 10.02 seconds to spawn the instance on the hypervisor.
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.367 182717 DEBUG nova.compute.manager [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.467 182717 INFO nova.compute.manager [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Took 10.85 seconds to build instance.
Jan 22 00:17:59 compute-1 nova_compute[182713]: 2026-01-22 00:17:59.495 182717 DEBUG oslo_concurrency.lockutils [None req-fdff1040-a7eb-4894-8594-187b51ec2b0e a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:00 compute-1 podman[232771]: 2026-01-22 00:18:00.599114071 +0000 UTC m=+0.083371624 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 00:18:01 compute-1 nova_compute[182713]: 2026-01-22 00:18:01.365 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:18:03.025 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:18:03.026 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:18:03.027 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:03 compute-1 nova_compute[182713]: 2026-01-22 00:18:03.649 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:04 compute-1 NetworkManager[54952]: <info>  [1769041084.5215] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Jan 22 00:18:04 compute-1 NetworkManager[54952]: <info>  [1769041084.5235] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Jan 22 00:18:04 compute-1 nova_compute[182713]: 2026-01-22 00:18:04.520 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:04 compute-1 nova_compute[182713]: 2026-01-22 00:18:04.617 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:04 compute-1 ovn_controller[94841]: 2026-01-22T00:18:04Z|00554|binding|INFO|Releasing lport ecd06410-36ac-42c3-b9e8-b57793dc7305 from this chassis (sb_readonly=0)
Jan 22 00:18:04 compute-1 ovn_controller[94841]: 2026-01-22T00:18:04Z|00555|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:18:04 compute-1 nova_compute[182713]: 2026-01-22 00:18:04.639 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:06 compute-1 nova_compute[182713]: 2026-01-22 00:18:06.367 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:07 compute-1 nova_compute[182713]: 2026-01-22 00:18:07.220 182717 DEBUG nova.compute.manager [req-cd74e8b4-fce5-4300-91f9-0c4f0aa59b20 req-65e2f708-ba9b-4d24-a89f-ff59903abfeb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Received event network-changed-b3245964-feca-4b8b-b219-3a8b97cebae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:18:07 compute-1 nova_compute[182713]: 2026-01-22 00:18:07.222 182717 DEBUG nova.compute.manager [req-cd74e8b4-fce5-4300-91f9-0c4f0aa59b20 req-65e2f708-ba9b-4d24-a89f-ff59903abfeb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Refreshing instance network info cache due to event network-changed-b3245964-feca-4b8b-b219-3a8b97cebae7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:18:07 compute-1 nova_compute[182713]: 2026-01-22 00:18:07.222 182717 DEBUG oslo_concurrency.lockutils [req-cd74e8b4-fce5-4300-91f9-0c4f0aa59b20 req-65e2f708-ba9b-4d24-a89f-ff59903abfeb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:18:07 compute-1 nova_compute[182713]: 2026-01-22 00:18:07.222 182717 DEBUG oslo_concurrency.lockutils [req-cd74e8b4-fce5-4300-91f9-0c4f0aa59b20 req-65e2f708-ba9b-4d24-a89f-ff59903abfeb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:18:07 compute-1 nova_compute[182713]: 2026-01-22 00:18:07.223 182717 DEBUG nova.network.neutron [req-cd74e8b4-fce5-4300-91f9-0c4f0aa59b20 req-65e2f708-ba9b-4d24-a89f-ff59903abfeb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Refreshing network info cache for port b3245964-feca-4b8b-b219-3a8b97cebae7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:18:07 compute-1 ovn_controller[94841]: 2026-01-22T00:18:07Z|00556|binding|INFO|Releasing lport ecd06410-36ac-42c3-b9e8-b57793dc7305 from this chassis (sb_readonly=0)
Jan 22 00:18:07 compute-1 ovn_controller[94841]: 2026-01-22T00:18:07Z|00557|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:18:07 compute-1 nova_compute[182713]: 2026-01-22 00:18:07.491 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:18:07.956 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:18:07 compute-1 nova_compute[182713]: 2026-01-22 00:18:07.958 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:18:07.958 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:18:08 compute-1 nova_compute[182713]: 2026-01-22 00:18:08.651 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:08 compute-1 nova_compute[182713]: 2026-01-22 00:18:08.876 182717 DEBUG nova.network.neutron [req-cd74e8b4-fce5-4300-91f9-0c4f0aa59b20 req-65e2f708-ba9b-4d24-a89f-ff59903abfeb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Updated VIF entry in instance network info cache for port b3245964-feca-4b8b-b219-3a8b97cebae7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:18:08 compute-1 nova_compute[182713]: 2026-01-22 00:18:08.876 182717 DEBUG nova.network.neutron [req-cd74e8b4-fce5-4300-91f9-0c4f0aa59b20 req-65e2f708-ba9b-4d24-a89f-ff59903abfeb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Updating instance_info_cache with network_info: [{"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:18:09 compute-1 nova_compute[182713]: 2026-01-22 00:18:09.265 182717 DEBUG oslo_concurrency.lockutils [req-cd74e8b4-fce5-4300-91f9-0c4f0aa59b20 req-65e2f708-ba9b-4d24-a89f-ff59903abfeb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:18:11 compute-1 nova_compute[182713]: 2026-01-22 00:18:11.369 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:11 compute-1 nova_compute[182713]: 2026-01-22 00:18:11.472 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:11 compute-1 podman[232812]: 2026-01-22 00:18:11.598096385 +0000 UTC m=+0.071505419 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:18:11 compute-1 podman[232811]: 2026-01-22 00:18:11.619888845 +0000 UTC m=+0.107318280 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:18:11 compute-1 ovn_controller[94841]: 2026-01-22T00:18:11Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2c:49:00 10.100.0.11
Jan 22 00:18:11 compute-1 ovn_controller[94841]: 2026-01-22T00:18:11Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2c:49:00 10.100.0.11
Jan 22 00:18:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:18:12.962 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:13 compute-1 nova_compute[182713]: 2026-01-22 00:18:13.653 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:16 compute-1 nova_compute[182713]: 2026-01-22 00:18:16.392 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:17 compute-1 nova_compute[182713]: 2026-01-22 00:18:17.458 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:17 compute-1 podman[232863]: 2026-01-22 00:18:17.555749995 +0000 UTC m=+0.051119432 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:18:17 compute-1 podman[232862]: 2026-01-22 00:18:17.555954311 +0000 UTC m=+0.053065402 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 00:18:18 compute-1 nova_compute[182713]: 2026-01-22 00:18:18.697 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:21 compute-1 nova_compute[182713]: 2026-01-22 00:18:21.394 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.886 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'name': 'tempest-₡-12187943', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000088', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3e408650207b498c8d115fd0c4f776dc', 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'hostId': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.891 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000008a', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '02bcfc5f1f1044a3856e73a5938ff011', 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'hostId': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.914 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.916 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.934 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.935 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3f69da4-54ef-43ac-b902-a8c01203ec70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-vda', 'timestamp': '2026-01-22T00:18:22.892519', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dcd42310-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.599918926, 'message_signature': '37688bfd48b7de1c433642b00d6dacda995eaf1fd4d2ad8ea0f4459fac7fcae6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 
'7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-sda', 'timestamp': '2026-01-22T00:18:22.892519', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dcd4464c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.599918926, 'message_signature': '6a6e6b92d879eebdcf63b0e20b83c1d9c17855f4ba2b71405470ccc9c9cfdd80'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-vda', 'timestamp': '2026-01-22T00:18:22.892519', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dcd70f8a-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.624466181, 'message_signature': '72ac2c932980d92478f6661c3156723f5f3ae211e72631bd8a79ddfd0201911a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-sda', 'timestamp': '2026-01-22T00:18:22.892519', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dcd72600-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.624466181, 'message_signature': '2bd7e9f27c69627a552957b6de57618aac43f245a890a78bd92c48c5c9b75898'}]}, 'timestamp': '2026-01-22 00:18:22.936199', '_unique_id': '44a56f20d95541cfb58a8ff0b367e033'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.939 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.946 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c / tapdfd91ce9-b3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.947 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.950 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a / tapb3245964-fe inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.951 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/network.outgoing.bytes volume: 1666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28806611-838b-452e-bc21-ac1a918a333b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 'instance-00000088-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-tapdfd91ce9-b3', 'timestamp': '2026-01-22T00:18:22.942305', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'tapdfd91ce9-b3', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:78:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdfd91ce9-b3'}, 'message_id': 'dcd8f192-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.649580093, 'message_signature': 'ed1e4de33634da44530ca06b792fa4d3a2edda193b05275ac485c9eab09f69c6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1666, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 
'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-0000008a-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-tapb3245964-fe', 'timestamp': '2026-01-22T00:18:22.942305', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'tapb3245964-fe', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2c:49:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3245964-fe'}, 'message_id': 'dcd99610-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.655203246, 'message_signature': 'e42bbf2dbff48b5cf9489bb09ed03080275aa5dedd13eacf94af57662602ed5d'}]}, 'timestamp': '2026-01-22 00:18:22.952253', '_unique_id': '88b9f826d6a5414ca977c9b0cff62d03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.954 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.956 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.994 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.write.requests volume: 304 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:22.995 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.019 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.020 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acd1c08e-9539-4385-ae85-e65b262698ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 304, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-vda', 'timestamp': '2026-01-22T00:18:22.957149', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dce03ca4-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': 'c88b0ca265aae5e7321111476c9ebf3d6e423ef6056531c79234da82024c06f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 
'7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-sda', 'timestamp': '2026-01-22T00:18:22.957149', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dce047f8-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': 'afa90b706fb14ac6e0fe6e0080ccf1e641791bf2ee6c63d54b033823927232c0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-vda', 'timestamp': '2026-01-22T00:18:22.957149', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dce40546-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': '9ccdd133d09a5fe744f1bc935a8109f79e98b8a6c599d268546ba381b593691e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-sda', 'timestamp': '2026-01-22T00:18:22.957149', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dce41360-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': '1cdc811015a6466204a3e789d0db92e059e71368875b8de1a2da03059eed680e'}]}, 'timestamp': '2026-01-22 00:18:23.020678', '_unique_id': 'e41e84b417b24f07a96337f7e0319255'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.021 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.022 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.023 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.023 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.023 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.023 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d8dd6c1-936f-4910-b9af-69d55ec9248b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-vda', 'timestamp': '2026-01-22T00:18:23.023071', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dce47c56-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.599918926, 'message_signature': '0079529c36a9f924f873ed6079343a841a149fe0f3ea867c59fc0bfa96a3ae03'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 
'7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-sda', 'timestamp': '2026-01-22T00:18:23.023071', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dce48494-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.599918926, 'message_signature': '093a48780f329990258ed31b89c29de88b9add7608a4e94109ff00f8414d11a4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-vda', 'timestamp': '2026-01-22T00:18:23.023071', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dce48cc8-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.624466181, 'message_signature': '092595d6a69e1567416591d03c656f18386a5733183c05c87ec2d3584e4767ed'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-sda', 'timestamp': '2026-01-22T00:18:23.023071', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dce496fa-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.624466181, 'message_signature': '8119505c167a81bfe7168135ede5217096c873efada02e6fba7fcb1623fde238'}]}, 'timestamp': '2026-01-22 00:18:23.024007', '_unique_id': 'd8f5dc9b349949e28e0c55c2359da358'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.024 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.025 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.038 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/cpu volume: 10870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.052 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/cpu volume: 10870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca13e074-c0dc-4d2c-81dd-d8d7c3d1ef38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10870000000, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'timestamp': '2026-01-22T00:18:23.025818', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'dce6e77a-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.745863171, 'message_signature': 'b7e176942492d638a59ae9269e19b6e24bb9576d42ce4c0efe875d100fd9dfce'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10870000000, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'timestamp': 
'2026-01-22T00:18:23.025818', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'dce90f78-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.759975915, 'message_signature': '5fb388025d1c8acc1cf842fbe6465e2ef5c09ecca7daef0336013efbdd8f329c'}]}, 'timestamp': '2026-01-22 00:18:23.053376', '_unique_id': '562ee7f38a5f4668acd8e686c28522ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.054 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.055 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.055 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.055 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ffabc20f-bef7-4b7b-8e62-75bdfc2aee9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 'instance-00000088-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-tapdfd91ce9-b3', 'timestamp': '2026-01-22T00:18:23.055143', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'tapdfd91ce9-b3', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:78:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdfd91ce9-b3'}, 'message_id': 'dce9625c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.649580093, 'message_signature': '296c3129eb6edd37a86216258ed2b8f6ff269aaea79045fcfddb5b8090d42bac'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': 
'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-0000008a-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-tapb3245964-fe', 'timestamp': '2026-01-22T00:18:23.055143', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'tapb3245964-fe', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2c:49:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3245964-fe'}, 'message_id': 'dce96dec-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.655203246, 'message_signature': '3af58a4ac16983e0f9e89354b656aeaa8897dee5b21e1c0035bbc7421e451ffe'}]}, 'timestamp': '2026-01-22 00:18:23.055754', '_unique_id': '5ffb71cb70ed45f8a3aa935f2b7beb71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.056 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.057 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.057 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7353de19-082f-43ec-a17a-8216c9466ff8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 'instance-00000088-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-tapdfd91ce9-b3', 'timestamp': '2026-01-22T00:18:23.057046', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'tapdfd91ce9-b3', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:78:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdfd91ce9-b3'}, 'message_id': 'dce9aaa0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.649580093, 'message_signature': 'd82a500b12995eb705dff64783052007773378afd5211819bc135d6e21ac44dc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-0000008a-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-tapb3245964-fe', 'timestamp': '2026-01-22T00:18:23.057046', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'tapb3245964-fe', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2c:49:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3245964-fe'}, 'message_id': 'dce9b48c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.655203246, 'message_signature': '9b83b760f5d27cc5f3835eda5ac0896f4750480ace87fd31105a78541d3f5e29'}]}, 'timestamp': '2026-01-22 00:18:23.057552', '_unique_id': '7bd46e9488ca4d17a56d1bc6098302e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.read.latency volume: 215410502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.058 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.read.latency volume: 25729122 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.059 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.read.latency volume: 187801048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.059 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.read.latency volume: 24179389 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '039c9ae2-9221-4620-a9c0-1b1908d72292', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 215410502, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-vda', 'timestamp': '2026-01-22T00:18:23.058718', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dce9ebdc-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': 'e0a3bb13988b5fce0c7d75408ed2b3595d08a9ac52dbd8a5576325e132721a73'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25729122, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 
'7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-sda', 'timestamp': '2026-01-22T00:18:23.058718', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dce9f53c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': '5ca65ce8684da01f1d48b05de46a392d20cb253e4c75b5799df06369fddf889d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187801048, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-vda', 'timestamp': '2026-01-22T00:18:23.058718', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dce9fd52-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': '7c36b404af3c44e630f14893c2f2f7c6688be25cc3717a03ea4d17c9f476f416'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24179389, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-sda', 'timestamp': '2026-01-22T00:18:23.058718', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dcea05ea-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': '88d810639c736aa7b73079608f3126a80f4c2ddd3b6f0a15905efb2dd9f9c963'}]}, 'timestamp': '2026-01-22 00:18:23.059624', '_unique_id': '9b735d692d1049eaa8b988a450a689cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.060 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.write.bytes volume: 72888320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.061 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.061 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.write.bytes volume: 72937472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.061 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d02f47c-e32e-4f50-8a00-a9d19e822c1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72888320, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-vda', 'timestamp': '2026-01-22T00:18:23.060881', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dcea4096-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': '85898a4a511cd177228fdd602c26c9f99a36567efdbc235ccc8a47b8a61dff83'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 
'7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-sda', 'timestamp': '2026-01-22T00:18:23.060881', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dcea48a2-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': '817d70f049deac152c862f72dae5a7aa961364c6caad5332640981068a4275a3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72937472, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-vda', 'timestamp': '2026-01-22T00:18:23.060881', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dcea5112-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': 'e04f8a98b4a9f469b8545f488b95d42934f98aec4d78bfb114f138357df547e9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-sda', 'timestamp': '2026-01-22T00:18:23.060881', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dcea5982-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': 'ee1c1eea259d89c1443517ae512cd8994470a43f042391c77125af744cf62dd1'}]}, 'timestamp': '2026-01-22 00:18:23.061747', '_unique_id': 'e23b34bfb2aa461099ac97e00a6f8ab1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.062 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/network.outgoing.packets volume: 17 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aff9e01d-caf5-46d1-8c2e-6bdfcb1afc72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 'instance-00000088-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-tapdfd91ce9-b3', 'timestamp': '2026-01-22T00:18:23.063005', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'tapdfd91ce9-b3', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:78:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdfd91ce9-b3'}, 'message_id': 'dcea935c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.649580093, 'message_signature': '470524db036f246bb71c5c4c3d104f78b1b11936348bbf432b27afedd071b84c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 17, 'user_id': 
'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-0000008a-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-tapb3245964-fe', 'timestamp': '2026-01-22T00:18:23.063005', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'tapb3245964-fe', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2c:49:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3245964-fe'}, 'message_id': 'dcea9bcc-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.655203246, 'message_signature': '58013f59df6407b36a308350ec3d881fa0ffe27dfcc87d1021413265eda101ae'}]}, 'timestamp': '2026-01-22 00:18:23.063444', '_unique_id': 'b8786627d048435e9a8e8b9478568d02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.063 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.064 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.064 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.064 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/network.incoming.bytes volume: 1764 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '014a4bd7-42a6-404c-9859-d0b2fe4dbb46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 'instance-00000088-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-tapdfd91ce9-b3', 'timestamp': '2026-01-22T00:18:23.064580', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'tapdfd91ce9-b3', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:78:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdfd91ce9-b3'}, 'message_id': 'dcead0ec-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.649580093, 'message_signature': 'b128ed35826d81c21286e01b43aceb63ff9dca8a70a9ab3025b7a03068b9fb44'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1764, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-0000008a-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-tapb3245964-fe', 'timestamp': '2026-01-22T00:18:23.064580', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'tapb3245964-fe', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2c:49:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3245964-fe'}, 'message_id': 'dceada10-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.655203246, 'message_signature': '36fe7602e88e0f5e60d6575d59750318f9d80fdd4691cbd24742b5f2f58c1e78'}]}, 'timestamp': '2026-01-22 00:18:23.065041', '_unique_id': '5c2630241ae84d50a304eb2ec45a5318'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.065 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.066 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.066 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0147863c-d417-412c-87f4-d8144880a713', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 'instance-00000088-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-tapdfd91ce9-b3', 'timestamp': '2026-01-22T00:18:23.066140', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'tapdfd91ce9-b3', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:78:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdfd91ce9-b3'}, 'message_id': 'dceb0ec2-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.649580093, 'message_signature': 'af060454c593b7a1b915582a2a0c00731aceadb3ffbf2bb8de39b610bb18aaad'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-0000008a-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-tapb3245964-fe', 'timestamp': '2026-01-22T00:18:23.066140', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'tapb3245964-fe', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2c:49:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3245964-fe'}, 'message_id': 'dceb19d0-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.655203246, 'message_signature': '2dd0ce991ef1ef66eb7ba09ea3469d7bdf0ced0fe85df215c4f1ae567825bdf2'}]}, 'timestamp': '2026-01-22 00:18:23.066712', '_unique_id': '3bc8aa0261d74661b5f82131a9f56fbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.067 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.068 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.068 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-₡-12187943>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-12187943>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969>]
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.068 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.068 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/memory.usage volume: 42.77734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.068 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/memory.usage volume: 42.78515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db25041b-16be-4655-bb63-28554b66021b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.77734375, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'timestamp': '2026-01-22T00:18:23.068409', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'dceb666a-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.745863171, 'message_signature': 'f64b15bf436eef3d40e630a76af055e6969c0a3072a55f23a64d1d54654ce1b0'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.78515625, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'timestamp': 
'2026-01-22T00:18:23.068409', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'dceb6ec6-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.759975915, 'message_signature': '33b9bf3d89c572a3e6bb82231fcaae182811eff94c3f99910952190b23ddbebd'}]}, 'timestamp': '2026-01-22 00:18:23.068838', '_unique_id': '2c2950cdd2554607bddde000a7dad3c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.069 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.read.requests volume: 1103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.070 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.070 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.read.requests volume: 1109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.070 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c373e15-f28c-45f6-90c0-a186b7cfd3ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1103, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-vda', 'timestamp': '2026-01-22T00:18:23.069939', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dceba1fc-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': '745e41abef1d3a34897099013fb12a4ed7aebf127c9326dbf431e373eed6805b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 
'7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-sda', 'timestamp': '2026-01-22T00:18:23.069939', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dceba990-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': '3b555c5e175c041aa063927a0b1ed39942128ec777bf9d8b204846c2b1777b9c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1109, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-vda', 'timestamp': '2026-01-22T00:18:23.069939', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dcebb12e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': 'b8e37674d1a1c0510c8f4216c07d5166b04f3f83219199587bbd9c1f0f921003'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-sda', 'timestamp': '2026-01-22T00:18:23.069939', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dcebb9e4-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': 'a8508a89ed04c02210747226dd00a817794df84982a6ad3b39aa2e3ca601cdcb'}]}, 'timestamp': '2026-01-22 00:18:23.070793', '_unique_id': 'b90c5de788be4729880a1e259822cfcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.071 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c005f2e-652b-482d-a7cc-adecf926cfc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 'instance-00000088-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-tapdfd91ce9-b3', 'timestamp': '2026-01-22T00:18:23.072024', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'tapdfd91ce9-b3', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:78:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdfd91ce9-b3'}, 'message_id': 'dcebf404-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.649580093, 'message_signature': 'eb62b06a906c7f41c4f857cd89a1b069a3244dd4cebbef57d62b58ea6ad6a6ca'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 
'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-0000008a-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-tapb3245964-fe', 'timestamp': '2026-01-22T00:18:23.072024', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'tapb3245964-fe', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2c:49:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3245964-fe'}, 'message_id': 'dcebfc74-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.655203246, 'message_signature': 'e108ee4d8d564e4bf3420e7dbe72dadabb72c6f3d017978c34b2e8994fff9236'}]}, 'timestamp': '2026-01-22 00:18:23.072472', '_unique_id': 'c47f4eca8f8d4f258b88e1c1e30cd31e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.072 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.073 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.073 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.073 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff0da15c-5679-4763-ae12-f1a8a98a028d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 'instance-00000088-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-tapdfd91ce9-b3', 'timestamp': '2026-01-22T00:18:23.073542', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'tapdfd91ce9-b3', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:78:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdfd91ce9-b3'}, 'message_id': 'dcec2ee2-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.649580093, 'message_signature': '816d37ff0b6050b9df369891d6b63de85f73b44576da3272d38fe74ade70e990'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 
'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-0000008a-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-tapb3245964-fe', 'timestamp': '2026-01-22T00:18:23.073542', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'tapb3245964-fe', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2c:49:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3245964-fe'}, 'message_id': 'dcec39dc-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.655203246, 'message_signature': 'd3f741ba85b6b50a03cfc2f8330c531938d265ad8959721fc26b3c965807a3bb'}]}, 'timestamp': '2026-01-22 00:18:23.074076', '_unique_id': '09dc1e28186d46f1be7686c41143d43f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.074 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.075 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.075 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-₡-12187943>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-12187943>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969>]
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.075 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.076 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-₡-12187943>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-12187943>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969>]
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.076 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.read.bytes volume: 30124544 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.076 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.076 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.read.bytes volume: 30669312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.077 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15890ede-e223-40c7-8845-0e249ae1e89a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30124544, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-vda', 'timestamp': '2026-01-22T00:18:23.076311', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dcec9c60-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': 'ba709a9f42185bd7c01c516f505a8129af44c4f0902f626336608a87dc8cada9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 
'7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-sda', 'timestamp': '2026-01-22T00:18:23.076311', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dceca728-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': '67f64184e6409d6435398c03d41e6c48af84eb4c7411648b03057fecf663c555'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30669312, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-vda', 'timestamp': '2026-01-22T00:18:23.076311', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dcecb222-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': '082ea1752a4925031bd5be8bec053b14732ce4b389fe3a1f8add341a45cf1ba6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-sda', 'timestamp': '2026-01-22T00:18:23.076311', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dcecbc22-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': '0f8d2c45a511f2231210ec35189eb705df9e0eb9c47257d08fb8e4b0ffe7ae1a'}]}, 'timestamp': '2026-01-22 00:18:23.077407', '_unique_id': '003753ab0580460f86f31af5b53529ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.078 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.079 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-₡-12187943>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-12187943>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969>]
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.079 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.079 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2edacb1-b98d-4612-875b-0d7954ac065f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 'instance-00000088-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-tapdfd91ce9-b3', 'timestamp': '2026-01-22T00:18:23.079309', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'tapdfd91ce9-b3', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:78:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdfd91ce9-b3'}, 'message_id': 'dced10b4-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.649580093, 'message_signature': '9d9aa2afcb300336a49ca185106acaacc6b47646c929a7d67b821c0c76f05904'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-0000008a-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-tapb3245964-fe', 'timestamp': '2026-01-22T00:18:23.079309', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'tapb3245964-fe', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2c:49:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3245964-fe'}, 'message_id': 'dced1c44-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.655203246, 'message_signature': '99ddfc881fcb87fef3bfd6dac1d60ed4f931439887aa580905b99f2bd7fee205'}]}, 'timestamp': '2026-01-22 00:18:23.079904', '_unique_id': 'f891ff7ae3ce4630a731d03bbf1b2265'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.080 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.081 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.081 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.081 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05d5d509-3267-4d9d-804b-4790da02102e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 'instance-00000088-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-tapdfd91ce9-b3', 'timestamp': '2026-01-22T00:18:23.081396', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'tapdfd91ce9-b3', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:78:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdfd91ce9-b3'}, 'message_id': 'dced630c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.649580093, 'message_signature': 'dbb46d2c1b1f5afe9f17ec2491de7023d6a3a53b96c969a7fa5cfabed1e0cb34'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-0000008a-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-tapb3245964-fe', 'timestamp': '2026-01-22T00:18:23.081396', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'tapb3245964-fe', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2c:49:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3245964-fe'}, 'message_id': 'dced6b90-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.655203246, 'message_signature': 'd44896a15dd48ccc837ee13d6bdce952a9f0f83c778b27a3f17ef64edcfe1014'}]}, 'timestamp': '2026-01-22 00:18:23.081886', '_unique_id': '6ba8e73a68da4cfe898f94b3a9e3bb17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.082 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.083 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.write.latency volume: 4236391539 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.083 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.083 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.write.latency volume: 3581199607 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.083 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd12fece9-8d44-4ae4-9de3-2ab4cf15b688', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4236391539, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-vda', 'timestamp': '2026-01-22T00:18:23.083000', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dceda02e-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': 'c420c4775484d9e228916743f7d8586a4af27188b54b631d2e56dcb393b6cacc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 
'7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-sda', 'timestamp': '2026-01-22T00:18:23.083000', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dceda7fe-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.664449859, 'message_signature': '8428b4efa97d33c5415e66c773ac3ac1d01af4d50f171b4f6840505d0dc804a8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3581199607, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-vda', 'timestamp': '2026-01-22T00:18:23.083000', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dcedb08c-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': 'd2d3421ec452b9b8e01f02fd6b6fb52382e27818a4800289582dc4bd784cfca7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-sda', 'timestamp': '2026-01-22T00:18:23.083000', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dcedb924-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.702996454, 'message_signature': '63d734363023818e9659d2b54fe1e06a820cdc82dc8006f2ea4a8b7e34b696d2'}]}, 'timestamp': '2026-01-22 00:18:23.083875', '_unique_id': '6bb969cfadbb4d9c81bcd2ea8e6777f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.084 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.085 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.085 12 DEBUG ceilometer.compute.pollsters [-] 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.085 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.085 12 DEBUG ceilometer.compute.pollsters [-] 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '895d5f2e-e388-40d7-ba3a-f4e03eac5459', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-vda', 'timestamp': '2026-01-22T00:18:23.085167', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dcedf56a-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.599918926, 'message_signature': '73595cb0a520229e8f8b871ad3b7282cf584270f82ab8e03af854070a7292d74'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_name': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_name': None, 'resource_id': 
'7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-sda', 'timestamp': '2026-01-22T00:18:23.085167', 'resource_metadata': {'display_name': 'tempest-₡-12187943', 'name': 'instance-00000088', 'instance_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'instance_type': 'm1.nano', 'host': '0cd1e6d5e66b4ed719d9f99eabbdfee16bd844131bab31ffa299f8dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dcedfe84-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.599918926, 'message_signature': '45c5037559f907db9b477f2ab7c1404320dc741c8f4bbc298d2f5531b2c920d2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-vda', 'timestamp': '2026-01-22T00:18:23.085167', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dcee0636-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.624466181, 'message_signature': 'a5970c3638d8f8ef7b3bff8f803a9d0766ca55db672e3f8cc3c8a378fe0bc661'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-sda', 'timestamp': '2026-01-22T00:18:23.085167', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969', 'name': 'instance-0000008a', 'instance_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'instance_type': 'm1.nano', 'host': '8c8a6a83731757a8a011e74950ab39e6340e88c26d3ac47083c94e63', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dcee1112-f727-11f0-a0a4-fa163e934844', 'monotonic_time': 5675.624466181, 'message_signature': '7554561559bae773b5ff75c4c321a6388653439861798bb6a320f862241adc36'}]}, 'timestamp': '2026-01-22 00:18:23.086103', '_unique_id': '5ee6af1198974d80bb7857c0b0154de5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:18:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:18:23.086 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:18:23 compute-1 nova_compute[182713]: 2026-01-22 00:18:23.700 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:26 compute-1 nova_compute[182713]: 2026-01-22 00:18:26.397 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:28 compute-1 podman[232901]: 2026-01-22 00:18:28.577743078 +0000 UTC m=+0.070863930 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:18:28 compute-1 nova_compute[182713]: 2026-01-22 00:18:28.704 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:31 compute-1 nova_compute[182713]: 2026-01-22 00:18:31.440 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:31 compute-1 podman[232921]: 2026-01-22 00:18:31.584591294 +0000 UTC m=+0.075659216 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container)
Jan 22 00:18:33 compute-1 nova_compute[182713]: 2026-01-22 00:18:33.732 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:34 compute-1 nova_compute[182713]: 2026-01-22 00:18:34.563 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:36 compute-1 nova_compute[182713]: 2026-01-22 00:18:36.442 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:38 compute-1 nova_compute[182713]: 2026-01-22 00:18:38.735 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:39 compute-1 ovn_controller[94841]: 2026-01-22T00:18:39Z|00558|binding|INFO|Releasing lport ecd06410-36ac-42c3-b9e8-b57793dc7305 from this chassis (sb_readonly=0)
Jan 22 00:18:39 compute-1 ovn_controller[94841]: 2026-01-22T00:18:39Z|00559|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:18:39 compute-1 nova_compute[182713]: 2026-01-22 00:18:39.939 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:41 compute-1 nova_compute[182713]: 2026-01-22 00:18:41.444 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:41 compute-1 nova_compute[182713]: 2026-01-22 00:18:41.735 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:42 compute-1 podman[232945]: 2026-01-22 00:18:42.588258723 +0000 UTC m=+0.070060284 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:18:42 compute-1 podman[232944]: 2026-01-22 00:18:42.627593882 +0000 UTC m=+0.105807193 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:18:43 compute-1 nova_compute[182713]: 2026-01-22 00:18:43.738 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:45 compute-1 nova_compute[182713]: 2026-01-22 00:18:45.869 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:45 compute-1 nova_compute[182713]: 2026-01-22 00:18:45.870 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:45 compute-1 nova_compute[182713]: 2026-01-22 00:18:45.870 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:18:46 compute-1 nova_compute[182713]: 2026-01-22 00:18:46.447 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:46 compute-1 nova_compute[182713]: 2026-01-22 00:18:46.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:47 compute-1 nova_compute[182713]: 2026-01-22 00:18:47.234 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:47 compute-1 nova_compute[182713]: 2026-01-22 00:18:47.283 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Triggering sync for uuid 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 00:18:47 compute-1 nova_compute[182713]: 2026-01-22 00:18:47.284 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Triggering sync for uuid 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 00:18:47 compute-1 nova_compute[182713]: 2026-01-22 00:18:47.284 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:47 compute-1 nova_compute[182713]: 2026-01-22 00:18:47.284 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:47 compute-1 nova_compute[182713]: 2026-01-22 00:18:47.285 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:47 compute-1 nova_compute[182713]: 2026-01-22 00:18:47.285 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:47 compute-1 nova_compute[182713]: 2026-01-22 00:18:47.319 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:47 compute-1 nova_compute[182713]: 2026-01-22 00:18:47.322 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:47 compute-1 nova_compute[182713]: 2026-01-22 00:18:47.637 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:48 compute-1 podman[232991]: 2026-01-22 00:18:48.548876523 +0000 UTC m=+0.041872798 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 00:18:48 compute-1 podman[232992]: 2026-01-22 00:18:48.608751743 +0000 UTC m=+0.083925530 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:18:48 compute-1 nova_compute[182713]: 2026-01-22 00:18:48.768 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:49 compute-1 nova_compute[182713]: 2026-01-22 00:18:49.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:49 compute-1 nova_compute[182713]: 2026-01-22 00:18:49.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:49 compute-1 nova_compute[182713]: 2026-01-22 00:18:49.889 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:49 compute-1 nova_compute[182713]: 2026-01-22 00:18:49.890 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:49 compute-1 nova_compute[182713]: 2026-01-22 00:18:49.890 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:49 compute-1 nova_compute[182713]: 2026-01-22 00:18:49.890 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:18:49 compute-1 nova_compute[182713]: 2026-01-22 00:18:49.984 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.046 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.047 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.107 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.113 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.175 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.176 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.268 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.483 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.484 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5374MB free_disk=73.201904296875GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.485 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.485 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.589 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.590 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.590 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.590 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.808 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.840 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.866 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:18:50 compute-1 nova_compute[182713]: 2026-01-22 00:18:50.866 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:51 compute-1 nova_compute[182713]: 2026-01-22 00:18:51.449 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:51 compute-1 nova_compute[182713]: 2026-01-22 00:18:51.862 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:52 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:18:52.547 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:18:52 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:18:52.547 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:18:52 compute-1 nova_compute[182713]: 2026-01-22 00:18:52.548 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:53 compute-1 nova_compute[182713]: 2026-01-22 00:18:53.770 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:53 compute-1 nova_compute[182713]: 2026-01-22 00:18:53.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:54 compute-1 nova_compute[182713]: 2026-01-22 00:18:54.039 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:54 compute-1 nova_compute[182713]: 2026-01-22 00:18:54.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:54 compute-1 nova_compute[182713]: 2026-01-22 00:18:54.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:18:54 compute-1 nova_compute[182713]: 2026-01-22 00:18:54.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:18:55 compute-1 nova_compute[182713]: 2026-01-22 00:18:55.078 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:18:55 compute-1 nova_compute[182713]: 2026-01-22 00:18:55.079 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:18:55 compute-1 nova_compute[182713]: 2026-01-22 00:18:55.079 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:18:55 compute-1 nova_compute[182713]: 2026-01-22 00:18:55.080 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:18:56 compute-1 nova_compute[182713]: 2026-01-22 00:18:56.452 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.540 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.540 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.556 182717 DEBUG nova.compute.manager [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.652 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.652 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.659 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.659 182717 INFO nova.compute.claims [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.671 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Updating instance_info_cache with network_info: [{"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.710 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.711 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.712 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.772 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:58 compute-1 nova_compute[182713]: 2026-01-22 00:18:58.904 182717 DEBUG nova.compute.provider_tree [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.167 182717 DEBUG nova.scheduler.client.report [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.190 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.191 182717 DEBUG nova.compute.manager [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.248 182717 DEBUG nova.compute.manager [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.249 182717 DEBUG nova.network.neutron [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.269 182717 INFO nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.288 182717 DEBUG nova.compute.manager [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.409 182717 DEBUG nova.compute.manager [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.411 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.412 182717 INFO nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Creating image(s)
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.413 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.413 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.414 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.429 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.473 182717 DEBUG nova.policy [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.500 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.501 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.501 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.516 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:59 compute-1 podman[233056]: 2026-01-22 00:18:59.583840092 +0000 UTC m=+0.077253895 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.587 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:59 compute-1 nova_compute[182713]: 2026-01-22 00:18:59.589 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.046 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk 1073741824" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.047 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.546s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.048 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.152 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.154 182717 DEBUG nova.virt.disk.api [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.154 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.207 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.208 182717 DEBUG nova.virt.disk.api [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.209 182717 DEBUG nova.objects.instance [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.247 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.247 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Ensure instance console log exists: /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.248 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.248 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.249 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:00 compute-1 nova_compute[182713]: 2026-01-22 00:19:00.289 182717 DEBUG nova.network.neutron [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Successfully created port: 89ad850c-a87f-489f-8c3e-51dfc078a374 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:19:01 compute-1 nova_compute[182713]: 2026-01-22 00:19:01.387 182717 DEBUG nova.network.neutron [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Successfully updated port: 89ad850c-a87f-489f-8c3e-51dfc078a374 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:19:01 compute-1 nova_compute[182713]: 2026-01-22 00:19:01.407 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:01 compute-1 nova_compute[182713]: 2026-01-22 00:19:01.408 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:01 compute-1 nova_compute[182713]: 2026-01-22 00:19:01.408 182717 DEBUG nova.network.neutron [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:19:01 compute-1 nova_compute[182713]: 2026-01-22 00:19:01.454 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:01 compute-1 nova_compute[182713]: 2026-01-22 00:19:01.487 182717 DEBUG nova.compute.manager [req-f858259d-e5c2-4e3f-ae95-659b51a888a2 req-7d48d416-f2f6-4407-a154-0b45b5f615a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:01 compute-1 nova_compute[182713]: 2026-01-22 00:19:01.488 182717 DEBUG nova.compute.manager [req-f858259d-e5c2-4e3f-ae95-659b51a888a2 req-7d48d416-f2f6-4407-a154-0b45b5f615a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing instance network info cache due to event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:19:01 compute-1 nova_compute[182713]: 2026-01-22 00:19:01.488 182717 DEBUG oslo_concurrency.lockutils [req-f858259d-e5c2-4e3f-ae95-659b51a888a2 req-7d48d416-f2f6-4407-a154-0b45b5f615a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:02 compute-1 nova_compute[182713]: 2026-01-22 00:19:02.090 182717 DEBUG nova.network.neutron [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:19:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:02.549 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:02 compute-1 podman[233089]: 2026-01-22 00:19:02.574391057 +0000 UTC m=+0.065625227 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 00:19:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:03.025 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:03.026 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:03.027 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.324 182717 DEBUG nova.network.neutron [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.557 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.558 182717 DEBUG nova.compute.manager [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance network_info: |[{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.558 182717 DEBUG oslo_concurrency.lockutils [req-f858259d-e5c2-4e3f-ae95-659b51a888a2 req-7d48d416-f2f6-4407-a154-0b45b5f615a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.559 182717 DEBUG nova.network.neutron [req-f858259d-e5c2-4e3f-ae95-659b51a888a2 req-7d48d416-f2f6-4407-a154-0b45b5f615a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.562 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Start _get_guest_xml network_info=[{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.567 182717 WARNING nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.572 182717 DEBUG nova.virt.libvirt.host [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.573 182717 DEBUG nova.virt.libvirt.host [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.576 182717 DEBUG nova.virt.libvirt.host [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.576 182717 DEBUG nova.virt.libvirt.host [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.578 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.578 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.579 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.579 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.579 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.579 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.580 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.580 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.580 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.581 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.581 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.581 182717 DEBUG nova.virt.hardware [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.586 182717 DEBUG nova.virt.libvirt.vif [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2079598704',display_name='tempest-TestNetworkAdvancedServerOps-server-2079598704',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2079598704',id=145,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIY3eqmdW0m2q20hwTxy7fCq5RPOCY+KqJLqriFpcPzIAlQnzQNfW6TIp9Y1voEv/PtpLpDAT0kqBnGToo/qNh+oTys/PEZ/7XtlTWunC6nPRFTGOxMn536DUj7Tail8LA==',key_name='tempest-TestNetworkAdvancedServerOps-1230336080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-zrj70p8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:59Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=d59e0943-5372-4680-af52-c9af874c8578,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.586 182717 DEBUG nova.network.os_vif_util [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.587 182717 DEBUG nova.network.os_vif_util [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.589 182717 DEBUG nova.objects.instance [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.757 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:19:03 compute-1 nova_compute[182713]:   <uuid>d59e0943-5372-4680-af52-c9af874c8578</uuid>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   <name>instance-00000091</name>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-2079598704</nova:name>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:19:03</nova:creationTime>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:19:03 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:19:03 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:19:03 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:19:03 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:19:03 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:19:03 compute-1 nova_compute[182713]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:19:03 compute-1 nova_compute[182713]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:19:03 compute-1 nova_compute[182713]:         <nova:port uuid="89ad850c-a87f-489f-8c3e-51dfc078a374">
Jan 22 00:19:03 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <system>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <entry name="serial">d59e0943-5372-4680-af52-c9af874c8578</entry>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <entry name="uuid">d59e0943-5372-4680-af52-c9af874c8578</entry>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     </system>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   <os>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   </os>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   <features>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   </features>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:9d:17:06"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <target dev="tap89ad850c-a8"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/console.log" append="off"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <video>
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     </video>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:19:03 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:19:03 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:19:03 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:19:03 compute-1 nova_compute[182713]: </domain>
Jan 22 00:19:03 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.759 182717 DEBUG nova.compute.manager [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Preparing to wait for external event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.760 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.760 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.761 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.762 182717 DEBUG nova.virt.libvirt.vif [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2079598704',display_name='tempest-TestNetworkAdvancedServerOps-server-2079598704',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2079598704',id=145,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIY3eqmdW0m2q20hwTxy7fCq5RPOCY+KqJLqriFpcPzIAlQnzQNfW6TIp9Y1voEv/PtpLpDAT0kqBnGToo/qNh+oTys/PEZ/7XtlTWunC6nPRFTGOxMn536DUj7Tail8LA==',key_name='tempest-TestNetworkAdvancedServerOps-1230336080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-zrj70p8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:59Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=d59e0943-5372-4680-af52-c9af874c8578,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.762 182717 DEBUG nova.network.os_vif_util [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.763 182717 DEBUG nova.network.os_vif_util [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.764 182717 DEBUG os_vif [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.765 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.766 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.767 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.774 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.775 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89ad850c-a8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.776 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap89ad850c-a8, col_values=(('external_ids', {'iface-id': '89ad850c-a87f-489f-8c3e-51dfc078a374', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9d:17:06', 'vm-uuid': 'd59e0943-5372-4680-af52-c9af874c8578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.813 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:03 compute-1 NetworkManager[54952]: <info>  [1769041143.8154] manager: (tap89ad850c-a8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.817 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.828 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:03 compute-1 nova_compute[182713]: 2026-01-22 00:19:03.830 182717 INFO os_vif [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8')
Jan 22 00:19:04 compute-1 nova_compute[182713]: 2026-01-22 00:19:04.928 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:19:04 compute-1 nova_compute[182713]: 2026-01-22 00:19:04.929 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:19:04 compute-1 nova_compute[182713]: 2026-01-22 00:19:04.929 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No VIF found with MAC fa:16:3e:9d:17:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:19:04 compute-1 nova_compute[182713]: 2026-01-22 00:19:04.930 182717 INFO nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Using config drive
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.129 182717 DEBUG nova.network.neutron [req-f858259d-e5c2-4e3f-ae95-659b51a888a2 req-7d48d416-f2f6-4407-a154-0b45b5f615a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updated VIF entry in instance network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.129 182717 DEBUG nova.network.neutron [req-f858259d-e5c2-4e3f-ae95-659b51a888a2 req-7d48d416-f2f6-4407-a154-0b45b5f615a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.149 182717 DEBUG oslo_concurrency.lockutils [req-f858259d-e5c2-4e3f-ae95-659b51a888a2 req-7d48d416-f2f6-4407-a154-0b45b5f615a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.308 182717 INFO nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Creating config drive at /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.314 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp581o6knb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.447 182717 DEBUG oslo_concurrency.processutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp581o6knb" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:05 compute-1 kernel: tap89ad850c-a8: entered promiscuous mode
Jan 22 00:19:05 compute-1 NetworkManager[54952]: <info>  [1769041145.5265] manager: (tap89ad850c-a8): new Tun device (/org/freedesktop/NetworkManager/Devices/256)
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.529 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:05 compute-1 ovn_controller[94841]: 2026-01-22T00:19:05Z|00560|binding|INFO|Claiming lport 89ad850c-a87f-489f-8c3e-51dfc078a374 for this chassis.
Jan 22 00:19:05 compute-1 ovn_controller[94841]: 2026-01-22T00:19:05Z|00561|binding|INFO|89ad850c-a87f-489f-8c3e-51dfc078a374: Claiming fa:16:3e:9d:17:06 10.100.0.5
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.539 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:17:06 10.100.0.5'], port_security=['fa:16:3e:9d:17:06 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd59e0943-5372-4680-af52-c9af874c8578', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1bc50146-1f14-43fa-a2db-2904419fa654', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f19555ab-2ed1-467b-9e13-e9518e9577aa, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=89ad850c-a87f-489f-8c3e-51dfc078a374) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.541 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 89ad850c-a87f-489f-8c3e-51dfc078a374 in datapath 7347045a-f38e-4f56-a03a-a68e0fbe1e8d bound to our chassis
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.547 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7347045a-f38e-4f56-a03a-a68e0fbe1e8d
Jan 22 00:19:05 compute-1 ovn_controller[94841]: 2026-01-22T00:19:05Z|00562|binding|INFO|Setting lport 89ad850c-a87f-489f-8c3e-51dfc078a374 ovn-installed in OVS
Jan 22 00:19:05 compute-1 ovn_controller[94841]: 2026-01-22T00:19:05Z|00563|binding|INFO|Setting lport 89ad850c-a87f-489f-8c3e-51dfc078a374 up in Southbound
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.561 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.563 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.570 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[95be9472-0f2e-4e50-baaa-ff8e2bd6ca84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.572 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7347045a-f1 in ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.577 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7347045a-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.577 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fa18d31b-49db-47e2-bda1-a880a4ef0798]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 systemd-machined[153970]: New machine qemu-63-instance-00000091.
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.579 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[38b1e154-5340-439f-8d7b-3dd8cf0841bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.592 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[db73a0cb-8765-4402-b6fd-4c6ff86cc0c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 systemd[1]: Started Virtual Machine qemu-63-instance-00000091.
Jan 22 00:19:05 compute-1 systemd-udevd[233133]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.610 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5514bb1c-25b6-4fed-97bc-6efe6872f25a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 NetworkManager[54952]: <info>  [1769041145.6241] device (tap89ad850c-a8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:19:05 compute-1 NetworkManager[54952]: <info>  [1769041145.6248] device (tap89ad850c-a8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.655 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc59343-87ea-4183-99c7-8a13adb07804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 NetworkManager[54952]: <info>  [1769041145.6656] manager: (tap7347045a-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/257)
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.665 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[324ea927-422e-4865-861c-0a79fa0ff7bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.695 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3b80c3-7d91-4942-88f8-2219af9b6938]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.700 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[9e48eb5d-a158-460e-97fb-bee1fe517646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 NetworkManager[54952]: <info>  [1769041145.7238] device (tap7347045a-f0): carrier: link connected
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.731 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b9cc295e-663e-4534-9457-1a618db15f2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.750 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3efd3c2b-692b-482c-92ea-7e3a5a856f6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7347045a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:da:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571837, 'reachable_time': 18445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233163, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.764 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7bdcf3f7-e691-4ff3-afc3-291aa6492404]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:da8d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 571837, 'tstamp': 571837}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233164, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.788 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0a39b717-5550-4ab2-ae8b-5189029f4e32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7347045a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:da:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571837, 'reachable_time': 18445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233165, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.822 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ce7772f1-8333-4a0e-bef7-72defa978b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.881 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9f2256db-d833-45aa-90c8-4ab87c609f65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.882 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7347045a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.882 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.883 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7347045a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:05 compute-1 kernel: tap7347045a-f0: entered promiscuous mode
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.885 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:05 compute-1 NetworkManager[54952]: <info>  [1769041145.8872] manager: (tap7347045a-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.887 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.889 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7347045a-f0, col_values=(('external_ids', {'iface-id': 'ce8d757a-1822-40f4-bb02-e88d8e0a4e11'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.890 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:05 compute-1 ovn_controller[94841]: 2026-01-22T00:19:05Z|00564|binding|INFO|Releasing lport ce8d757a-1822-40f4-bb02-e88d8e0a4e11 from this chassis (sb_readonly=0)
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.892 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.892 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.893 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[99b7a2b0-dd57-416a-a01e-6e0de26c17e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.894 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-7347045a-f38e-4f56-a03a-a68e0fbe1e8d
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.pid.haproxy
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 7347045a-f38e-4f56-a03a-a68e0fbe1e8d
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:19:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:05.894 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'env', 'PROCESS_TAG=haproxy-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:19:05 compute-1 nova_compute[182713]: 2026-01-22 00:19:05.903 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:06 compute-1 nova_compute[182713]: 2026-01-22 00:19:06.160 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041146.1590254, d59e0943-5372-4680-af52-c9af874c8578 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:06 compute-1 nova_compute[182713]: 2026-01-22 00:19:06.161 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] VM Started (Lifecycle Event)
Jan 22 00:19:06 compute-1 podman[233204]: 2026-01-22 00:19:06.26468845 +0000 UTC m=+0.024501014 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:19:06 compute-1 podman[233204]: 2026-01-22 00:19:06.493006967 +0000 UTC m=+0.252819431 container create 10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 00:19:06 compute-1 systemd[1]: Started libpod-conmon-10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4.scope.
Jan 22 00:19:06 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:19:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b628a545a8371d5bbcf9ccde4ebc20be87ef4d8ab3a1198ee197d02645045fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:19:06 compute-1 podman[233204]: 2026-01-22 00:19:06.744664972 +0000 UTC m=+0.504477516 container init 10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:19:06 compute-1 podman[233204]: 2026-01-22 00:19:06.752453121 +0000 UTC m=+0.512265615 container start 10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:19:06 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233221]: [NOTICE]   (233225) : New worker (233227) forked
Jan 22 00:19:06 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233221]: [NOTICE]   (233225) : Loading success.
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.395 182717 DEBUG nova.compute.manager [req-849490d5-8989-41e0-a73b-f6861e2bc9d3 req-d59030e7-8afc-4b16-8859-1464107997d3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.396 182717 DEBUG oslo_concurrency.lockutils [req-849490d5-8989-41e0-a73b-f6861e2bc9d3 req-d59030e7-8afc-4b16-8859-1464107997d3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.396 182717 DEBUG oslo_concurrency.lockutils [req-849490d5-8989-41e0-a73b-f6861e2bc9d3 req-d59030e7-8afc-4b16-8859-1464107997d3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.396 182717 DEBUG oslo_concurrency.lockutils [req-849490d5-8989-41e0-a73b-f6861e2bc9d3 req-d59030e7-8afc-4b16-8859-1464107997d3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.397 182717 DEBUG nova.compute.manager [req-849490d5-8989-41e0-a73b-f6861e2bc9d3 req-d59030e7-8afc-4b16-8859-1464107997d3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Processing event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.398 182717 DEBUG nova.compute.manager [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.402 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.407 182717 INFO nova.virt.libvirt.driver [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance spawned successfully.
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.407 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.468 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.472 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.473 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.473 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.474 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.474 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.474 182717 DEBUG nova.virt.libvirt.driver [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.478 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.575 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.575 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041146.1596122, d59e0943-5372-4680-af52-c9af874c8578 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.576 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] VM Paused (Lifecycle Event)
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.617 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.619 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041147.402304, d59e0943-5372-4680-af52-c9af874c8578 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.620 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] VM Resumed (Lifecycle Event)
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.646 182717 INFO nova.compute.manager [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Took 8.24 seconds to spawn the instance on the hypervisor.
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.646 182717 DEBUG nova.compute.manager [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.648 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.655 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.690 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.750 182717 INFO nova.compute.manager [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Took 9.14 seconds to build instance.
Jan 22 00:19:07 compute-1 nova_compute[182713]: 2026-01-22 00:19:07.787 182717 DEBUG oslo_concurrency.lockutils [None req-ebc170f1-84bb-4afa-8b55-273ef87015c4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:08 compute-1 nova_compute[182713]: 2026-01-22 00:19:08.815 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:09 compute-1 nova_compute[182713]: 2026-01-22 00:19:09.574 182717 DEBUG nova.compute.manager [req-94cdd4d9-2c6f-493c-81c8-c3a4087c7642 req-0aa37ac4-08b1-4f93-bf70-eb68f36edf6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:09 compute-1 nova_compute[182713]: 2026-01-22 00:19:09.575 182717 DEBUG oslo_concurrency.lockutils [req-94cdd4d9-2c6f-493c-81c8-c3a4087c7642 req-0aa37ac4-08b1-4f93-bf70-eb68f36edf6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:09 compute-1 nova_compute[182713]: 2026-01-22 00:19:09.575 182717 DEBUG oslo_concurrency.lockutils [req-94cdd4d9-2c6f-493c-81c8-c3a4087c7642 req-0aa37ac4-08b1-4f93-bf70-eb68f36edf6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:09 compute-1 nova_compute[182713]: 2026-01-22 00:19:09.575 182717 DEBUG oslo_concurrency.lockutils [req-94cdd4d9-2c6f-493c-81c8-c3a4087c7642 req-0aa37ac4-08b1-4f93-bf70-eb68f36edf6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:09 compute-1 nova_compute[182713]: 2026-01-22 00:19:09.576 182717 DEBUG nova.compute.manager [req-94cdd4d9-2c6f-493c-81c8-c3a4087c7642 req-0aa37ac4-08b1-4f93-bf70-eb68f36edf6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:09 compute-1 nova_compute[182713]: 2026-01-22 00:19:09.576 182717 WARNING nova.compute.manager [req-94cdd4d9-2c6f-493c-81c8-c3a4087c7642 req-0aa37ac4-08b1-4f93-bf70-eb68f36edf6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state active and task_state None.
Jan 22 00:19:12 compute-1 nova_compute[182713]: 2026-01-22 00:19:12.766 182717 DEBUG nova.compute.manager [req-a8bfefd3-751e-4166-8ca7-e0ff50f2e90a req-12cc35d4-039c-4416-ad69-99de77fe6466 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:12 compute-1 nova_compute[182713]: 2026-01-22 00:19:12.767 182717 DEBUG nova.compute.manager [req-a8bfefd3-751e-4166-8ca7-e0ff50f2e90a req-12cc35d4-039c-4416-ad69-99de77fe6466 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing instance network info cache due to event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:19:12 compute-1 nova_compute[182713]: 2026-01-22 00:19:12.768 182717 DEBUG oslo_concurrency.lockutils [req-a8bfefd3-751e-4166-8ca7-e0ff50f2e90a req-12cc35d4-039c-4416-ad69-99de77fe6466 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:12 compute-1 nova_compute[182713]: 2026-01-22 00:19:12.769 182717 DEBUG oslo_concurrency.lockutils [req-a8bfefd3-751e-4166-8ca7-e0ff50f2e90a req-12cc35d4-039c-4416-ad69-99de77fe6466 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:12 compute-1 nova_compute[182713]: 2026-01-22 00:19:12.769 182717 DEBUG nova.network.neutron [req-a8bfefd3-751e-4166-8ca7-e0ff50f2e90a req-12cc35d4-039c-4416-ad69-99de77fe6466 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:19:13 compute-1 podman[233237]: 2026-01-22 00:19:13.573891938 +0000 UTC m=+0.064576375 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:19:13 compute-1 podman[233236]: 2026-01-22 00:19:13.625516425 +0000 UTC m=+0.109087563 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:19:13 compute-1 nova_compute[182713]: 2026-01-22 00:19:13.817 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:19:15 compute-1 nova_compute[182713]: 2026-01-22 00:19:15.387 182717 DEBUG nova.network.neutron [req-a8bfefd3-751e-4166-8ca7-e0ff50f2e90a req-12cc35d4-039c-4416-ad69-99de77fe6466 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updated VIF entry in instance network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:19:15 compute-1 nova_compute[182713]: 2026-01-22 00:19:15.387 182717 DEBUG nova.network.neutron [req-a8bfefd3-751e-4166-8ca7-e0ff50f2e90a req-12cc35d4-039c-4416-ad69-99de77fe6466 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:15 compute-1 nova_compute[182713]: 2026-01-22 00:19:15.412 182717 DEBUG oslo_concurrency.lockutils [req-a8bfefd3-751e-4166-8ca7-e0ff50f2e90a req-12cc35d4-039c-4416-ad69-99de77fe6466 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:17 compute-1 ovn_controller[94841]: 2026-01-22T00:19:17Z|00565|binding|INFO|Releasing lport ce8d757a-1822-40f4-bb02-e88d8e0a4e11 from this chassis (sb_readonly=0)
Jan 22 00:19:17 compute-1 ovn_controller[94841]: 2026-01-22T00:19:17Z|00566|binding|INFO|Releasing lport ecd06410-36ac-42c3-b9e8-b57793dc7305 from this chassis (sb_readonly=0)
Jan 22 00:19:17 compute-1 ovn_controller[94841]: 2026-01-22T00:19:17Z|00567|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:19:17 compute-1 nova_compute[182713]: 2026-01-22 00:19:17.454 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:18 compute-1 nova_compute[182713]: 2026-01-22 00:19:18.821 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:19 compute-1 ovn_controller[94841]: 2026-01-22T00:19:19Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9d:17:06 10.100.0.5
Jan 22 00:19:19 compute-1 ovn_controller[94841]: 2026-01-22T00:19:19Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9d:17:06 10.100.0.5
Jan 22 00:19:19 compute-1 podman[233294]: 2026-01-22 00:19:19.583780833 +0000 UTC m=+0.073105897 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 22 00:19:19 compute-1 podman[233295]: 2026-01-22 00:19:19.59866422 +0000 UTC m=+0.084040704 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:19:23 compute-1 nova_compute[182713]: 2026-01-22 00:19:23.823 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:26 compute-1 nova_compute[182713]: 2026-01-22 00:19:26.012 182717 INFO nova.compute.manager [None req-175648f9-fa5c-4493-8c8b-b894c7f90d27 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Get console output
Jan 22 00:19:26 compute-1 nova_compute[182713]: 2026-01-22 00:19:26.019 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:19:26 compute-1 nova_compute[182713]: 2026-01-22 00:19:26.696 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:28 compute-1 nova_compute[182713]: 2026-01-22 00:19:28.047 182717 INFO nova.compute.manager [None req-481aff70-89b5-4c48-a2a7-7b22bcfb5cb3 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Get console output
Jan 22 00:19:28 compute-1 nova_compute[182713]: 2026-01-22 00:19:28.056 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:19:28 compute-1 nova_compute[182713]: 2026-01-22 00:19:28.827 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:30 compute-1 podman[233338]: 2026-01-22 00:19:30.612923165 +0000 UTC m=+0.097599861 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:19:32 compute-1 nova_compute[182713]: 2026-01-22 00:19:32.176 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:32 compute-1 nova_compute[182713]: 2026-01-22 00:19:32.900 182717 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:32 compute-1 nova_compute[182713]: 2026-01-22 00:19:32.900 182717 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:32 compute-1 nova_compute[182713]: 2026-01-22 00:19:32.901 182717 DEBUG nova.network.neutron [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:19:33 compute-1 podman[233356]: 2026-01-22 00:19:33.588076386 +0000 UTC m=+0.071070484 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:19:33 compute-1 nova_compute[182713]: 2026-01-22 00:19:33.830 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:34 compute-1 ovn_controller[94841]: 2026-01-22T00:19:34Z|00568|binding|INFO|Releasing lport ce8d757a-1822-40f4-bb02-e88d8e0a4e11 from this chassis (sb_readonly=0)
Jan 22 00:19:34 compute-1 ovn_controller[94841]: 2026-01-22T00:19:34Z|00569|binding|INFO|Releasing lport ecd06410-36ac-42c3-b9e8-b57793dc7305 from this chassis (sb_readonly=0)
Jan 22 00:19:34 compute-1 ovn_controller[94841]: 2026-01-22T00:19:34Z|00570|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:19:34 compute-1 nova_compute[182713]: 2026-01-22 00:19:34.067 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:35 compute-1 nova_compute[182713]: 2026-01-22 00:19:35.798 182717 DEBUG nova.network.neutron [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:35 compute-1 nova_compute[182713]: 2026-01-22 00:19:35.833 182717 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:35 compute-1 nova_compute[182713]: 2026-01-22 00:19:35.982 182717 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 00:19:35 compute-1 nova_compute[182713]: 2026-01-22 00:19:35.983 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Creating file /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/663dffe4d0984aef9e43bba3f504716e.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 22 00:19:35 compute-1 nova_compute[182713]: 2026-01-22 00:19:35.983 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/663dffe4d0984aef9e43bba3f504716e.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:36 compute-1 nova_compute[182713]: 2026-01-22 00:19:36.390 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/663dffe4d0984aef9e43bba3f504716e.tmp" returned: 1 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:36 compute-1 nova_compute[182713]: 2026-01-22 00:19:36.392 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/663dffe4d0984aef9e43bba3f504716e.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 00:19:36 compute-1 nova_compute[182713]: 2026-01-22 00:19:36.392 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Creating directory /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 22 00:19:36 compute-1 nova_compute[182713]: 2026-01-22 00:19:36.392 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:36 compute-1 nova_compute[182713]: 2026-01-22 00:19:36.611 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578" returned: 0 in 0.219s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:36 compute-1 nova_compute[182713]: 2026-01-22 00:19:36.617 182717 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:19:37 compute-1 nova_compute[182713]: 2026-01-22 00:19:37.727 182717 DEBUG oslo_concurrency.lockutils [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:37 compute-1 nova_compute[182713]: 2026-01-22 00:19:37.727 182717 DEBUG oslo_concurrency.lockutils [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:37 compute-1 nova_compute[182713]: 2026-01-22 00:19:37.728 182717 DEBUG oslo_concurrency.lockutils [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:37 compute-1 nova_compute[182713]: 2026-01-22 00:19:37.728 182717 DEBUG oslo_concurrency.lockutils [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:37 compute-1 nova_compute[182713]: 2026-01-22 00:19:37.728 182717 DEBUG oslo_concurrency.lockutils [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:37 compute-1 nova_compute[182713]: 2026-01-22 00:19:37.744 182717 INFO nova.compute.manager [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Terminating instance
Jan 22 00:19:37 compute-1 nova_compute[182713]: 2026-01-22 00:19:37.768 182717 DEBUG nova.compute.manager [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:19:37 compute-1 kernel: tapb3245964-fe (unregistering): left promiscuous mode
Jan 22 00:19:37 compute-1 NetworkManager[54952]: <info>  [1769041177.7956] device (tapb3245964-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:19:37 compute-1 ovn_controller[94841]: 2026-01-22T00:19:37Z|00571|binding|INFO|Releasing lport b3245964-feca-4b8b-b219-3a8b97cebae7 from this chassis (sb_readonly=0)
Jan 22 00:19:37 compute-1 ovn_controller[94841]: 2026-01-22T00:19:37Z|00572|binding|INFO|Setting lport b3245964-feca-4b8b-b219-3a8b97cebae7 down in Southbound
Jan 22 00:19:37 compute-1 nova_compute[182713]: 2026-01-22 00:19:37.805 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:37 compute-1 ovn_controller[94841]: 2026-01-22T00:19:37Z|00573|binding|INFO|Removing iface tapb3245964-fe ovn-installed in OVS
Jan 22 00:19:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:37.822 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:49:00 10.100.0.11'], port_security=['fa:16:3e:2c:49:00 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2ddf383f-fadd-4739-9a90-3db8bcb7cb2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a4c4578-e100-40e8-b037-1c4af043c44d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '4', 'neutron:security_group_ids': '729d6e1e-1a32-4037-a612-7401da3be40f cd748916-10a2-4939-a18d-bdf4af30e611', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=06481ad4-d1ea-4682-b4d3-a69e150c2ff0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=b3245964-feca-4b8b-b219-3a8b97cebae7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:19:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:37.825 104184 INFO neutron.agent.ovn.metadata.agent [-] Port b3245964-feca-4b8b-b219-3a8b97cebae7 in datapath 3a4c4578-e100-40e8-b037-1c4af043c44d unbound from our chassis
Jan 22 00:19:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:37.829 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a4c4578-e100-40e8-b037-1c4af043c44d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:19:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:37.831 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2cbdcb-0f18-452f-99af-027b9f142cde]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:37.832 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d namespace which is not needed anymore
Jan 22 00:19:37 compute-1 nova_compute[182713]: 2026-01-22 00:19:37.836 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:37 compute-1 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000008a.scope: Deactivated successfully.
Jan 22 00:19:37 compute-1 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000008a.scope: Consumed 16.490s CPU time.
Jan 22 00:19:37 compute-1 systemd-machined[153970]: Machine qemu-62-instance-0000008a terminated.
Jan 22 00:19:37 compute-1 neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d[232756]: [NOTICE]   (232760) : haproxy version is 2.8.14-c23fe91
Jan 22 00:19:37 compute-1 neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d[232756]: [NOTICE]   (232760) : path to executable is /usr/sbin/haproxy
Jan 22 00:19:37 compute-1 neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d[232756]: [WARNING]  (232760) : Exiting Master process...
Jan 22 00:19:37 compute-1 neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d[232756]: [ALERT]    (232760) : Current worker (232762) exited with code 143 (Terminated)
Jan 22 00:19:37 compute-1 neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d[232756]: [WARNING]  (232760) : All workers exited. Exiting... (0)
Jan 22 00:19:37 compute-1 systemd[1]: libpod-8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161.scope: Deactivated successfully.
Jan 22 00:19:37 compute-1 podman[233404]: 2026-01-22 00:19:37.984385008 +0000 UTC m=+0.047857082 container died 8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:19:37 compute-1 kernel: tapb3245964-fe: entered promiscuous mode
Jan 22 00:19:37 compute-1 kernel: tapb3245964-fe (unregistering): left promiscuous mode
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.001 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:38 compute-1 systemd[1]: var-lib-containers-storage-overlay-0fbc13cae5849fcd16f421e1b0771b55b510122306df92bf1f8ecfb7eb2d0c4e-merged.mount: Deactivated successfully.
Jan 22 00:19:38 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161-userdata-shm.mount: Deactivated successfully.
Jan 22 00:19:38 compute-1 podman[233404]: 2026-01-22 00:19:38.032225708 +0000 UTC m=+0.095697762 container cleanup 8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.037 182717 INFO nova.virt.libvirt.driver [-] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Instance destroyed successfully.
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.038 182717 DEBUG nova.objects.instance [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'resources' on Instance uuid 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:38 compute-1 systemd[1]: libpod-conmon-8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161.scope: Deactivated successfully.
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.059 182717 DEBUG nova.virt.libvirt.vif [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1890133969',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=138,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEC2kFjZ2LLODTqt2mxgHy7rJjxFe8ZrHUXm20i9JwMLGCcZBnlNfhY7KIi4qZ5U0omwJeXWnBlNjC6qoRQOBCrKoweqP5wnKVI5nSTyIGfNTjQz5x3kG940iTmU7L06SA==',key_name='tempest-TestSecurityGroupsBasicOps-720505212',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:17:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-c2o0kvzh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:17:59Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=2ddf383f-fadd-4739-9a90-3db8bcb7cb2a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.060 182717 DEBUG nova.network.os_vif_util [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.061 182717 DEBUG nova.network.os_vif_util [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2c:49:00,bridge_name='br-int',has_traffic_filtering=True,id=b3245964-feca-4b8b-b219-3a8b97cebae7,network=Network(3a4c4578-e100-40e8-b037-1c4af043c44d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3245964-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.062 182717 DEBUG os_vif [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2c:49:00,bridge_name='br-int',has_traffic_filtering=True,id=b3245964-feca-4b8b-b219-3a8b97cebae7,network=Network(3a4c4578-e100-40e8-b037-1c4af043c44d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3245964-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.064 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.065 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3245964-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.066 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.068 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.071 182717 INFO os_vif [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2c:49:00,bridge_name='br-int',has_traffic_filtering=True,id=b3245964-feca-4b8b-b219-3a8b97cebae7,network=Network(3a4c4578-e100-40e8-b037-1c4af043c44d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3245964-fe')
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.072 182717 INFO nova.virt.libvirt.driver [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Deleting instance files /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a_del
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.073 182717 INFO nova.virt.libvirt.driver [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Deletion of /var/lib/nova/instances/2ddf383f-fadd-4739-9a90-3db8bcb7cb2a_del complete
Jan 22 00:19:38 compute-1 podman[233448]: 2026-01-22 00:19:38.091964353 +0000 UTC m=+0.040090242 container remove 8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.097 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e93ac77d-5594-4ca2-9334-6ff3b3c0e7fe]: (4, ('Thu Jan 22 12:19:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d (8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161)\n8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161\nThu Jan 22 12:19:38 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d (8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161)\n8293174ef7a01285881378e40892f284a498aae0528b780df21bd4c6bca0d161\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.099 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e7dde5a1-821e-4fd5-9e40-35ed5d366a01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.100 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a4c4578-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:38 compute-1 kernel: tap3a4c4578-e0: left promiscuous mode
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.103 182717 DEBUG nova.compute.manager [req-2ed73feb-7d65-4bd3-b0be-b7200c9ee6bb req-006350cc-7831-4bde-b42f-3b16e54ef225 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Received event network-vif-unplugged-b3245964-feca-4b8b-b219-3a8b97cebae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.103 182717 DEBUG oslo_concurrency.lockutils [req-2ed73feb-7d65-4bd3-b0be-b7200c9ee6bb req-006350cc-7831-4bde-b42f-3b16e54ef225 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.104 182717 DEBUG oslo_concurrency.lockutils [req-2ed73feb-7d65-4bd3-b0be-b7200c9ee6bb req-006350cc-7831-4bde-b42f-3b16e54ef225 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.104 182717 DEBUG oslo_concurrency.lockutils [req-2ed73feb-7d65-4bd3-b0be-b7200c9ee6bb req-006350cc-7831-4bde-b42f-3b16e54ef225 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.104 182717 DEBUG nova.compute.manager [req-2ed73feb-7d65-4bd3-b0be-b7200c9ee6bb req-006350cc-7831-4bde-b42f-3b16e54ef225 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] No waiting events found dispatching network-vif-unplugged-b3245964-feca-4b8b-b219-3a8b97cebae7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.104 182717 DEBUG nova.compute.manager [req-2ed73feb-7d65-4bd3-b0be-b7200c9ee6bb req-006350cc-7831-4bde-b42f-3b16e54ef225 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Received event network-vif-unplugged-b3245964-feca-4b8b-b219-3a8b97cebae7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.105 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.117 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.119 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3141c6fc-461f-4d93-a962-c162fb950a07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.136 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e53fd4af-72ad-4550-adcf-eb7e775479bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.137 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b4174ddf-fb07-4814-9c9d-57d3f7c53e74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.152 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[615a2da5-cfde-46bd-b20e-70e5e2cb2e0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565064, 'reachable_time': 39913, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233463, 'error': None, 'target': 'ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.154 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3a4c4578-e100-40e8-b037-1c4af043c44d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.154 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[3414e207-44fe-4b21-a702-b8aa0047d2ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:38 compute-1 systemd[1]: run-netns-ovnmeta\x2d3a4c4578\x2de100\x2d40e8\x2db037\x2d1c4af043c44d.mount: Deactivated successfully.
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.163 182717 INFO nova.compute.manager [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.164 182717 DEBUG oslo.service.loopingcall [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.164 182717 DEBUG nova.compute.manager [-] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.165 182717 DEBUG nova.network.neutron [-] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:19:38 compute-1 kernel: tap89ad850c-a8 (unregistering): left promiscuous mode
Jan 22 00:19:38 compute-1 NetworkManager[54952]: <info>  [1769041178.7867] device (tap89ad850c-a8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.793 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:38 compute-1 ovn_controller[94841]: 2026-01-22T00:19:38Z|00574|binding|INFO|Releasing lport 89ad850c-a87f-489f-8c3e-51dfc078a374 from this chassis (sb_readonly=0)
Jan 22 00:19:38 compute-1 ovn_controller[94841]: 2026-01-22T00:19:38Z|00575|binding|INFO|Setting lport 89ad850c-a87f-489f-8c3e-51dfc078a374 down in Southbound
Jan 22 00:19:38 compute-1 ovn_controller[94841]: 2026-01-22T00:19:38Z|00576|binding|INFO|Removing iface tap89ad850c-a8 ovn-installed in OVS
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.796 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.802 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:17:06 10.100.0.5'], port_security=['fa:16:3e:9d:17:06 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd59e0943-5372-4680-af52-c9af874c8578', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1bc50146-1f14-43fa-a2db-2904419fa654', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.194'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f19555ab-2ed1-467b-9e13-e9518e9577aa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=89ad850c-a87f-489f-8c3e-51dfc078a374) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.804 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 89ad850c-a87f-489f-8c3e-51dfc078a374 in datapath 7347045a-f38e-4f56-a03a-a68e0fbe1e8d unbound from our chassis
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.809 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7347045a-f38e-4f56-a03a-a68e0fbe1e8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.811 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c719b7bf-e935-471f-baf9-1ba711c4dd1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:38.812 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d namespace which is not needed anymore
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.824 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:38 compute-1 nova_compute[182713]: 2026-01-22 00:19:38.832 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:38 compute-1 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000091.scope: Deactivated successfully.
Jan 22 00:19:38 compute-1 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000091.scope: Consumed 13.693s CPU time.
Jan 22 00:19:38 compute-1 systemd-machined[153970]: Machine qemu-63-instance-00000091 terminated.
Jan 22 00:19:38 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233221]: [NOTICE]   (233225) : haproxy version is 2.8.14-c23fe91
Jan 22 00:19:38 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233221]: [NOTICE]   (233225) : path to executable is /usr/sbin/haproxy
Jan 22 00:19:38 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233221]: [WARNING]  (233225) : Exiting Master process...
Jan 22 00:19:38 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233221]: [WARNING]  (233225) : Exiting Master process...
Jan 22 00:19:38 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233221]: [ALERT]    (233225) : Current worker (233227) exited with code 143 (Terminated)
Jan 22 00:19:38 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233221]: [WARNING]  (233225) : All workers exited. Exiting... (0)
Jan 22 00:19:38 compute-1 systemd[1]: libpod-10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4.scope: Deactivated successfully.
Jan 22 00:19:38 compute-1 podman[233485]: 2026-01-22 00:19:38.97618476 +0000 UTC m=+0.056502398 container died 10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:19:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-6b628a545a8371d5bbcf9ccde4ebc20be87ef4d8ab3a1198ee197d02645045fc-merged.mount: Deactivated successfully.
Jan 22 00:19:39 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4-userdata-shm.mount: Deactivated successfully.
Jan 22 00:19:39 compute-1 podman[233485]: 2026-01-22 00:19:39.006731108 +0000 UTC m=+0.087048726 container cleanup 10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 00:19:39 compute-1 systemd[1]: libpod-conmon-10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4.scope: Deactivated successfully.
Jan 22 00:19:39 compute-1 podman[233516]: 2026-01-22 00:19:39.070489678 +0000 UTC m=+0.043835189 container remove 10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:19:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:39.078 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4e640c70-44dd-43d4-8315-e63327ae6dcd]: (4, ('Thu Jan 22 12:19:38 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d (10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4)\n10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4\nThu Jan 22 12:19:39 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d (10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4)\n10fe4b75e29acd74c80c31cf8870202c8eea5c8aa9745d31ab24b1e7bf334da4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:39.080 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[65459c85-9606-4917-8fdc-6c315020da8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:39.080 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7347045a-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.082 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:39 compute-1 kernel: tap7347045a-f0: left promiscuous mode
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.096 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.097 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:39.100 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[66d8fe49-f86f-4a2f-9724-90b69530621e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:39.119 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7cd2926d-5247-45ec-9abf-c7cea958c899]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:39.120 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[10a1b987-d0a7-4ec8-8589-f622a6e3076c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:39.136 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f83e4593-015f-406a-a6b3-7bfcf2f4c750]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571829, 'reachable_time': 29521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233548, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:39.139 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:19:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:39.139 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[6009858c-1f23-4416-a2d9-3ae561440072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:39 compute-1 systemd[1]: run-netns-ovnmeta\x2d7347045a\x2df38e\x2d4f56\x2da03a\x2da68e0fbe1e8d.mount: Deactivated successfully.
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.527 182717 DEBUG nova.compute.manager [req-a654b444-263d-4553-893f-51314750e73d req-30ade7c9-47e7-4e6e-928c-e40d9ee73668 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Received event network-changed-b3245964-feca-4b8b-b219-3a8b97cebae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.527 182717 DEBUG nova.compute.manager [req-a654b444-263d-4553-893f-51314750e73d req-30ade7c9-47e7-4e6e-928c-e40d9ee73668 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Refreshing instance network info cache due to event network-changed-b3245964-feca-4b8b-b219-3a8b97cebae7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.527 182717 DEBUG oslo_concurrency.lockutils [req-a654b444-263d-4553-893f-51314750e73d req-30ade7c9-47e7-4e6e-928c-e40d9ee73668 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.528 182717 DEBUG oslo_concurrency.lockutils [req-a654b444-263d-4553-893f-51314750e73d req-30ade7c9-47e7-4e6e-928c-e40d9ee73668 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.528 182717 DEBUG nova.network.neutron [req-a654b444-263d-4553-893f-51314750e73d req-30ade7c9-47e7-4e6e-928c-e40d9ee73668 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Refreshing network info cache for port b3245964-feca-4b8b-b219-3a8b97cebae7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.638 182717 INFO nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance shutdown successfully after 3 seconds.
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.645 182717 INFO nova.virt.libvirt.driver [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance destroyed successfully.
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.646 182717 DEBUG nova.virt.libvirt.vif [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:18:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2079598704',display_name='tempest-TestNetworkAdvancedServerOps-server-2079598704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2079598704',id=145,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIY3eqmdW0m2q20hwTxy7fCq5RPOCY+KqJLqriFpcPzIAlQnzQNfW6TIp9Y1voEv/PtpLpDAT0kqBnGToo/qNh+oTys/PEZ/7XtlTWunC6nPRFTGOxMn536DUj7Tail8LA==',key_name='tempest-TestNetworkAdvancedServerOps-1230336080',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:19:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-zrj70p8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:19:31Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=d59e0943-5372-4680-af52-c9af874c8578,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1118354446", "vif_mac": "fa:16:3e:9d:17:06"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.646 182717 DEBUG nova.network.os_vif_util [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converting VIF {"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1118354446", "vif_mac": "fa:16:3e:9d:17:06"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.647 182717 DEBUG nova.network.os_vif_util [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.648 182717 DEBUG os_vif [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.650 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.650 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89ad850c-a8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.652 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.654 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.656 182717 INFO os_vif [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8')
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.661 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.755 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.757 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.835 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.837 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Copying file /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_resize/disk to 192.168.122.100:/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:19:39 compute-1 nova_compute[182713]: 2026-01-22 00:19:39.838 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_resize/disk 192.168.122.100:/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.290 182717 DEBUG nova.compute.manager [req-090459b1-770f-4aa5-a4b7-9ba737aa7312 req-30529cbf-6252-431d-8ac3-c2e7c91a211c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Received event network-vif-plugged-b3245964-feca-4b8b-b219-3a8b97cebae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.291 182717 DEBUG oslo_concurrency.lockutils [req-090459b1-770f-4aa5-a4b7-9ba737aa7312 req-30529cbf-6252-431d-8ac3-c2e7c91a211c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.292 182717 DEBUG oslo_concurrency.lockutils [req-090459b1-770f-4aa5-a4b7-9ba737aa7312 req-30529cbf-6252-431d-8ac3-c2e7c91a211c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.292 182717 DEBUG oslo_concurrency.lockutils [req-090459b1-770f-4aa5-a4b7-9ba737aa7312 req-30529cbf-6252-431d-8ac3-c2e7c91a211c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.293 182717 DEBUG nova.compute.manager [req-090459b1-770f-4aa5-a4b7-9ba737aa7312 req-30529cbf-6252-431d-8ac3-c2e7c91a211c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] No waiting events found dispatching network-vif-plugged-b3245964-feca-4b8b-b219-3a8b97cebae7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.293 182717 WARNING nova.compute.manager [req-090459b1-770f-4aa5-a4b7-9ba737aa7312 req-30529cbf-6252-431d-8ac3-c2e7c91a211c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Received unexpected event network-vif-plugged-b3245964-feca-4b8b-b219-3a8b97cebae7 for instance with vm_state active and task_state deleting.
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.403 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "scp -r /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_resize/disk 192.168.122.100:/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk" returned: 0 in 0.565s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.403 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Copying file /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.404 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_resize/disk.config 192.168.122.100:/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.610 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "scp -C -r /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_resize/disk.config 192.168.122.100:/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.612 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Copying file /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.612 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_resize/disk.info 192.168.122.100:/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:40 compute-1 nova_compute[182713]: 2026-01-22 00:19:40.817 182717 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "scp -C -r /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_resize/disk.info 192.168.122.100:/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info" returned: 0 in 0.205s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.190 182717 DEBUG nova.network.neutron [-] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.233 182717 INFO nova.compute.manager [-] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Took 3.07 seconds to deallocate network for instance.
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.297 182717 DEBUG neutronclient.v2_0.client [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 89ad850c-a87f-489f-8c3e-51dfc078a374 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.482 182717 DEBUG oslo_concurrency.lockutils [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.483 182717 DEBUG oslo_concurrency.lockutils [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.567 182717 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.568 182717 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.568 182717 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.627 182717 DEBUG nova.compute.manager [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.628 182717 DEBUG oslo_concurrency.lockutils [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.629 182717 DEBUG oslo_concurrency.lockutils [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.629 182717 DEBUG oslo_concurrency.lockutils [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.630 182717 DEBUG nova.compute.manager [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.630 182717 WARNING nova.compute.manager [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state active and task_state resize_migrated.
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.631 182717 DEBUG nova.compute.manager [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Received event network-vif-deleted-b3245964-feca-4b8b-b219-3a8b97cebae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.658 182717 DEBUG nova.compute.provider_tree [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.674 182717 DEBUG nova.scheduler.client.report [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.698 182717 DEBUG oslo_concurrency.lockutils [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.724 182717 INFO nova.scheduler.client.report [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Deleted allocations for instance 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a
Jan 22 00:19:41 compute-1 nova_compute[182713]: 2026-01-22 00:19:41.819 182717 DEBUG oslo_concurrency.lockutils [None req-1a065704-7193-459a-b687-abfba97c5be6 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:42 compute-1 nova_compute[182713]: 2026-01-22 00:19:42.087 182717 DEBUG nova.network.neutron [req-a654b444-263d-4553-893f-51314750e73d req-30ade7c9-47e7-4e6e-928c-e40d9ee73668 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Updated VIF entry in instance network info cache for port b3245964-feca-4b8b-b219-3a8b97cebae7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:19:42 compute-1 nova_compute[182713]: 2026-01-22 00:19:42.087 182717 DEBUG nova.network.neutron [req-a654b444-263d-4553-893f-51314750e73d req-30ade7c9-47e7-4e6e-928c-e40d9ee73668 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Updating instance_info_cache with network_info: [{"id": "b3245964-feca-4b8b-b219-3a8b97cebae7", "address": "fa:16:3e:2c:49:00", "network": {"id": "3a4c4578-e100-40e8-b037-1c4af043c44d", "bridge": "br-int", "label": "tempest-network-smoke--1622567996", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3245964-fe", "ovs_interfaceid": "b3245964-feca-4b8b-b219-3a8b97cebae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:42 compute-1 nova_compute[182713]: 2026-01-22 00:19:42.122 182717 DEBUG oslo_concurrency.lockutils [req-a654b444-263d-4553-893f-51314750e73d req-30ade7c9-47e7-4e6e-928c-e40d9ee73668 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2ddf383f-fadd-4739-9a90-3db8bcb7cb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:42.797 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:19:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:42.798 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:19:42 compute-1 nova_compute[182713]: 2026-01-22 00:19:42.799 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.323 182717 DEBUG nova.compute.manager [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.324 182717 DEBUG nova.compute.manager [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing instance network info cache due to event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.325 182717 DEBUG oslo_concurrency.lockutils [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.325 182717 DEBUG oslo_concurrency.lockutils [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.326 182717 DEBUG nova.network.neutron [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.757 182717 DEBUG nova.compute.manager [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.758 182717 DEBUG oslo_concurrency.lockutils [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.758 182717 DEBUG oslo_concurrency.lockutils [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.758 182717 DEBUG oslo_concurrency.lockutils [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.759 182717 DEBUG nova.compute.manager [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.759 182717 WARNING nova.compute.manager [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state active and task_state resize_migrated.
Jan 22 00:19:43 compute-1 nova_compute[182713]: 2026-01-22 00:19:43.835 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:44 compute-1 podman[233562]: 2026-01-22 00:19:44.582351745 +0000 UTC m=+0.064698519 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:19:44 compute-1 podman[233561]: 2026-01-22 00:19:44.612366088 +0000 UTC m=+0.103038698 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 22 00:19:44 compute-1 nova_compute[182713]: 2026-01-22 00:19:44.651 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:45 compute-1 nova_compute[182713]: 2026-01-22 00:19:45.314 182717 DEBUG nova.network.neutron [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updated VIF entry in instance network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:19:45 compute-1 nova_compute[182713]: 2026-01-22 00:19:45.315 182717 DEBUG nova.network.neutron [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:45 compute-1 nova_compute[182713]: 2026-01-22 00:19:45.351 182717 DEBUG oslo_concurrency.lockutils [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:45 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:45.801 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:45 compute-1 nova_compute[182713]: 2026-01-22 00:19:45.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:45 compute-1 nova_compute[182713]: 2026-01-22 00:19:45.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:19:46 compute-1 nova_compute[182713]: 2026-01-22 00:19:46.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:46 compute-1 nova_compute[182713]: 2026-01-22 00:19:46.887 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:46 compute-1 nova_compute[182713]: 2026-01-22 00:19:46.888 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:47 compute-1 nova_compute[182713]: 2026-01-22 00:19:47.360 182717 DEBUG nova.compute.manager [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:47 compute-1 nova_compute[182713]: 2026-01-22 00:19:47.361 182717 DEBUG oslo_concurrency.lockutils [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:47 compute-1 nova_compute[182713]: 2026-01-22 00:19:47.361 182717 DEBUG oslo_concurrency.lockutils [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:47 compute-1 nova_compute[182713]: 2026-01-22 00:19:47.362 182717 DEBUG oslo_concurrency.lockutils [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:47 compute-1 nova_compute[182713]: 2026-01-22 00:19:47.362 182717 DEBUG nova.compute.manager [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:47 compute-1 nova_compute[182713]: 2026-01-22 00:19:47.362 182717 WARNING nova.compute.manager [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state active and task_state resize_finish.
Jan 22 00:19:47 compute-1 ovn_controller[94841]: 2026-01-22T00:19:47Z|00577|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:19:47 compute-1 nova_compute[182713]: 2026-01-22 00:19:47.831 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:48 compute-1 ovn_controller[94841]: 2026-01-22T00:19:48Z|00578|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:19:48 compute-1 nova_compute[182713]: 2026-01-22 00:19:48.021 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:48 compute-1 nova_compute[182713]: 2026-01-22 00:19:48.838 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:49 compute-1 nova_compute[182713]: 2026-01-22 00:19:49.474 182717 DEBUG nova.compute.manager [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:49 compute-1 nova_compute[182713]: 2026-01-22 00:19:49.475 182717 DEBUG oslo_concurrency.lockutils [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:49 compute-1 nova_compute[182713]: 2026-01-22 00:19:49.475 182717 DEBUG oslo_concurrency.lockutils [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:49 compute-1 nova_compute[182713]: 2026-01-22 00:19:49.475 182717 DEBUG oslo_concurrency.lockutils [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:49 compute-1 nova_compute[182713]: 2026-01-22 00:19:49.476 182717 DEBUG nova.compute.manager [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:49 compute-1 nova_compute[182713]: 2026-01-22 00:19:49.476 182717 WARNING nova.compute.manager [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state resized and task_state None.
Jan 22 00:19:49 compute-1 nova_compute[182713]: 2026-01-22 00:19:49.654 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:50 compute-1 podman[233613]: 2026-01-22 00:19:50.570312795 +0000 UTC m=+0.055203977 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:19:50 compute-1 podman[233612]: 2026-01-22 00:19:50.626514853 +0000 UTC m=+0.105863785 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:19:50 compute-1 nova_compute[182713]: 2026-01-22 00:19:50.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:50 compute-1 nova_compute[182713]: 2026-01-22 00:19:50.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:51 compute-1 nova_compute[182713]: 2026-01-22 00:19:51.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:51 compute-1 nova_compute[182713]: 2026-01-22 00:19:51.896 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:51 compute-1 nova_compute[182713]: 2026-01-22 00:19:51.897 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:51 compute-1 nova_compute[182713]: 2026-01-22 00:19:51.897 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:51 compute-1 nova_compute[182713]: 2026-01-22 00:19:51.898 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:19:51 compute-1 nova_compute[182713]: 2026-01-22 00:19:51.996 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.065 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.066 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.129 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.134 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000091, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.314 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.315 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5532MB free_disk=73.20232772827148GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.315 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.316 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.376 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Migration for instance d59e0943-5372-4680-af52-c9af874c8578 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.400 182717 INFO nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating resource usage from migration 70071767-0c89-48d9-81cd-be888f6d084e
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.401 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Starting to track outgoing migration 70071767-0c89-48d9-81cd-be888f6d084e with flavor c3389c03-89c4-4ff5-9e03-1a99d41713d4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.437 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.437 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Migration 70071767-0c89-48d9-81cd-be888f6d084e is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.438 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.438 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.471 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.492 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.493 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.508 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.529 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.612 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.628 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.654 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:19:52 compute-1 nova_compute[182713]: 2026-01-22 00:19:52.654 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.036 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041178.0347836, 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.037 182717 INFO nova.compute.manager [-] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] VM Stopped (Lifecycle Event)
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.143 182717 DEBUG nova.compute.manager [None req-705c067f-5f3a-48a5-88d5-4d6ba2bc0cf9 - - - - - -] [instance: 2ddf383f-fadd-4739-9a90-3db8bcb7cb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.573 182717 INFO nova.compute.manager [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Swapping old allocation on dict_keys(['39680711-70c9-4df1-ae59-25e54fac688d']) held by migration 70071767-0c89-48d9-81cd-be888f6d084e for instance
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.612 182717 DEBUG nova.scheduler.client.report [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Overwriting current allocation {'allocations': {'5f09a77c-505f-4bd3-ac26-41f43ebdf535': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 75}}, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'consumer_generation': 1} on consumer d59e0943-5372-4680-af52-c9af874c8578 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.663 182717 DEBUG nova.compute.manager [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.664 182717 DEBUG oslo_concurrency.lockutils [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.665 182717 DEBUG oslo_concurrency.lockutils [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.665 182717 DEBUG oslo_concurrency.lockutils [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.666 182717 DEBUG nova.compute.manager [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.666 182717 WARNING nova.compute.manager [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:19:53 compute-1 nova_compute[182713]: 2026-01-22 00:19:53.839 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:54 compute-1 nova_compute[182713]: 2026-01-22 00:19:54.017 182717 INFO nova.network.neutron [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating port 89ad850c-a87f-489f-8c3e-51dfc078a374 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 22 00:19:54 compute-1 nova_compute[182713]: 2026-01-22 00:19:54.048 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041179.0468104, d59e0943-5372-4680-af52-c9af874c8578 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:54 compute-1 nova_compute[182713]: 2026-01-22 00:19:54.048 182717 INFO nova.compute.manager [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] VM Stopped (Lifecycle Event)
Jan 22 00:19:54 compute-1 nova_compute[182713]: 2026-01-22 00:19:54.068 182717 DEBUG nova.compute.manager [None req-9c9ff023-78b1-4c14-a37e-b01be49ea136 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:54 compute-1 nova_compute[182713]: 2026-01-22 00:19:54.072 182717 DEBUG nova.compute.manager [None req-9c9ff023-78b1-4c14-a37e-b01be49ea136 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:19:54 compute-1 nova_compute[182713]: 2026-01-22 00:19:54.098 182717 INFO nova.compute.manager [None req-9c9ff023-78b1-4c14-a37e-b01be49ea136 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 22 00:19:54 compute-1 nova_compute[182713]: 2026-01-22 00:19:54.656 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:55 compute-1 nova_compute[182713]: 2026-01-22 00:19:55.655 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:55 compute-1 nova_compute[182713]: 2026-01-22 00:19:55.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:55 compute-1 nova_compute[182713]: 2026-01-22 00:19:55.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:19:55 compute-1 nova_compute[182713]: 2026-01-22 00:19:55.904 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:19:55 compute-1 nova_compute[182713]: 2026-01-22 00:19:55.971 182717 DEBUG nova.compute.manager [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:55 compute-1 nova_compute[182713]: 2026-01-22 00:19:55.972 182717 DEBUG oslo_concurrency.lockutils [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:55 compute-1 nova_compute[182713]: 2026-01-22 00:19:55.972 182717 DEBUG oslo_concurrency.lockutils [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:55 compute-1 nova_compute[182713]: 2026-01-22 00:19:55.973 182717 DEBUG oslo_concurrency.lockutils [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:55 compute-1 nova_compute[182713]: 2026-01-22 00:19:55.973 182717 DEBUG nova.compute.manager [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:55 compute-1 nova_compute[182713]: 2026-01-22 00:19:55.974 182717 WARNING nova.compute.manager [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:19:56 compute-1 nova_compute[182713]: 2026-01-22 00:19:56.230 182717 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:56 compute-1 nova_compute[182713]: 2026-01-22 00:19:56.231 182717 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:56 compute-1 nova_compute[182713]: 2026-01-22 00:19:56.231 182717 DEBUG nova.network.neutron [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:19:57 compute-1 nova_compute[182713]: 2026-01-22 00:19:57.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.091 182717 DEBUG nova.compute.manager [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.092 182717 DEBUG nova.compute.manager [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing instance network info cache due to event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.092 182717 DEBUG oslo_concurrency.lockutils [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.310 182717 DEBUG nova.network.neutron [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.338 182717 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.338 182717 DEBUG nova.virt.libvirt.driver [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.341 182717 DEBUG oslo_concurrency.lockutils [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.342 182717 DEBUG nova.network.neutron [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.350 182717 DEBUG nova.virt.libvirt.driver [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Start _get_guest_xml network_info=[{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.354 182717 WARNING nova.virt.libvirt.driver [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.361 182717 DEBUG nova.virt.libvirt.host [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.362 182717 DEBUG nova.virt.libvirt.host [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.365 182717 DEBUG nova.virt.libvirt.host [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.365 182717 DEBUG nova.virt.libvirt.host [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.366 182717 DEBUG nova.virt.libvirt.driver [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.367 182717 DEBUG nova.virt.hardware [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.367 182717 DEBUG nova.virt.hardware [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.367 182717 DEBUG nova.virt.hardware [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.368 182717 DEBUG nova.virt.hardware [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.368 182717 DEBUG nova.virt.hardware [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.368 182717 DEBUG nova.virt.hardware [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.368 182717 DEBUG nova.virt.hardware [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.368 182717 DEBUG nova.virt.hardware [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.369 182717 DEBUG nova.virt.hardware [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.369 182717 DEBUG nova.virt.hardware [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.369 182717 DEBUG nova.virt.hardware [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.369 182717 DEBUG nova.objects.instance [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'vcpu_model' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.391 182717 DEBUG oslo_concurrency.processutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.466 182717 DEBUG oslo_concurrency.processutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.467 182717 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.467 182717 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.468 182717 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.471 182717 DEBUG nova.virt.libvirt.vif [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:18:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2079598704',display_name='tempest-TestNetworkAdvancedServerOps-server-2079598704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2079598704',id=145,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIY3eqmdW0m2q20hwTxy7fCq5RPOCY+KqJLqriFpcPzIAlQnzQNfW6TIp9Y1voEv/PtpLpDAT0kqBnGToo/qNh+oTys/PEZ/7XtlTWunC6nPRFTGOxMn536DUj7Tail8LA==',key_name='tempest-TestNetworkAdvancedServerOps-1230336080',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:19:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-zrj70p8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:19:52Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=d59e0943-5372-4680-af52-c9af874c8578,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.471 182717 DEBUG nova.network.os_vif_util [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.472 182717 DEBUG nova.network.os_vif_util [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.475 182717 DEBUG nova.virt.libvirt.driver [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:19:58 compute-1 nova_compute[182713]:   <uuid>d59e0943-5372-4680-af52-c9af874c8578</uuid>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   <name>instance-00000091</name>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-2079598704</nova:name>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:19:58</nova:creationTime>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:19:58 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:19:58 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:19:58 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:19:58 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:19:58 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:19:58 compute-1 nova_compute[182713]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:19:58 compute-1 nova_compute[182713]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:19:58 compute-1 nova_compute[182713]:         <nova:port uuid="89ad850c-a87f-489f-8c3e-51dfc078a374">
Jan 22 00:19:58 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <system>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <entry name="serial">d59e0943-5372-4680-af52-c9af874c8578</entry>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <entry name="uuid">d59e0943-5372-4680-af52-c9af874c8578</entry>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     </system>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   <os>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   </os>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   <features>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   </features>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:9d:17:06"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <target dev="tap89ad850c-a8"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/console.log" append="off"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <video>
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     </video>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <input type="keyboard" bus="usb"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:19:58 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:19:58 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:19:58 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:19:58 compute-1 nova_compute[182713]: </domain>
Jan 22 00:19:58 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.476 182717 DEBUG nova.compute.manager [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Preparing to wait for external event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.476 182717 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.476 182717 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.477 182717 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.477 182717 DEBUG nova.virt.libvirt.vif [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:18:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2079598704',display_name='tempest-TestNetworkAdvancedServerOps-server-2079598704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2079598704',id=145,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIY3eqmdW0m2q20hwTxy7fCq5RPOCY+KqJLqriFpcPzIAlQnzQNfW6TIp9Y1voEv/PtpLpDAT0kqBnGToo/qNh+oTys/PEZ/7XtlTWunC6nPRFTGOxMn536DUj7Tail8LA==',key_name='tempest-TestNetworkAdvancedServerOps-1230336080',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:19:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-zrj70p8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:19:52Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=d59e0943-5372-4680-af52-c9af874c8578,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.478 182717 DEBUG nova.network.os_vif_util [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.478 182717 DEBUG nova.network.os_vif_util [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.478 182717 DEBUG os_vif [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.479 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.479 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.480 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.485 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.485 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89ad850c-a8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.486 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap89ad850c-a8, col_values=(('external_ids', {'iface-id': '89ad850c-a87f-489f-8c3e-51dfc078a374', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9d:17:06', 'vm-uuid': 'd59e0943-5372-4680-af52-c9af874c8578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:58 compute-1 NetworkManager[54952]: <info>  [1769041198.5310] manager: (tap89ad850c-a8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.530 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.534 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.535 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.536 182717 INFO os_vif [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8')
Jan 22 00:19:58 compute-1 kernel: tap89ad850c-a8: entered promiscuous mode
Jan 22 00:19:58 compute-1 NetworkManager[54952]: <info>  [1769041198.6147] manager: (tap89ad850c-a8): new Tun device (/org/freedesktop/NetworkManager/Devices/260)
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.614 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 ovn_controller[94841]: 2026-01-22T00:19:58Z|00579|binding|INFO|Claiming lport 89ad850c-a87f-489f-8c3e-51dfc078a374 for this chassis.
Jan 22 00:19:58 compute-1 ovn_controller[94841]: 2026-01-22T00:19:58Z|00580|binding|INFO|89ad850c-a87f-489f-8c3e-51dfc078a374: Claiming fa:16:3e:9d:17:06 10.100.0.5
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.620 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.626 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 NetworkManager[54952]: <info>  [1769041198.6351] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Jan 22 00:19:58 compute-1 NetworkManager[54952]: <info>  [1769041198.6355] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.634 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.638 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:17:06 10.100.0.5'], port_security=['fa:16:3e:9d:17:06 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd59e0943-5372-4680-af52-c9af874c8578', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '10', 'neutron:security_group_ids': '1bc50146-1f14-43fa-a2db-2904419fa654', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.194'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f19555ab-2ed1-467b-9e13-e9518e9577aa, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=89ad850c-a87f-489f-8c3e-51dfc078a374) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.639 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 89ad850c-a87f-489f-8c3e-51dfc078a374 in datapath 7347045a-f38e-4f56-a03a-a68e0fbe1e8d bound to our chassis
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.641 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7347045a-f38e-4f56-a03a-a68e0fbe1e8d
Jan 22 00:19:58 compute-1 systemd-udevd[233682]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.657 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b72d87ec-6fe8-44e2-8a19-93f16d644f27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.658 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7347045a-f1 in ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.660 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7347045a-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.661 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6b114033-a31e-46ae-bc29-01556cc582c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 systemd-machined[153970]: New machine qemu-64-instance-00000091.
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.662 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[47cc1655-d67a-44a4-9072-6f7f5e085896]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 NetworkManager[54952]: <info>  [1769041198.6655] device (tap89ad850c-a8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:19:58 compute-1 NetworkManager[54952]: <info>  [1769041198.6660] device (tap89ad850c-a8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.673 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[00c4d3d2-22a5-40e1-a249-0d9ce8297469]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 systemd[1]: Started Virtual Machine qemu-64-instance-00000091.
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.699 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1adedf25-1142-4160-b46a-38628d293621]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.753 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[3965778e-abf7-476e-a9d9-cce178e6497d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.757 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.762 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[40d91ab7-130d-4767-9dca-3608f8368494]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 ovn_controller[94841]: 2026-01-22T00:19:58Z|00581|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:19:58 compute-1 NetworkManager[54952]: <info>  [1769041198.7700] manager: (tap7347045a-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/263)
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.784 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 ovn_controller[94841]: 2026-01-22T00:19:58Z|00582|binding|INFO|Setting lport 89ad850c-a87f-489f-8c3e-51dfc078a374 ovn-installed in OVS
Jan 22 00:19:58 compute-1 ovn_controller[94841]: 2026-01-22T00:19:58Z|00583|binding|INFO|Setting lport 89ad850c-a87f-489f-8c3e-51dfc078a374 up in Southbound
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.796 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.807 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[5b8ddf29-201a-4ad3-908d-5991c62d6e1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.811 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b37f5548-4fd9-4110-a92e-f6fa0f418086]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 NetworkManager[54952]: <info>  [1769041198.8342] device (tap7347045a-f0): carrier: link connected
Jan 22 00:19:58 compute-1 nova_compute[182713]: 2026-01-22 00:19:58.841 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.841 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[bd92ca2e-7480-4000-8a88-b0c4b30dcea5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.858 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[60353c8d-c389-4615-a8b5-2f393905a667]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7347045a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:da:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577148, 'reachable_time': 15122, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233715, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.876 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5893c031-2338-4b81-857d-005b08d3af3a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:da8d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577148, 'tstamp': 577148}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233716, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.896 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1af58c50-90b8-4626-843f-d01090e24a38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7347045a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:da:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577148, 'reachable_time': 15122, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233717, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:58.931 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ef76cfe4-2cc1-4e7a-84bb-30e336f2ba8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:59.003 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[76fea319-0407-4bdb-8605-27a7335b0bb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:59.005 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7347045a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:59.006 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:59.007 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7347045a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.009 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:59 compute-1 kernel: tap7347045a-f0: entered promiscuous mode
Jan 22 00:19:59 compute-1 NetworkManager[54952]: <info>  [1769041199.0120] manager: (tap7347045a-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/264)
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.014 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:59.016 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7347045a-f0, col_values=(('external_ids', {'iface-id': 'ce8d757a-1822-40f4-bb02-e88d8e0a4e11'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.018 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:59 compute-1 ovn_controller[94841]: 2026-01-22T00:19:59Z|00584|binding|INFO|Releasing lport ce8d757a-1822-40f4-bb02-e88d8e0a4e11 from this chassis (sb_readonly=0)
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.020 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:59.022 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:59.023 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3c09c675-ce2d-48ff-9ac3-d037c6faddb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:59.024 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-7347045a-f38e-4f56-a03a-a68e0fbe1e8d
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.pid.haproxy
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 7347045a-f38e-4f56-a03a-a68e0fbe1e8d
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:19:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:19:59.026 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'env', 'PROCESS_TAG=haproxy-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.029 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:59 compute-1 podman[233749]: 2026-01-22 00:19:59.446927647 +0000 UTC m=+0.058014814 container create 0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 00:19:59 compute-1 systemd[1]: Started libpod-conmon-0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550.scope.
Jan 22 00:19:59 compute-1 podman[233749]: 2026-01-22 00:19:59.416469581 +0000 UTC m=+0.027556758 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:19:59 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:19:59 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d861072d5e9e93e460fc407cc3976ea583bb36b2143dcc256b09144d189940d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:19:59 compute-1 podman[233749]: 2026-01-22 00:19:59.546399185 +0000 UTC m=+0.157486402 container init 0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:19:59 compute-1 podman[233749]: 2026-01-22 00:19:59.55145406 +0000 UTC m=+0.162541227 container start 0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:19:59 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233765]: [NOTICE]   (233769) : New worker (233771) forked
Jan 22 00:19:59 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233765]: [NOTICE]   (233769) : Loading success.
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.853 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041199.8518286, d59e0943-5372-4680-af52-c9af874c8578 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.853 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] VM Started (Lifecycle Event)
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.888 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.893 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041199.852203, d59e0943-5372-4680-af52-c9af874c8578 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.893 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] VM Paused (Lifecycle Event)
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.922 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.928 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:19:59 compute-1 nova_compute[182713]: 2026-01-22 00:19:59.956 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.254 182717 DEBUG nova.compute.manager [req-88ae0c87-6bab-40b8-943c-13b89efc546b req-e22361d3-1355-4a46-af49-ebc751ae30f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.255 182717 DEBUG oslo_concurrency.lockutils [req-88ae0c87-6bab-40b8-943c-13b89efc546b req-e22361d3-1355-4a46-af49-ebc751ae30f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.256 182717 DEBUG oslo_concurrency.lockutils [req-88ae0c87-6bab-40b8-943c-13b89efc546b req-e22361d3-1355-4a46-af49-ebc751ae30f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.256 182717 DEBUG oslo_concurrency.lockutils [req-88ae0c87-6bab-40b8-943c-13b89efc546b req-e22361d3-1355-4a46-af49-ebc751ae30f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.256 182717 DEBUG nova.compute.manager [req-88ae0c87-6bab-40b8-943c-13b89efc546b req-e22361d3-1355-4a46-af49-ebc751ae30f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Processing event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.257 182717 DEBUG nova.compute.manager [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.260 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041200.2602577, d59e0943-5372-4680-af52-c9af874c8578 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.260 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] VM Resumed (Lifecycle Event)
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.266 182717 INFO nova.virt.libvirt.driver [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance running successfully.
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.266 182717 DEBUG nova.virt.libvirt.driver [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.290 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.300 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.341 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 22 00:20:00 compute-1 nova_compute[182713]: 2026-01-22 00:20:00.371 182717 INFO nova.compute.manager [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance to original state: 'active'
Jan 22 00:20:01 compute-1 nova_compute[182713]: 2026-01-22 00:20:01.189 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:01 compute-1 nova_compute[182713]: 2026-01-22 00:20:01.317 182717 DEBUG nova.network.neutron [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updated VIF entry in instance network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:20:01 compute-1 nova_compute[182713]: 2026-01-22 00:20:01.318 182717 DEBUG nova.network.neutron [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:01 compute-1 nova_compute[182713]: 2026-01-22 00:20:01.348 182717 DEBUG oslo_concurrency.lockutils [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:20:01 compute-1 podman[233788]: 2026-01-22 00:20:01.559735035 +0000 UTC m=+0.056710094 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:20:02 compute-1 nova_compute[182713]: 2026-01-22 00:20:02.432 182717 DEBUG nova.compute.manager [req-45202dbf-af8b-48b9-80a1-a2e46e5c3b5a req-32105e64-f47d-4239-a921-02f89dfb83e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:02 compute-1 nova_compute[182713]: 2026-01-22 00:20:02.432 182717 DEBUG oslo_concurrency.lockutils [req-45202dbf-af8b-48b9-80a1-a2e46e5c3b5a req-32105e64-f47d-4239-a921-02f89dfb83e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:02 compute-1 nova_compute[182713]: 2026-01-22 00:20:02.432 182717 DEBUG oslo_concurrency.lockutils [req-45202dbf-af8b-48b9-80a1-a2e46e5c3b5a req-32105e64-f47d-4239-a921-02f89dfb83e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:02 compute-1 nova_compute[182713]: 2026-01-22 00:20:02.433 182717 DEBUG oslo_concurrency.lockutils [req-45202dbf-af8b-48b9-80a1-a2e46e5c3b5a req-32105e64-f47d-4239-a921-02f89dfb83e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:02 compute-1 nova_compute[182713]: 2026-01-22 00:20:02.433 182717 DEBUG nova.compute.manager [req-45202dbf-af8b-48b9-80a1-a2e46e5c3b5a req-32105e64-f47d-4239-a921-02f89dfb83e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:20:02 compute-1 nova_compute[182713]: 2026-01-22 00:20:02.433 182717 WARNING nova.compute.manager [req-45202dbf-af8b-48b9-80a1-a2e46e5c3b5a req-32105e64-f47d-4239-a921-02f89dfb83e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state active and task_state None.
Jan 22 00:20:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:03.027 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:03.028 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:03.028 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:03 compute-1 nova_compute[182713]: 2026-01-22 00:20:03.532 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:03 compute-1 nova_compute[182713]: 2026-01-22 00:20:03.844 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:04 compute-1 podman[233808]: 2026-01-22 00:20:04.622353115 +0000 UTC m=+0.103277715 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal)
Jan 22 00:20:08 compute-1 nova_compute[182713]: 2026-01-22 00:20:08.537 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:08 compute-1 nova_compute[182713]: 2026-01-22 00:20:08.847 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.302 182717 DEBUG oslo_concurrency.lockutils [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.302 182717 DEBUG oslo_concurrency.lockutils [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.302 182717 DEBUG oslo_concurrency.lockutils [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.303 182717 DEBUG oslo_concurrency.lockutils [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.303 182717 DEBUG oslo_concurrency.lockutils [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.316 182717 INFO nova.compute.manager [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Terminating instance
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.328 182717 DEBUG nova.compute.manager [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:20:09 compute-1 kernel: tapdfd91ce9-b3 (unregistering): left promiscuous mode
Jan 22 00:20:09 compute-1 NetworkManager[54952]: <info>  [1769041209.3519] device (tapdfd91ce9-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:20:09 compute-1 ovn_controller[94841]: 2026-01-22T00:20:09Z|00585|binding|INFO|Releasing lport dfd91ce9-b3dc-46e0-8793-952181553915 from this chassis (sb_readonly=0)
Jan 22 00:20:09 compute-1 ovn_controller[94841]: 2026-01-22T00:20:09Z|00586|binding|INFO|Setting lport dfd91ce9-b3dc-46e0-8793-952181553915 down in Southbound
Jan 22 00:20:09 compute-1 ovn_controller[94841]: 2026-01-22T00:20:09Z|00587|binding|INFO|Removing iface tapdfd91ce9-b3 ovn-installed in OVS
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.365 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.373 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:78:ae 10.100.0.5'], port_security=['fa:16:3e:0c:78:ae 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7e1a8ec5-e1c0-4c11-aae2-15d84872d95c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=dfd91ce9-b3dc-46e0-8793-952181553915) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.375 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.379 104184 INFO neutron.agent.ovn.metadata.agent [-] Port dfd91ce9-b3dc-46e0-8793-952181553915 in datapath aabf11c6-ef94-408a-8148-6c6400566606 unbound from our chassis
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.385 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aabf11c6-ef94-408a-8148-6c6400566606, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.389 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc9303c-fb94-4805-a875-d0d6c6190e62]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.389 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace which is not needed anymore
Jan 22 00:20:09 compute-1 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000088.scope: Deactivated successfully.
Jan 22 00:20:09 compute-1 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000088.scope: Consumed 18.286s CPU time.
Jan 22 00:20:09 compute-1 systemd-machined[153970]: Machine qemu-61-instance-00000088 terminated.
Jan 22 00:20:09 compute-1 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[232481]: [NOTICE]   (232485) : haproxy version is 2.8.14-c23fe91
Jan 22 00:20:09 compute-1 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[232481]: [NOTICE]   (232485) : path to executable is /usr/sbin/haproxy
Jan 22 00:20:09 compute-1 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[232481]: [WARNING]  (232485) : Exiting Master process...
Jan 22 00:20:09 compute-1 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[232481]: [WARNING]  (232485) : Exiting Master process...
Jan 22 00:20:09 compute-1 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[232481]: [ALERT]    (232485) : Current worker (232487) exited with code 143 (Terminated)
Jan 22 00:20:09 compute-1 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[232481]: [WARNING]  (232485) : All workers exited. Exiting... (0)
Jan 22 00:20:09 compute-1 systemd[1]: libpod-43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05.scope: Deactivated successfully.
Jan 22 00:20:09 compute-1 podman[233852]: 2026-01-22 00:20:09.546308254 +0000 UTC m=+0.065852196 container died 43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.551 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.559 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:09 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05-userdata-shm.mount: Deactivated successfully.
Jan 22 00:20:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-a47b43f1c558439c22fd39f33b71a82348c9f13167eeb247c328420325617600-merged.mount: Deactivated successfully.
Jan 22 00:20:09 compute-1 podman[233852]: 2026-01-22 00:20:09.595950249 +0000 UTC m=+0.115494161 container cleanup 43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.615 182717 INFO nova.virt.libvirt.driver [-] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Instance destroyed successfully.
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.616 182717 DEBUG nova.objects.instance [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'resources' on Instance uuid 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:20:09 compute-1 systemd[1]: libpod-conmon-43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05.scope: Deactivated successfully.
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.643 182717 DEBUG nova.virt.libvirt.vif [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:17:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-₡-12187943',display_name='tempest-₡-12187943',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--12187943',id=136,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:17:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-swmh30g2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:17:39Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=7e1a8ec5-e1c0-4c11-aae2-15d84872d95c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.644 182717 DEBUG nova.network.os_vif_util [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "dfd91ce9-b3dc-46e0-8793-952181553915", "address": "fa:16:3e:0c:78:ae", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfd91ce9-b3", "ovs_interfaceid": "dfd91ce9-b3dc-46e0-8793-952181553915", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.645 182717 DEBUG nova.network.os_vif_util [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0c:78:ae,bridge_name='br-int',has_traffic_filtering=True,id=dfd91ce9-b3dc-46e0-8793-952181553915,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfd91ce9-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.645 182717 DEBUG os_vif [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:78:ae,bridge_name='br-int',has_traffic_filtering=True,id=dfd91ce9-b3dc-46e0-8793-952181553915,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfd91ce9-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.648 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.648 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdfd91ce9-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.650 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.653 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.656 182717 INFO os_vif [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:78:ae,bridge_name='br-int',has_traffic_filtering=True,id=dfd91ce9-b3dc-46e0-8793-952181553915,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfd91ce9-b3')
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.657 182717 INFO nova.virt.libvirt.driver [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Deleting instance files /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c_del
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.658 182717 INFO nova.virt.libvirt.driver [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Deletion of /var/lib/nova/instances/7e1a8ec5-e1c0-4c11-aae2-15d84872d95c_del complete
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.678 182717 DEBUG nova.compute.manager [req-325e0c95-8c5f-45f1-8968-c24c08af1a19 req-1ea61114-1b8e-4109-b550-38f419400313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Received event network-vif-unplugged-dfd91ce9-b3dc-46e0-8793-952181553915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.679 182717 DEBUG oslo_concurrency.lockutils [req-325e0c95-8c5f-45f1-8968-c24c08af1a19 req-1ea61114-1b8e-4109-b550-38f419400313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.679 182717 DEBUG oslo_concurrency.lockutils [req-325e0c95-8c5f-45f1-8968-c24c08af1a19 req-1ea61114-1b8e-4109-b550-38f419400313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.679 182717 DEBUG oslo_concurrency.lockutils [req-325e0c95-8c5f-45f1-8968-c24c08af1a19 req-1ea61114-1b8e-4109-b550-38f419400313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.680 182717 DEBUG nova.compute.manager [req-325e0c95-8c5f-45f1-8968-c24c08af1a19 req-1ea61114-1b8e-4109-b550-38f419400313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] No waiting events found dispatching network-vif-unplugged-dfd91ce9-b3dc-46e0-8793-952181553915 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.680 182717 DEBUG nova.compute.manager [req-325e0c95-8c5f-45f1-8968-c24c08af1a19 req-1ea61114-1b8e-4109-b550-38f419400313 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Received event network-vif-unplugged-dfd91ce9-b3dc-46e0-8793-952181553915 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:20:09 compute-1 podman[233895]: 2026-01-22 00:20:09.688147233 +0000 UTC m=+0.068406813 container remove 43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.693 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9b63adf3-c7e5-4122-aed6-f439451c4fb7]: (4, ('Thu Jan 22 12:20:09 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05)\n43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05\nThu Jan 22 12:20:09 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05)\n43bf0b600f338066a5e6fbea6b784e87596c2153714d5cfcb11117c37429ae05\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.695 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[66bf2077-a664-450c-9182-5b8084ec1dfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.696 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.697 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:09 compute-1 kernel: tapaabf11c6-e0: left promiscuous mode
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.723 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.726 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[faf0d2dc-f1c0-424c-999b-ad47dda834d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.741 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[efdfe480-efd3-434a-9022-03434cc35cdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.743 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[dc4fadfd-bf02-4ad3-b552-eb40101a167d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.746 182717 INFO nova.compute.manager [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.747 182717 DEBUG oslo.service.loopingcall [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.747 182717 DEBUG nova.compute.manager [-] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:20:09 compute-1 nova_compute[182713]: 2026-01-22 00:20:09.748 182717 DEBUG nova.network.neutron [-] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.760 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6afd8e-d66e-4f01-9840-3d4012b5fe3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563115, 'reachable_time': 30415, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233907, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:09 compute-1 systemd[1]: run-netns-ovnmeta\x2daabf11c6\x2def94\x2d408a\x2d8148\x2d6c6400566606.mount: Deactivated successfully.
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.766 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:20:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:09.766 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[92dd85e2-12d0-42c5-b8e8-35fbab3162f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:10 compute-1 nova_compute[182713]: 2026-01-22 00:20:10.663 182717 DEBUG nova.network.neutron [-] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:10 compute-1 nova_compute[182713]: 2026-01-22 00:20:10.695 182717 INFO nova.compute.manager [-] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Took 0.95 seconds to deallocate network for instance.
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.092 182717 DEBUG oslo_concurrency.lockutils [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.092 182717 DEBUG oslo_concurrency.lockutils [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.125 182717 DEBUG nova.compute.manager [req-901bc030-af46-41ec-97bd-405e36ba188e req-8b7d9e84-e6c6-445e-a058-b509ba80fd72 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Received event network-vif-deleted-dfd91ce9-b3dc-46e0-8793-952181553915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.207 182717 DEBUG nova.compute.provider_tree [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.233 182717 DEBUG nova.scheduler.client.report [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.262 182717 DEBUG oslo_concurrency.lockutils [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.294 182717 INFO nova.scheduler.client.report [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Deleted allocations for instance 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.382 182717 DEBUG oslo_concurrency.lockutils [None req-498b9e5e-c283-4c61-b5b6-5b8ddc69067c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.840 182717 DEBUG nova.compute.manager [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Received event network-vif-plugged-dfd91ce9-b3dc-46e0-8793-952181553915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.841 182717 DEBUG oslo_concurrency.lockutils [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.841 182717 DEBUG oslo_concurrency.lockutils [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.842 182717 DEBUG oslo_concurrency.lockutils [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7e1a8ec5-e1c0-4c11-aae2-15d84872d95c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.842 182717 DEBUG nova.compute.manager [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] No waiting events found dispatching network-vif-plugged-dfd91ce9-b3dc-46e0-8793-952181553915 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:20:11 compute-1 nova_compute[182713]: 2026-01-22 00:20:11.842 182717 WARNING nova.compute.manager [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Received unexpected event network-vif-plugged-dfd91ce9-b3dc-46e0-8793-952181553915 for instance with vm_state deleted and task_state None.
Jan 22 00:20:12 compute-1 ovn_controller[94841]: 2026-01-22T00:20:12Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9d:17:06 10.100.0.5
Jan 22 00:20:13 compute-1 nova_compute[182713]: 2026-01-22 00:20:13.850 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:14 compute-1 nova_compute[182713]: 2026-01-22 00:20:14.652 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:15 compute-1 podman[233918]: 2026-01-22 00:20:15.582010294 +0000 UTC m=+0.070015574 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:20:15 compute-1 podman[233917]: 2026-01-22 00:20:15.612698537 +0000 UTC m=+0.100667316 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 00:20:17 compute-1 nova_compute[182713]: 2026-01-22 00:20:17.135 182717 INFO nova.compute.manager [None req-8106c670-3077-44c0-8b98-2bcdc3d4d3a2 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Get console output
Jan 22 00:20:17 compute-1 nova_compute[182713]: 2026-01-22 00:20:17.144 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:20:18 compute-1 nova_compute[182713]: 2026-01-22 00:20:18.870 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-1 ovn_controller[94841]: 2026-01-22T00:20:19Z|00588|binding|INFO|Releasing lport ce8d757a-1822-40f4-bb02-e88d8e0a4e11 from this chassis (sb_readonly=0)
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.205 182717 DEBUG nova.compute.manager [req-4429e531-3d3a-4f2c-95d4-cd408e97dff7 req-c7b24cee-0cbf-4623-8c87-8e36b2570f58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.206 182717 DEBUG nova.compute.manager [req-4429e531-3d3a-4f2c-95d4-cd408e97dff7 req-c7b24cee-0cbf-4623-8c87-8e36b2570f58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing instance network info cache due to event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.206 182717 DEBUG oslo_concurrency.lockutils [req-4429e531-3d3a-4f2c-95d4-cd408e97dff7 req-c7b24cee-0cbf-4623-8c87-8e36b2570f58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.206 182717 DEBUG oslo_concurrency.lockutils [req-4429e531-3d3a-4f2c-95d4-cd408e97dff7 req-c7b24cee-0cbf-4623-8c87-8e36b2570f58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.206 182717 DEBUG nova.network.neutron [req-4429e531-3d3a-4f2c-95d4-cd408e97dff7 req-c7b24cee-0cbf-4623-8c87-8e36b2570f58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.209 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-1 ovn_controller[94841]: 2026-01-22T00:20:19Z|00589|binding|INFO|Releasing lport ce8d757a-1822-40f4-bb02-e88d8e0a4e11 from this chassis (sb_readonly=0)
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.358 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.360 182717 DEBUG oslo_concurrency.lockutils [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.360 182717 DEBUG oslo_concurrency.lockutils [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.361 182717 DEBUG oslo_concurrency.lockutils [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.361 182717 DEBUG oslo_concurrency.lockutils [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.362 182717 DEBUG oslo_concurrency.lockutils [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.416 182717 INFO nova.compute.manager [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Terminating instance
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.430 182717 DEBUG nova.compute.manager [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:20:19 compute-1 kernel: tap89ad850c-a8 (unregistering): left promiscuous mode
Jan 22 00:20:19 compute-1 NetworkManager[54952]: <info>  [1769041219.4550] device (tap89ad850c-a8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:20:19 compute-1 ovn_controller[94841]: 2026-01-22T00:20:19Z|00590|binding|INFO|Releasing lport 89ad850c-a87f-489f-8c3e-51dfc078a374 from this chassis (sb_readonly=0)
Jan 22 00:20:19 compute-1 ovn_controller[94841]: 2026-01-22T00:20:19Z|00591|binding|INFO|Setting lport 89ad850c-a87f-489f-8c3e-51dfc078a374 down in Southbound
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.461 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-1 ovn_controller[94841]: 2026-01-22T00:20:19Z|00592|binding|INFO|Removing iface tap89ad850c-a8 ovn-installed in OVS
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.470 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:17:06 10.100.0.5'], port_security=['fa:16:3e:9d:17:06 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd59e0943-5372-4680-af52-c9af874c8578', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '12', 'neutron:security_group_ids': '1bc50146-1f14-43fa-a2db-2904419fa654', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f19555ab-2ed1-467b-9e13-e9518e9577aa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=89ad850c-a87f-489f-8c3e-51dfc078a374) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.471 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 89ad850c-a87f-489f-8c3e-51dfc078a374 in datapath 7347045a-f38e-4f56-a03a-a68e0fbe1e8d unbound from our chassis
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.473 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7347045a-f38e-4f56-a03a-a68e0fbe1e8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.474 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5e98d32a-789e-4ac1-a1aa-ed084f06bf13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.475 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d namespace which is not needed anymore
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.478 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-1 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000091.scope: Deactivated successfully.
Jan 22 00:20:19 compute-1 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000091.scope: Consumed 13.869s CPU time.
Jan 22 00:20:19 compute-1 systemd-machined[153970]: Machine qemu-64-instance-00000091 terminated.
Jan 22 00:20:19 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233765]: [NOTICE]   (233769) : haproxy version is 2.8.14-c23fe91
Jan 22 00:20:19 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233765]: [NOTICE]   (233769) : path to executable is /usr/sbin/haproxy
Jan 22 00:20:19 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233765]: [WARNING]  (233769) : Exiting Master process...
Jan 22 00:20:19 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233765]: [ALERT]    (233769) : Current worker (233771) exited with code 143 (Terminated)
Jan 22 00:20:19 compute-1 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[233765]: [WARNING]  (233769) : All workers exited. Exiting... (0)
Jan 22 00:20:19 compute-1 systemd[1]: libpod-0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550.scope: Deactivated successfully.
Jan 22 00:20:19 compute-1 podman[233990]: 2026-01-22 00:20:19.596938462 +0000 UTC m=+0.041314873 container died 0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:20:19 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550-userdata-shm.mount: Deactivated successfully.
Jan 22 00:20:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-4d861072d5e9e93e460fc407cc3976ea583bb36b2143dcc256b09144d189940d-merged.mount: Deactivated successfully.
Jan 22 00:20:19 compute-1 podman[233990]: 2026-01-22 00:20:19.623126976 +0000 UTC m=+0.067503377 container cleanup 0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:20:19 compute-1 systemd[1]: libpod-conmon-0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550.scope: Deactivated successfully.
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.654 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.689 182717 INFO nova.virt.libvirt.driver [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance destroyed successfully.
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.689 182717 DEBUG nova.objects.instance [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:20:19 compute-1 podman[234021]: 2026-01-22 00:20:19.714383528 +0000 UTC m=+0.073132281 container remove 0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.720 182717 DEBUG nova.virt.libvirt.vif [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:18:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2079598704',display_name='tempest-TestNetworkAdvancedServerOps-server-2079598704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2079598704',id=145,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIY3eqmdW0m2q20hwTxy7fCq5RPOCY+KqJLqriFpcPzIAlQnzQNfW6TIp9Y1voEv/PtpLpDAT0kqBnGToo/qNh+oTys/PEZ/7XtlTWunC6nPRFTGOxMn536DUj7Tail8LA==',key_name='tempest-TestNetworkAdvancedServerOps-1230336080',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:20:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-zrj70p8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:20:00Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=d59e0943-5372-4680-af52-c9af874c8578,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.721 182717 DEBUG nova.network.os_vif_util [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.721 182717 DEBUG nova.network.os_vif_util [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.722 182717 DEBUG os_vif [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.725 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.725 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6fa14bdf-411c-4f32-82da-f9ada418b0b1]: (4, ('Thu Jan 22 12:20:19 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d (0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550)\n0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550\nThu Jan 22 12:20:19 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d (0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550)\n0230d1a353d59c5f77ff1461f0c2ea86d8d153d418071415c514833f43a05550\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.726 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89ad850c-a8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.727 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[533b0c3b-4242-400d-a62e-67bcaf349c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.728 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.728 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7347045a-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.729 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.731 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.733 182717 INFO os_vif [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8')
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.734 182717 INFO nova.virt.libvirt.driver [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Deleting instance files /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_del
Jan 22 00:20:19 compute-1 kernel: tap7347045a-f0: left promiscuous mode
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.746 182717 INFO nova.virt.libvirt.driver [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Deletion of /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_del complete
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.748 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c4819a-3a6f-4682-ad29-836482ee8691]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.750 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.774 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b3fbfb1f-2d54-46bc-bd0f-fc24df4bb699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.776 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[07c6b789-acbf-42a4-9679-79a3de5e6ef9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.795 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[86cc5205-1ff0-4c3c-ab41-8782d8184666]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577139, 'reachable_time': 25533, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234051, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.797 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:20:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:19.797 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[c17b8a32-e018-40f8-a9d0-8f1b598caed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:19 compute-1 systemd[1]: run-netns-ovnmeta\x2d7347045a\x2df38e\x2d4f56\x2da03a\x2da68e0fbe1e8d.mount: Deactivated successfully.
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.839 182717 INFO nova.compute.manager [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.840 182717 DEBUG oslo.service.loopingcall [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.840 182717 DEBUG nova.compute.manager [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.840 182717 DEBUG nova.network.neutron [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.908 182717 DEBUG nova.compute.manager [req-b276d41d-7871-4009-9400-f72a9d6d18e8 req-f0d94126-a41d-42fb-a1da-0f5608176434 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.909 182717 DEBUG oslo_concurrency.lockutils [req-b276d41d-7871-4009-9400-f72a9d6d18e8 req-f0d94126-a41d-42fb-a1da-0f5608176434 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.909 182717 DEBUG oslo_concurrency.lockutils [req-b276d41d-7871-4009-9400-f72a9d6d18e8 req-f0d94126-a41d-42fb-a1da-0f5608176434 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.910 182717 DEBUG oslo_concurrency.lockutils [req-b276d41d-7871-4009-9400-f72a9d6d18e8 req-f0d94126-a41d-42fb-a1da-0f5608176434 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.910 182717 DEBUG nova.compute.manager [req-b276d41d-7871-4009-9400-f72a9d6d18e8 req-f0d94126-a41d-42fb-a1da-0f5608176434 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:20:19 compute-1 nova_compute[182713]: 2026-01-22 00:20:19.910 182717 DEBUG nova.compute.manager [req-b276d41d-7871-4009-9400-f72a9d6d18e8 req-f0d94126-a41d-42fb-a1da-0f5608176434 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:20:21 compute-1 podman[234052]: 2026-01-22 00:20:21.601993167 +0000 UTC m=+0.083779302 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:20:21 compute-1 podman[234053]: 2026-01-22 00:20:21.610844442 +0000 UTC m=+0.093658619 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:20:21 compute-1 nova_compute[182713]: 2026-01-22 00:20:21.630 182717 DEBUG nova.network.neutron [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:21 compute-1 nova_compute[182713]: 2026-01-22 00:20:21.669 182717 INFO nova.compute.manager [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] Took 1.83 seconds to deallocate network for instance.
Jan 22 00:20:21 compute-1 nova_compute[182713]: 2026-01-22 00:20:21.772 182717 DEBUG nova.compute.manager [req-c834c104-ffdf-4cda-921a-f2bf11b25785 req-ff2cc413-a82f-4f38-b823-7d82c1d07e2e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-deleted-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:21 compute-1 nova_compute[182713]: 2026-01-22 00:20:21.782 182717 DEBUG oslo_concurrency.lockutils [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:21 compute-1 nova_compute[182713]: 2026-01-22 00:20:21.782 182717 DEBUG oslo_concurrency.lockutils [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:21 compute-1 nova_compute[182713]: 2026-01-22 00:20:21.792 182717 DEBUG oslo_concurrency.lockutils [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:22 compute-1 nova_compute[182713]: 2026-01-22 00:20:22.086 182717 INFO nova.scheduler.client.report [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Deleted allocations for instance d59e0943-5372-4680-af52-c9af874c8578
Jan 22 00:20:22 compute-1 nova_compute[182713]: 2026-01-22 00:20:22.297 182717 DEBUG oslo_concurrency.lockutils [None req-c10d769d-6b06-4a8e-98b8-db3e0bc4bab1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:22 compute-1 nova_compute[182713]: 2026-01-22 00:20:22.375 182717 DEBUG nova.compute.manager [req-c25a101e-9298-4139-abab-726ef76d7de5 req-7c0ca9ea-1f40-4c37-969b-a7e66ca5efb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:22 compute-1 nova_compute[182713]: 2026-01-22 00:20:22.375 182717 DEBUG oslo_concurrency.lockutils [req-c25a101e-9298-4139-abab-726ef76d7de5 req-7c0ca9ea-1f40-4c37-969b-a7e66ca5efb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:22 compute-1 nova_compute[182713]: 2026-01-22 00:20:22.375 182717 DEBUG oslo_concurrency.lockutils [req-c25a101e-9298-4139-abab-726ef76d7de5 req-7c0ca9ea-1f40-4c37-969b-a7e66ca5efb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:22 compute-1 nova_compute[182713]: 2026-01-22 00:20:22.376 182717 DEBUG oslo_concurrency.lockutils [req-c25a101e-9298-4139-abab-726ef76d7de5 req-7c0ca9ea-1f40-4c37-969b-a7e66ca5efb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:22 compute-1 nova_compute[182713]: 2026-01-22 00:20:22.376 182717 DEBUG nova.compute.manager [req-c25a101e-9298-4139-abab-726ef76d7de5 req-7c0ca9ea-1f40-4c37-969b-a7e66ca5efb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:20:22 compute-1 nova_compute[182713]: 2026-01-22 00:20:22.376 182717 WARNING nova.compute.manager [req-c25a101e-9298-4139-abab-726ef76d7de5 req-7c0ca9ea-1f40-4c37-969b-a7e66ca5efb2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state deleted and task_state None.
Jan 22 00:20:22 compute-1 nova_compute[182713]: 2026-01-22 00:20:22.492 182717 DEBUG nova.network.neutron [req-4429e531-3d3a-4f2c-95d4-cd408e97dff7 req-c7b24cee-0cbf-4623-8c87-8e36b2570f58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updated VIF entry in instance network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:20:22 compute-1 nova_compute[182713]: 2026-01-22 00:20:22.493 182717 DEBUG nova.network.neutron [req-4429e531-3d3a-4f2c-95d4-cd408e97dff7 req-c7b24cee-0cbf-4623-8c87-8e36b2570f58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:22 compute-1 nova_compute[182713]: 2026-01-22 00:20:22.546 182717 DEBUG oslo_concurrency.lockutils [req-4429e531-3d3a-4f2c-95d4-cd408e97dff7 req-c7b24cee-0cbf-4623-8c87-8e36b2570f58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.882 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:20:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:20:23 compute-1 nova_compute[182713]: 2026-01-22 00:20:23.872 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:24 compute-1 nova_compute[182713]: 2026-01-22 00:20:24.614 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041209.6124308, 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:24 compute-1 nova_compute[182713]: 2026-01-22 00:20:24.614 182717 INFO nova.compute.manager [-] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] VM Stopped (Lifecycle Event)
Jan 22 00:20:24 compute-1 nova_compute[182713]: 2026-01-22 00:20:24.635 182717 DEBUG nova.compute.manager [None req-0a29f896-1018-46b5-b7cb-a16abf5cde15 - - - - - -] [instance: 7e1a8ec5-e1c0-4c11-aae2-15d84872d95c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:24 compute-1 nova_compute[182713]: 2026-01-22 00:20:24.729 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:27 compute-1 nova_compute[182713]: 2026-01-22 00:20:27.030 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:28 compute-1 nova_compute[182713]: 2026-01-22 00:20:28.874 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:29 compute-1 nova_compute[182713]: 2026-01-22 00:20:29.730 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:32 compute-1 podman[234094]: 2026-01-22 00:20:32.565512853 +0000 UTC m=+0.063830844 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:20:33 compute-1 nova_compute[182713]: 2026-01-22 00:20:33.876 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:34 compute-1 nova_compute[182713]: 2026-01-22 00:20:34.687 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041219.6860862, d59e0943-5372-4680-af52-c9af874c8578 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:34 compute-1 nova_compute[182713]: 2026-01-22 00:20:34.688 182717 INFO nova.compute.manager [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] VM Stopped (Lifecycle Event)
Jan 22 00:20:34 compute-1 nova_compute[182713]: 2026-01-22 00:20:34.725 182717 DEBUG nova.compute.manager [None req-bf534cfe-ff4e-4005-b4ec-d52083a0af07 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:34 compute-1 nova_compute[182713]: 2026-01-22 00:20:34.769 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:35 compute-1 podman[234115]: 2026-01-22 00:20:35.564032506 +0000 UTC m=+0.055675139 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., maintainer=Red Hat, Inc.)
Jan 22 00:20:38 compute-1 nova_compute[182713]: 2026-01-22 00:20:38.877 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:39 compute-1 nova_compute[182713]: 2026-01-22 00:20:39.770 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:43 compute-1 nova_compute[182713]: 2026-01-22 00:20:43.788 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:43 compute-1 nova_compute[182713]: 2026-01-22 00:20:43.788 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:43 compute-1 nova_compute[182713]: 2026-01-22 00:20:43.845 182717 DEBUG nova.compute.manager [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:20:43 compute-1 nova_compute[182713]: 2026-01-22 00:20:43.879 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.096 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.097 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.106 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.107 182717 INFO nova.compute.claims [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.444 182717 DEBUG nova.compute.provider_tree [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.472 182717 DEBUG nova.scheduler.client.report [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.494 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.495 182717 DEBUG nova.compute.manager [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.550 182717 DEBUG nova.compute.manager [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.551 182717 DEBUG nova.network.neutron [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.577 182717 INFO nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.603 182717 DEBUG nova.compute.manager [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.762 182717 DEBUG nova.compute.manager [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.763 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.764 182717 INFO nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Creating image(s)
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.764 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "/var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.765 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.766 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.786 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.787 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.847 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.848 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.849 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.864 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.920 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.921 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.962 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.964 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:44 compute-1 nova_compute[182713]: 2026-01-22 00:20:44.965 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.033 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.035 182717 DEBUG nova.virt.disk.api [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Checking if we can resize image /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.036 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.099 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.100 182717 DEBUG nova.virt.disk.api [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Cannot resize image /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.100 182717 DEBUG nova.objects.instance [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'migration_context' on Instance uuid 595c8a5a-b43c-4eae-ad91-c7848e0e2f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.200 182717 DEBUG nova.policy [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.229 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.230 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Ensure instance console log exists: /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.230 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.231 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.231 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:45 compute-1 nova_compute[182713]: 2026-01-22 00:20:45.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:20:46 compute-1 podman[234152]: 2026-01-22 00:20:46.607660881 +0000 UTC m=+0.076509167 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:20:46 compute-1 podman[234151]: 2026-01-22 00:20:46.619732495 +0000 UTC m=+0.100251583 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 00:20:46 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:46.766 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:20:46 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:46.767 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:20:46 compute-1 nova_compute[182713]: 2026-01-22 00:20:46.766 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:46 compute-1 nova_compute[182713]: 2026-01-22 00:20:46.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:47 compute-1 nova_compute[182713]: 2026-01-22 00:20:47.236 182717 DEBUG nova.network.neutron [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Successfully created port: 7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:20:47 compute-1 nova_compute[182713]: 2026-01-22 00:20:47.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:48 compute-1 nova_compute[182713]: 2026-01-22 00:20:48.631 182717 DEBUG nova.network.neutron [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Successfully updated port: 7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:20:48 compute-1 nova_compute[182713]: 2026-01-22 00:20:48.646 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "refresh_cache-595c8a5a-b43c-4eae-ad91-c7848e0e2f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:20:48 compute-1 nova_compute[182713]: 2026-01-22 00:20:48.647 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquired lock "refresh_cache-595c8a5a-b43c-4eae-ad91-c7848e0e2f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:20:48 compute-1 nova_compute[182713]: 2026-01-22 00:20:48.647 182717 DEBUG nova.network.neutron [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:20:48 compute-1 nova_compute[182713]: 2026-01-22 00:20:48.759 182717 DEBUG nova.compute.manager [req-e65c3f83-b17e-4a15-9c52-350499d6f666 req-92382f7d-0445-4132-9223-1534f77f9cf3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Received event network-changed-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:48 compute-1 nova_compute[182713]: 2026-01-22 00:20:48.759 182717 DEBUG nova.compute.manager [req-e65c3f83-b17e-4a15-9c52-350499d6f666 req-92382f7d-0445-4132-9223-1534f77f9cf3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Refreshing instance network info cache due to event network-changed-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:20:48 compute-1 nova_compute[182713]: 2026-01-22 00:20:48.760 182717 DEBUG oslo_concurrency.lockutils [req-e65c3f83-b17e-4a15-9c52-350499d6f666 req-92382f7d-0445-4132-9223-1534f77f9cf3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-595c8a5a-b43c-4eae-ad91-c7848e0e2f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:20:48 compute-1 nova_compute[182713]: 2026-01-22 00:20:48.834 182717 DEBUG nova.network.neutron [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:20:48 compute-1 nova_compute[182713]: 2026-01-22 00:20:48.893 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:49.769 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:49 compute-1 nova_compute[182713]: 2026-01-22 00:20:49.788 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.054 182717 DEBUG nova.network.neutron [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Updating instance_info_cache with network_info: [{"id": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "address": "fa:16:3e:d2:c1:d4", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e4f92dd-59", "ovs_interfaceid": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.081 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Releasing lock "refresh_cache-595c8a5a-b43c-4eae-ad91-c7848e0e2f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.082 182717 DEBUG nova.compute.manager [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Instance network_info: |[{"id": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "address": "fa:16:3e:d2:c1:d4", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e4f92dd-59", "ovs_interfaceid": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.082 182717 DEBUG oslo_concurrency.lockutils [req-e65c3f83-b17e-4a15-9c52-350499d6f666 req-92382f7d-0445-4132-9223-1534f77f9cf3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-595c8a5a-b43c-4eae-ad91-c7848e0e2f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.082 182717 DEBUG nova.network.neutron [req-e65c3f83-b17e-4a15-9c52-350499d6f666 req-92382f7d-0445-4132-9223-1534f77f9cf3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Refreshing network info cache for port 7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.086 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Start _get_guest_xml network_info=[{"id": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "address": "fa:16:3e:d2:c1:d4", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e4f92dd-59", "ovs_interfaceid": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.092 182717 WARNING nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.099 182717 DEBUG nova.virt.libvirt.host [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.099 182717 DEBUG nova.virt.libvirt.host [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.110 182717 DEBUG nova.virt.libvirt.host [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.111 182717 DEBUG nova.virt.libvirt.host [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.112 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.113 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.113 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.113 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.113 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.113 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.113 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.114 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.114 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.114 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.114 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.114 182717 DEBUG nova.virt.hardware [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.118 182717 DEBUG nova.virt.libvirt.vif [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:20:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-0-721917988',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-0-721917988',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ge',id=151,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOgFr0loz0o97S1yJic425BuuGnqIIzzaQU+1FOWYN8VLWjMOBgkt02kLpdfipR3QnvdUvT3mVD/diPnm35tClCs6BoaTbQN3VWq8tyqhLXUA2JeTkyyUA3yLrgO9t4ag==',key_name='tempest-TestSecurityGroupsBasicOps-1152614963',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-zvfyb2xc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:20:44Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=595c8a5a-b43c-4eae-ad91-c7848e0e2f44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "address": "fa:16:3e:d2:c1:d4", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e4f92dd-59", "ovs_interfaceid": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.119 182717 DEBUG nova.network.os_vif_util [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "address": "fa:16:3e:d2:c1:d4", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e4f92dd-59", "ovs_interfaceid": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.119 182717 DEBUG nova.network.os_vif_util [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c1:d4,bridge_name='br-int',has_traffic_filtering=True,id=7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e4f92dd-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.120 182717 DEBUG nova.objects.instance [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'pci_devices' on Instance uuid 595c8a5a-b43c-4eae-ad91-c7848e0e2f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.136 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:20:50 compute-1 nova_compute[182713]:   <uuid>595c8a5a-b43c-4eae-ad91-c7848e0e2f44</uuid>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   <name>instance-00000097</name>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-0-721917988</nova:name>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:20:50</nova:creationTime>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:20:50 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:20:50 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:20:50 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:20:50 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:20:50 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:20:50 compute-1 nova_compute[182713]:         <nova:user uuid="a60ce2b7b7ae47b484de12add551b287">tempest-TestSecurityGroupsBasicOps-1492736128-project-member</nova:user>
Jan 22 00:20:50 compute-1 nova_compute[182713]:         <nova:project uuid="02bcfc5f1f1044a3856e73a5938ff011">tempest-TestSecurityGroupsBasicOps-1492736128</nova:project>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:20:50 compute-1 nova_compute[182713]:         <nova:port uuid="7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8">
Jan 22 00:20:50 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <system>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <entry name="serial">595c8a5a-b43c-4eae-ad91-c7848e0e2f44</entry>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <entry name="uuid">595c8a5a-b43c-4eae-ad91-c7848e0e2f44</entry>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     </system>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   <os>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   </os>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   <features>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   </features>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk.config"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:d2:c1:d4"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <target dev="tap7e4f92dd-59"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/console.log" append="off"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <video>
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     </video>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:20:50 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:20:50 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:20:50 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:20:50 compute-1 nova_compute[182713]: </domain>
Jan 22 00:20:50 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.137 182717 DEBUG nova.compute.manager [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Preparing to wait for external event network-vif-plugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.137 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.137 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.137 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.138 182717 DEBUG nova.virt.libvirt.vif [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:20:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-0-721917988',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-0-721917988',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ge',id=151,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOgFr0loz0o97S1yJic425BuuGnqIIzzaQU+1FOWYN8VLWjMOBgkt02kLpdfipR3QnvdUvT3mVD/diPnm35tClCs6BoaTbQN3VWq8tyqhLXUA2JeTkyyUA3yLrgO9t4ag==',key_name='tempest-TestSecurityGroupsBasicOps-1152614963',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-zvfyb2xc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:20:44Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=595c8a5a-b43c-4eae-ad91-c7848e0e2f44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "address": "fa:16:3e:d2:c1:d4", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e4f92dd-59", "ovs_interfaceid": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.138 182717 DEBUG nova.network.os_vif_util [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "address": "fa:16:3e:d2:c1:d4", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e4f92dd-59", "ovs_interfaceid": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.139 182717 DEBUG nova.network.os_vif_util [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c1:d4,bridge_name='br-int',has_traffic_filtering=True,id=7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e4f92dd-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.139 182717 DEBUG os_vif [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c1:d4,bridge_name='br-int',has_traffic_filtering=True,id=7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e4f92dd-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.140 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.140 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.141 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.144 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.144 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e4f92dd-59, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.145 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7e4f92dd-59, col_values=(('external_ids', {'iface-id': '7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:c1:d4', 'vm-uuid': '595c8a5a-b43c-4eae-ad91-c7848e0e2f44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.181 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:50 compute-1 NetworkManager[54952]: <info>  [1769041250.1836] manager: (tap7e4f92dd-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.184 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.187 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.189 182717 INFO os_vif [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c1:d4,bridge_name='br-int',has_traffic_filtering=True,id=7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e4f92dd-59')
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.269 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.269 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.270 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No VIF found with MAC fa:16:3e:d2:c1:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.270 182717 INFO nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Using config drive
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.776 182717 INFO nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Creating config drive at /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk.config
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.783 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8xa2yuxk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:50 compute-1 nova_compute[182713]: 2026-01-22 00:20:50.925 182717 DEBUG oslo_concurrency.processutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8xa2yuxk" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:51 compute-1 kernel: tap7e4f92dd-59: entered promiscuous mode
Jan 22 00:20:51 compute-1 NetworkManager[54952]: <info>  [1769041251.0120] manager: (tap7e4f92dd-59): new Tun device (/org/freedesktop/NetworkManager/Devices/266)
Jan 22 00:20:51 compute-1 ovn_controller[94841]: 2026-01-22T00:20:51Z|00593|binding|INFO|Claiming lport 7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 for this chassis.
Jan 22 00:20:51 compute-1 ovn_controller[94841]: 2026-01-22T00:20:51Z|00594|binding|INFO|7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8: Claiming fa:16:3e:d2:c1:d4 10.100.0.5
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.013 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.021 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.026 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:51 compute-1 NetworkManager[54952]: <info>  [1769041251.0283] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/267)
Jan 22 00:20:51 compute-1 NetworkManager[54952]: <info>  [1769041251.0300] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.033 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:c1:d4 10.100.0.5'], port_security=['fa:16:3e:d2:c1:d4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '595c8a5a-b43c-4eae-ad91-c7848e0e2f44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65641ee3-5688-4f52-8e2b-2aae97505b84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9d5746ab-567f-4771-baec-483e6edef99f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3d414ff-1f29-4cd2-96c4-c90cd0d603fc, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.034 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 in datapath 65641ee3-5688-4f52-8e2b-2aae97505b84 bound to our chassis
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.035 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 65641ee3-5688-4f52-8e2b-2aae97505b84
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.050 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e2ea2e6a-f0f2-4f8b-a540-5863e33db2d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.052 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap65641ee3-51 in ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.053 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap65641ee3-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.053 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[851c82bc-7866-41d6-aa4a-4379de8170b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 systemd-udevd[234222]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.054 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b51aa758-e0a0-4167-af04-3bc09c9cc845]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 systemd-machined[153970]: New machine qemu-65-instance-00000097.
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.065 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[ea80871b-298b-4217-ba43-0a85aad09929]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 NetworkManager[54952]: <info>  [1769041251.0767] device (tap7e4f92dd-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:20:51 compute-1 systemd[1]: Started Virtual Machine qemu-65-instance-00000097.
Jan 22 00:20:51 compute-1 NetworkManager[54952]: <info>  [1769041251.0788] device (tap7e4f92dd-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.099 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d84b3580-c31d-49ee-9543-6daabbdff85c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.131 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[20b80407-f1e2-47e8-a704-2124e332ef72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.154 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[460d99d7-7eca-4805-8a3f-256d8fcf7f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 NetworkManager[54952]: <info>  [1769041251.1551] manager: (tap65641ee3-50): new Veth device (/org/freedesktop/NetworkManager/Devices/269)
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.184 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[40fa72eb-24b7-485b-868a-2758d7bc6258]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.186 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0be49e5a-cd9e-4367-aefc-9b966479f00b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.191 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.193 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:51 compute-1 NetworkManager[54952]: <info>  [1769041251.2157] device (tap65641ee3-50): carrier: link connected
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.221 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[bd7d7c2b-66db-41d4-b8fe-b8747fb7c539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 ovn_controller[94841]: 2026-01-22T00:20:51Z|00595|binding|INFO|Setting lport 7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 up in Southbound
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.237 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[abc9a524-cc8e-42ed-a4fc-d4bdc171c0e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65641ee3-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:51:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582386, 'reachable_time': 31632, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234254, 'error': None, 'target': 'ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.252 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd91fa5-1c59-4ee3-9d83-f72e15b386c2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedf:51e7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 582386, 'tstamp': 582386}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234255, 'error': None, 'target': 'ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 ovn_controller[94841]: 2026-01-22T00:20:51Z|00596|binding|INFO|Setting lport 7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 ovn-installed in OVS
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.257 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.267 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[316f6326-c644-4985-a4c6-7e1ad33d26fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65641ee3-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:51:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582386, 'reachable_time': 31632, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234256, 'error': None, 'target': 'ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.298 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c4fc8e32-c768-49cf-88c2-5314df5b41ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.365 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0a79493a-1839-4e71-859a-a2911b390596]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.366 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65641ee3-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.367 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.367 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65641ee3-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.369 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:51 compute-1 NetworkManager[54952]: <info>  [1769041251.3706] manager: (tap65641ee3-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Jan 22 00:20:51 compute-1 kernel: tap65641ee3-50: entered promiscuous mode
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.371 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.373 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap65641ee3-50, col_values=(('external_ids', {'iface-id': '737a2d1f-ad8c-46d7-ba36-880bbc6b5728'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.374 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:51 compute-1 ovn_controller[94841]: 2026-01-22T00:20:51Z|00597|binding|INFO|Releasing lport 737a2d1f-ad8c-46d7-ba36-880bbc6b5728 from this chassis (sb_readonly=0)
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.376 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/65641ee3-5688-4f52-8e2b-2aae97505b84.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/65641ee3-5688-4f52-8e2b-2aae97505b84.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.377 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a0cd12-29d0-4bc6-92e6-3ad0aae1e147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.378 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-65641ee3-5688-4f52-8e2b-2aae97505b84
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/65641ee3-5688-4f52-8e2b-2aae97505b84.pid.haproxy
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 65641ee3-5688-4f52-8e2b-2aae97505b84
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:20:51 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:20:51.380 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84', 'env', 'PROCESS_TAG=haproxy-65641ee3-5688-4f52-8e2b-2aae97505b84', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/65641ee3-5688-4f52-8e2b-2aae97505b84.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.388 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.399 182717 DEBUG nova.network.neutron [req-e65c3f83-b17e-4a15-9c52-350499d6f666 req-92382f7d-0445-4132-9223-1534f77f9cf3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Updated VIF entry in instance network info cache for port 7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.400 182717 DEBUG nova.network.neutron [req-e65c3f83-b17e-4a15-9c52-350499d6f666 req-92382f7d-0445-4132-9223-1534f77f9cf3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Updating instance_info_cache with network_info: [{"id": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "address": "fa:16:3e:d2:c1:d4", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e4f92dd-59", "ovs_interfaceid": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.448 182717 DEBUG oslo_concurrency.lockutils [req-e65c3f83-b17e-4a15-9c52-350499d6f666 req-92382f7d-0445-4132-9223-1534f77f9cf3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-595c8a5a-b43c-4eae-ad91-c7848e0e2f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.451 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041251.4506917, 595c8a5a-b43c-4eae-ad91-c7848e0e2f44 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.452 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] VM Started (Lifecycle Event)
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.497 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.503 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041251.4523685, 595c8a5a-b43c-4eae-ad91-c7848e0e2f44 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.503 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] VM Paused (Lifecycle Event)
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.526 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.531 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:20:51 compute-1 nova_compute[182713]: 2026-01-22 00:20:51.558 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:20:51 compute-1 podman[234295]: 2026-01-22 00:20:51.7739205 +0000 UTC m=+0.069762057 container create d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:20:51 compute-1 systemd[1]: Started libpod-conmon-d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee.scope.
Jan 22 00:20:51 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:20:51 compute-1 podman[234295]: 2026-01-22 00:20:51.742784683 +0000 UTC m=+0.038626260 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:20:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a868150231ce45e0fe69ba9499a71dd5109d019e0955bf3bd08317f62c20bc52/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:20:51 compute-1 podman[234295]: 2026-01-22 00:20:51.856192224 +0000 UTC m=+0.152033791 container init d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 00:20:51 compute-1 podman[234295]: 2026-01-22 00:20:51.861300802 +0000 UTC m=+0.157142339 container start d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:20:51 compute-1 podman[234311]: 2026-01-22 00:20:51.879998003 +0000 UTC m=+0.063276965 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:20:51 compute-1 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[234315]: [NOTICE]   (234348) : New worker (234358) forked
Jan 22 00:20:51 compute-1 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[234315]: [NOTICE]   (234348) : Loading success.
Jan 22 00:20:51 compute-1 podman[234308]: 2026-01-22 00:20:51.898759286 +0000 UTC m=+0.087972943 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:20:52 compute-1 nova_compute[182713]: 2026-01-22 00:20:52.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:52 compute-1 nova_compute[182713]: 2026-01-22 00:20:52.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:52 compute-1 nova_compute[182713]: 2026-01-22 00:20:52.944 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:52 compute-1 nova_compute[182713]: 2026-01-22 00:20:52.944 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:52 compute-1 nova_compute[182713]: 2026-01-22 00:20:52.945 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:52 compute-1 nova_compute[182713]: 2026-01-22 00:20:52.945 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.164 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.258 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.260 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.326 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.340 182717 DEBUG nova.compute.manager [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Received event network-vif-plugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.340 182717 DEBUG oslo_concurrency.lockutils [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.341 182717 DEBUG oslo_concurrency.lockutils [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.341 182717 DEBUG oslo_concurrency.lockutils [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.341 182717 DEBUG nova.compute.manager [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Processing event network-vif-plugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.342 182717 DEBUG nova.compute.manager [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Received event network-vif-plugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.342 182717 DEBUG oslo_concurrency.lockutils [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.342 182717 DEBUG oslo_concurrency.lockutils [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.342 182717 DEBUG oslo_concurrency.lockutils [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.343 182717 DEBUG nova.compute.manager [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] No waiting events found dispatching network-vif-plugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.343 182717 WARNING nova.compute.manager [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Received unexpected event network-vif-plugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 for instance with vm_state building and task_state spawning.
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.344 182717 DEBUG nova.compute.manager [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.348 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041253.3477116, 595c8a5a-b43c-4eae-ad91-c7848e0e2f44 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.349 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] VM Resumed (Lifecycle Event)
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.354 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.360 182717 INFO nova.virt.libvirt.driver [-] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Instance spawned successfully.
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.362 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.379 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.383 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.392 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.393 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.393 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.394 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.394 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.395 182717 DEBUG nova.virt.libvirt.driver [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.403 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.527 182717 INFO nova.compute.manager [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Took 8.76 seconds to spawn the instance on the hypervisor.
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.527 182717 DEBUG nova.compute.manager [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.528 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.529 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5647MB free_disk=73.25955963134766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.529 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.530 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.640 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 595c8a5a-b43c-4eae-ad91-c7848e0e2f44 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.640 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.641 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.650 182717 INFO nova.compute.manager [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Took 9.68 seconds to build instance.
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.671 182717 DEBUG oslo_concurrency.lockutils [None req-5855d627-d6ad-4b02-bc9d-c3741ae99aa8 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.882s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.697 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.716 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.755 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.755 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:53 compute-1 nova_compute[182713]: 2026-01-22 00:20:53.895 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:55 compute-1 nova_compute[182713]: 2026-01-22 00:20:55.182 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:55 compute-1 nova_compute[182713]: 2026-01-22 00:20:55.756 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:56 compute-1 nova_compute[182713]: 2026-01-22 00:20:56.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:56 compute-1 nova_compute[182713]: 2026-01-22 00:20:56.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:20:56 compute-1 nova_compute[182713]: 2026-01-22 00:20:56.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:20:57 compute-1 nova_compute[182713]: 2026-01-22 00:20:57.190 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-595c8a5a-b43c-4eae-ad91-c7848e0e2f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:20:57 compute-1 nova_compute[182713]: 2026-01-22 00:20:57.190 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-595c8a5a-b43c-4eae-ad91-c7848e0e2f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:20:57 compute-1 nova_compute[182713]: 2026-01-22 00:20:57.190 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:20:57 compute-1 nova_compute[182713]: 2026-01-22 00:20:57.191 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 595c8a5a-b43c-4eae-ad91-c7848e0e2f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:20:58 compute-1 nova_compute[182713]: 2026-01-22 00:20:58.882 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Updating instance_info_cache with network_info: [{"id": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "address": "fa:16:3e:d2:c1:d4", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e4f92dd-59", "ovs_interfaceid": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:58 compute-1 nova_compute[182713]: 2026-01-22 00:20:58.898 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:58 compute-1 nova_compute[182713]: 2026-01-22 00:20:58.906 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-595c8a5a-b43c-4eae-ad91-c7848e0e2f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:20:58 compute-1 nova_compute[182713]: 2026-01-22 00:20:58.907 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:20:58 compute-1 nova_compute[182713]: 2026-01-22 00:20:58.907 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:59 compute-1 ovn_controller[94841]: 2026-01-22T00:20:59Z|00598|binding|INFO|Releasing lport 737a2d1f-ad8c-46d7-ba36-880bbc6b5728 from this chassis (sb_readonly=0)
Jan 22 00:20:59 compute-1 nova_compute[182713]: 2026-01-22 00:20:59.452 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:00 compute-1 nova_compute[182713]: 2026-01-22 00:21:00.184 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:03.029 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:03.030 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:03.030 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:03 compute-1 podman[234375]: 2026-01-22 00:21:03.600254902 +0000 UTC m=+0.088027574 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:21:03 compute-1 nova_compute[182713]: 2026-01-22 00:21:03.937 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:05 compute-1 nova_compute[182713]: 2026-01-22 00:21:05.232 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:05 compute-1 nova_compute[182713]: 2026-01-22 00:21:05.326 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:06 compute-1 ovn_controller[94841]: 2026-01-22T00:21:06Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d2:c1:d4 10.100.0.5
Jan 22 00:21:06 compute-1 ovn_controller[94841]: 2026-01-22T00:21:06Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d2:c1:d4 10.100.0.5
Jan 22 00:21:06 compute-1 podman[234416]: 2026-01-22 00:21:06.589726846 +0000 UTC m=+0.080739348 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7)
Jan 22 00:21:08 compute-1 nova_compute[182713]: 2026-01-22 00:21:08.940 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:10 compute-1 nova_compute[182713]: 2026-01-22 00:21:10.237 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:11 compute-1 nova_compute[182713]: 2026-01-22 00:21:11.181 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.583 182717 DEBUG oslo_concurrency.lockutils [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.584 182717 DEBUG oslo_concurrency.lockutils [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.585 182717 DEBUG oslo_concurrency.lockutils [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.585 182717 DEBUG oslo_concurrency.lockutils [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.586 182717 DEBUG oslo_concurrency.lockutils [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.606 182717 INFO nova.compute.manager [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Terminating instance
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.625 182717 DEBUG nova.compute.manager [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:21:12 compute-1 kernel: tap7e4f92dd-59 (unregistering): left promiscuous mode
Jan 22 00:21:12 compute-1 NetworkManager[54952]: <info>  [1769041272.6472] device (tap7e4f92dd-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:21:12 compute-1 ovn_controller[94841]: 2026-01-22T00:21:12Z|00599|binding|INFO|Releasing lport 7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 from this chassis (sb_readonly=0)
Jan 22 00:21:12 compute-1 ovn_controller[94841]: 2026-01-22T00:21:12Z|00600|binding|INFO|Setting lport 7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 down in Southbound
Jan 22 00:21:12 compute-1 ovn_controller[94841]: 2026-01-22T00:21:12Z|00601|binding|INFO|Removing iface tap7e4f92dd-59 ovn-installed in OVS
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.654 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:12.660 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:c1:d4 10.100.0.5'], port_security=['fa:16:3e:d2:c1:d4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '595c8a5a-b43c-4eae-ad91-c7848e0e2f44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65641ee3-5688-4f52-8e2b-2aae97505b84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9d5746ab-567f-4771-baec-483e6edef99f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3d414ff-1f29-4cd2-96c4-c90cd0d603fc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:21:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:12.662 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 in datapath 65641ee3-5688-4f52-8e2b-2aae97505b84 unbound from our chassis
Jan 22 00:21:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:12.663 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 65641ee3-5688-4f52-8e2b-2aae97505b84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:21:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:12.666 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[72eca3ea-4dc3-4535-8bd1-6bff4e73ffe9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:12.666 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84 namespace which is not needed anymore
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.673 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:12 compute-1 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000097.scope: Deactivated successfully.
Jan 22 00:21:12 compute-1 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000097.scope: Consumed 13.264s CPU time.
Jan 22 00:21:12 compute-1 systemd-machined[153970]: Machine qemu-65-instance-00000097 terminated.
Jan 22 00:21:12 compute-1 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[234315]: [NOTICE]   (234348) : haproxy version is 2.8.14-c23fe91
Jan 22 00:21:12 compute-1 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[234315]: [NOTICE]   (234348) : path to executable is /usr/sbin/haproxy
Jan 22 00:21:12 compute-1 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[234315]: [WARNING]  (234348) : Exiting Master process...
Jan 22 00:21:12 compute-1 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[234315]: [ALERT]    (234348) : Current worker (234358) exited with code 143 (Terminated)
Jan 22 00:21:12 compute-1 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[234315]: [WARNING]  (234348) : All workers exited. Exiting... (0)
Jan 22 00:21:12 compute-1 systemd[1]: libpod-d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee.scope: Deactivated successfully.
Jan 22 00:21:12 compute-1 podman[234464]: 2026-01-22 00:21:12.839246683 +0000 UTC m=+0.057768775 container died d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:21:12 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee-userdata-shm.mount: Deactivated successfully.
Jan 22 00:21:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-a868150231ce45e0fe69ba9499a71dd5109d019e0955bf3bd08317f62c20bc52-merged.mount: Deactivated successfully.
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.894 182717 INFO nova.virt.libvirt.driver [-] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Instance destroyed successfully.
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.895 182717 DEBUG nova.objects.instance [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'resources' on Instance uuid 595c8a5a-b43c-4eae-ad91-c7848e0e2f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.911 182717 DEBUG nova.virt.libvirt.vif [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:20:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-0-721917988',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-0-721917988',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ge',id=151,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOgFr0loz0o97S1yJic425BuuGnqIIzzaQU+1FOWYN8VLWjMOBgkt02kLpdfipR3QnvdUvT3mVD/diPnm35tClCs6BoaTbQN3VWq8tyqhLXUA2JeTkyyUA3yLrgO9t4ag==',key_name='tempest-TestSecurityGroupsBasicOps-1152614963',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:20:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-zvfyb2xc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:20:53Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=595c8a5a-b43c-4eae-ad91-c7848e0e2f44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "address": "fa:16:3e:d2:c1:d4", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e4f92dd-59", "ovs_interfaceid": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.912 182717 DEBUG nova.network.os_vif_util [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "address": "fa:16:3e:d2:c1:d4", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e4f92dd-59", "ovs_interfaceid": "7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.913 182717 DEBUG nova.network.os_vif_util [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d2:c1:d4,bridge_name='br-int',has_traffic_filtering=True,id=7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e4f92dd-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.914 182717 DEBUG os_vif [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:c1:d4,bridge_name='br-int',has_traffic_filtering=True,id=7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e4f92dd-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.917 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.917 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e4f92dd-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.920 182717 DEBUG nova.compute.manager [req-e1f39247-e075-4b0c-b65b-89ef8485d17b req-50400c84-85da-4cb2-a0fc-f1b6c0ab83d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Received event network-vif-unplugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.921 182717 DEBUG oslo_concurrency.lockutils [req-e1f39247-e075-4b0c-b65b-89ef8485d17b req-50400c84-85da-4cb2-a0fc-f1b6c0ab83d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.921 182717 DEBUG oslo_concurrency.lockutils [req-e1f39247-e075-4b0c-b65b-89ef8485d17b req-50400c84-85da-4cb2-a0fc-f1b6c0ab83d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.921 182717 DEBUG oslo_concurrency.lockutils [req-e1f39247-e075-4b0c-b65b-89ef8485d17b req-50400c84-85da-4cb2-a0fc-f1b6c0ab83d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.922 182717 DEBUG nova.compute.manager [req-e1f39247-e075-4b0c-b65b-89ef8485d17b req-50400c84-85da-4cb2-a0fc-f1b6c0ab83d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] No waiting events found dispatching network-vif-unplugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.922 182717 DEBUG nova.compute.manager [req-e1f39247-e075-4b0c-b65b-89ef8485d17b req-50400c84-85da-4cb2-a0fc-f1b6c0ab83d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Received event network-vif-unplugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.963 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.965 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:21:12 compute-1 podman[234464]: 2026-01-22 00:21:12.968832786 +0000 UTC m=+0.187354838 container cleanup d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.970 182717 INFO os_vif [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:c1:d4,bridge_name='br-int',has_traffic_filtering=True,id=7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e4f92dd-59')
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.971 182717 INFO nova.virt.libvirt.driver [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Deleting instance files /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44_del
Jan 22 00:21:12 compute-1 nova_compute[182713]: 2026-01-22 00:21:12.972 182717 INFO nova.virt.libvirt.driver [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Deletion of /var/lib/nova/instances/595c8a5a-b43c-4eae-ad91-c7848e0e2f44_del complete
Jan 22 00:21:12 compute-1 systemd[1]: libpod-conmon-d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee.scope: Deactivated successfully.
Jan 22 00:21:13 compute-1 nova_compute[182713]: 2026-01-22 00:21:13.043 182717 INFO nova.compute.manager [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 00:21:13 compute-1 nova_compute[182713]: 2026-01-22 00:21:13.043 182717 DEBUG oslo.service.loopingcall [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:21:13 compute-1 nova_compute[182713]: 2026-01-22 00:21:13.044 182717 DEBUG nova.compute.manager [-] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:21:13 compute-1 nova_compute[182713]: 2026-01-22 00:21:13.044 182717 DEBUG nova.network.neutron [-] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:21:13 compute-1 podman[234510]: 2026-01-22 00:21:13.063725742 +0000 UTC m=+0.063491043 container remove d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 00:21:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:13.069 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c0db8cd7-5d5e-429a-8d88-5db19804639d]: (4, ('Thu Jan 22 12:21:12 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84 (d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee)\nd0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee\nThu Jan 22 12:21:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84 (d0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee)\nd0693bdcdf806ca5af5571fdcb9041cac340073fb55b84a93bce68c9a16531ee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:13.071 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[accdbc9b-eaaf-4bcf-9b59-c3a9b059f310]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:13.073 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65641ee3-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:13 compute-1 kernel: tap65641ee3-50: left promiscuous mode
Jan 22 00:21:13 compute-1 nova_compute[182713]: 2026-01-22 00:21:13.075 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:13.082 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[266a5aff-d252-4535-bd06-2a35f65551dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:13 compute-1 nova_compute[182713]: 2026-01-22 00:21:13.087 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:13.095 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[21d1ed0c-9ba4-4a94-afd8-71fd7df5e25a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:13.096 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[622e4579-a19a-461b-93f1-d6ed54823ae5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:13.122 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d6c69609-d4da-4d41-9473-6172509b93c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582377, 'reachable_time': 41323, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234525, 'error': None, 'target': 'ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:13.125 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:21:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:13.126 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[f2109ed1-2aa2-48d2-a178-413d6b115d2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:13 compute-1 systemd[1]: run-netns-ovnmeta\x2d65641ee3\x2d5688\x2d4f52\x2d8e2b\x2d2aae97505b84.mount: Deactivated successfully.
Jan 22 00:21:13 compute-1 nova_compute[182713]: 2026-01-22 00:21:13.943 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.216 182717 DEBUG nova.network.neutron [-] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.233 182717 INFO nova.compute.manager [-] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Took 1.19 seconds to deallocate network for instance.
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.287 182717 DEBUG nova.compute.manager [req-ccb52594-7c57-44c1-8abb-e313f84b1b50 req-c5b6f6bb-851d-4a3c-9e79-35482a92d2a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Received event network-vif-deleted-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.319 182717 DEBUG oslo_concurrency.lockutils [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.320 182717 DEBUG oslo_concurrency.lockutils [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.379 182717 DEBUG nova.compute.provider_tree [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.394 182717 DEBUG nova.scheduler.client.report [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.414 182717 DEBUG oslo_concurrency.lockutils [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.442 182717 INFO nova.scheduler.client.report [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Deleted allocations for instance 595c8a5a-b43c-4eae-ad91-c7848e0e2f44
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.547 182717 DEBUG oslo_concurrency.lockutils [None req-a9fefd24-ab17-4788-b147-e9a202b4a5b4 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.997 182717 DEBUG nova.compute.manager [req-493b74bf-f157-46f4-be09-2dc79b00627d req-fd02b171-948e-4e58-af81-ed428d2fbfe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Received event network-vif-plugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.998 182717 DEBUG oslo_concurrency.lockutils [req-493b74bf-f157-46f4-be09-2dc79b00627d req-fd02b171-948e-4e58-af81-ed428d2fbfe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.998 182717 DEBUG oslo_concurrency.lockutils [req-493b74bf-f157-46f4-be09-2dc79b00627d req-fd02b171-948e-4e58-af81-ed428d2fbfe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.998 182717 DEBUG oslo_concurrency.lockutils [req-493b74bf-f157-46f4-be09-2dc79b00627d req-fd02b171-948e-4e58-af81-ed428d2fbfe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "595c8a5a-b43c-4eae-ad91-c7848e0e2f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.999 182717 DEBUG nova.compute.manager [req-493b74bf-f157-46f4-be09-2dc79b00627d req-fd02b171-948e-4e58-af81-ed428d2fbfe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] No waiting events found dispatching network-vif-plugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:14 compute-1 nova_compute[182713]: 2026-01-22 00:21:14.999 182717 WARNING nova.compute.manager [req-493b74bf-f157-46f4-be09-2dc79b00627d req-fd02b171-948e-4e58-af81-ed428d2fbfe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Received unexpected event network-vif-plugged-7e4f92dd-59c8-4dfe-87e4-3a6b4d14aba8 for instance with vm_state deleted and task_state None.
Jan 22 00:21:17 compute-1 podman[234527]: 2026-01-22 00:21:17.55154773 +0000 UTC m=+0.047064343 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:21:17 compute-1 podman[234526]: 2026-01-22 00:21:17.64819648 +0000 UTC m=+0.143475626 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:21:17 compute-1 nova_compute[182713]: 2026-01-22 00:21:17.963 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:18 compute-1 nova_compute[182713]: 2026-01-22 00:21:18.946 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:22 compute-1 podman[234576]: 2026-01-22 00:21:22.567133291 +0000 UTC m=+0.063057438 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 00:21:22 compute-1 podman[234577]: 2026-01-22 00:21:22.584019165 +0000 UTC m=+0.075633929 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:21:22 compute-1 nova_compute[182713]: 2026-01-22 00:21:22.967 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:23 compute-1 nova_compute[182713]: 2026-01-22 00:21:23.424 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:23 compute-1 nova_compute[182713]: 2026-01-22 00:21:23.624 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:23 compute-1 nova_compute[182713]: 2026-01-22 00:21:23.948 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:24 compute-1 nova_compute[182713]: 2026-01-22 00:21:24.669 182717 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Creating tmpfile /var/lib/nova/instances/tmpcnxlrtdu to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 22 00:21:24 compute-1 nova_compute[182713]: 2026-01-22 00:21:24.819 182717 DEBUG nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcnxlrtdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 22 00:21:25 compute-1 nova_compute[182713]: 2026-01-22 00:21:25.966 182717 DEBUG nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcnxlrtdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 22 00:21:26 compute-1 nova_compute[182713]: 2026-01-22 00:21:26.000 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:21:26 compute-1 nova_compute[182713]: 2026-01-22 00:21:26.001 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquired lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:21:26 compute-1 nova_compute[182713]: 2026-01-22 00:21:26.001 182717 DEBUG nova.network.neutron [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:21:27 compute-1 nova_compute[182713]: 2026-01-22 00:21:27.892 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041272.8913355, 595c8a5a-b43c-4eae-ad91-c7848e0e2f44 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:21:27 compute-1 nova_compute[182713]: 2026-01-22 00:21:27.893 182717 INFO nova.compute.manager [-] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] VM Stopped (Lifecycle Event)
Jan 22 00:21:27 compute-1 nova_compute[182713]: 2026-01-22 00:21:27.915 182717 DEBUG nova.compute.manager [None req-0e64ee3f-ac74-4ee7-9f9a-5994c3c37dfb - - - - - -] [instance: 595c8a5a-b43c-4eae-ad91-c7848e0e2f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:27 compute-1 nova_compute[182713]: 2026-01-22 00:21:27.970 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.110 182717 DEBUG nova.network.neutron [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updating instance_info_cache with network_info: [{"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.140 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Releasing lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.155 182717 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcnxlrtdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.155 182717 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Creating instance directory: /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.156 182717 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Creating disk.info with the contents: {'/var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk': 'qcow2', '/var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.156 182717 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.157 182717 DEBUG nova.objects.instance [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'trusted_certs' on Instance uuid dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.186 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.284 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.286 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.287 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.298 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.358 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.360 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.398 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.400 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.400 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.459 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.460 182717 DEBUG nova.virt.disk.api [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Checking if we can resize image /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.461 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.526 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.528 182717 DEBUG nova.virt.disk.api [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Cannot resize image /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.528 182717 DEBUG nova.objects.instance [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'migration_context' on Instance uuid dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.546 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.574 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.config 485376" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.576 182717 DEBUG nova.virt.libvirt.volume.remotefs [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.config to /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.576 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.config /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.950 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.993 182717 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.config /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.994 182717 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.996 182717 DEBUG nova.virt.libvirt.vif [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:20:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-7270034',display_name='tempest-TestNetworkAdvancedServerOps-server-7270034',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-7270034',id=152,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKCbkoaVuA62O+ECqX+7Ohn7GbIbEVQCxvPvCXrKqpOjrukjt8m0tS2UNeW9SghkNu53IZT4aL6S7PqVShWjvxQooRpiJSuxHQi4r5UNidyhtoE0twes7RsZVTczYedVHA==',key_name='tempest-TestNetworkAdvancedServerOps-2075791651',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:20:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1trsnn3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:20:59Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.996 182717 DEBUG nova.network.os_vif_util [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converting VIF {"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.997 182717 DEBUG nova.network.os_vif_util [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.998 182717 DEBUG os_vif [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.998 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.999 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:28 compute-1 nova_compute[182713]: 2026-01-22 00:21:28.999 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:21:29 compute-1 nova_compute[182713]: 2026-01-22 00:21:29.004 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:29 compute-1 nova_compute[182713]: 2026-01-22 00:21:29.005 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2de3f942-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:29 compute-1 nova_compute[182713]: 2026-01-22 00:21:29.006 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2de3f942-69, col_values=(('external_ids', {'iface-id': '2de3f942-6922-4800-9d3a-d06aa1263f44', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:64:5d', 'vm-uuid': 'dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:29 compute-1 nova_compute[182713]: 2026-01-22 00:21:29.008 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:29 compute-1 NetworkManager[54952]: <info>  [1769041289.0099] manager: (tap2de3f942-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Jan 22 00:21:29 compute-1 nova_compute[182713]: 2026-01-22 00:21:29.010 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:21:29 compute-1 nova_compute[182713]: 2026-01-22 00:21:29.015 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:29 compute-1 nova_compute[182713]: 2026-01-22 00:21:29.017 182717 INFO os_vif [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69')
Jan 22 00:21:29 compute-1 nova_compute[182713]: 2026-01-22 00:21:29.017 182717 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 22 00:21:29 compute-1 nova_compute[182713]: 2026-01-22 00:21:29.018 182717 DEBUG nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcnxlrtdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 22 00:21:30 compute-1 nova_compute[182713]: 2026-01-22 00:21:30.241 182717 DEBUG nova.network.neutron [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Port 2de3f942-6922-4800-9d3a-d06aa1263f44 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 22 00:21:30 compute-1 nova_compute[182713]: 2026-01-22 00:21:30.257 182717 DEBUG nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcnxlrtdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 22 00:21:30 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 22 00:21:30 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 22 00:21:30 compute-1 kernel: tap2de3f942-69: entered promiscuous mode
Jan 22 00:21:30 compute-1 NetworkManager[54952]: <info>  [1769041290.6613] manager: (tap2de3f942-69): new Tun device (/org/freedesktop/NetworkManager/Devices/272)
Jan 22 00:21:30 compute-1 ovn_controller[94841]: 2026-01-22T00:21:30Z|00602|binding|INFO|Claiming lport 2de3f942-6922-4800-9d3a-d06aa1263f44 for this additional chassis.
Jan 22 00:21:30 compute-1 ovn_controller[94841]: 2026-01-22T00:21:30Z|00603|binding|INFO|2de3f942-6922-4800-9d3a-d06aa1263f44: Claiming fa:16:3e:13:64:5d 10.100.0.4
Jan 22 00:21:30 compute-1 nova_compute[182713]: 2026-01-22 00:21:30.663 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:30 compute-1 nova_compute[182713]: 2026-01-22 00:21:30.675 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:30 compute-1 NetworkManager[54952]: <info>  [1769041290.6817] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Jan 22 00:21:30 compute-1 nova_compute[182713]: 2026-01-22 00:21:30.680 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:30 compute-1 NetworkManager[54952]: <info>  [1769041290.6825] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Jan 22 00:21:30 compute-1 systemd-udevd[234677]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:21:30 compute-1 NetworkManager[54952]: <info>  [1769041290.7014] device (tap2de3f942-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:21:30 compute-1 NetworkManager[54952]: <info>  [1769041290.7026] device (tap2de3f942-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:21:30 compute-1 systemd-machined[153970]: New machine qemu-66-instance-00000098.
Jan 22 00:21:30 compute-1 systemd[1]: Started Virtual Machine qemu-66-instance-00000098.
Jan 22 00:21:30 compute-1 nova_compute[182713]: 2026-01-22 00:21:30.802 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:30 compute-1 nova_compute[182713]: 2026-01-22 00:21:30.804 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:30 compute-1 nova_compute[182713]: 2026-01-22 00:21:30.827 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:30 compute-1 ovn_controller[94841]: 2026-01-22T00:21:30Z|00604|binding|INFO|Setting lport 2de3f942-6922-4800-9d3a-d06aa1263f44 ovn-installed in OVS
Jan 22 00:21:30 compute-1 nova_compute[182713]: 2026-01-22 00:21:30.837 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:31 compute-1 nova_compute[182713]: 2026-01-22 00:21:31.321 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041291.320573, dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:21:31 compute-1 nova_compute[182713]: 2026-01-22 00:21:31.321 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] VM Started (Lifecycle Event)
Jan 22 00:21:31 compute-1 nova_compute[182713]: 2026-01-22 00:21:31.342 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:32 compute-1 nova_compute[182713]: 2026-01-22 00:21:32.030 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041292.0295062, dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:21:32 compute-1 nova_compute[182713]: 2026-01-22 00:21:32.030 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] VM Resumed (Lifecycle Event)
Jan 22 00:21:32 compute-1 nova_compute[182713]: 2026-01-22 00:21:32.062 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:32 compute-1 nova_compute[182713]: 2026-01-22 00:21:32.069 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:21:32 compute-1 nova_compute[182713]: 2026-01-22 00:21:32.103 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Jan 22 00:21:33 compute-1 ovn_controller[94841]: 2026-01-22T00:21:33Z|00605|binding|INFO|Claiming lport 2de3f942-6922-4800-9d3a-d06aa1263f44 for this chassis.
Jan 22 00:21:33 compute-1 ovn_controller[94841]: 2026-01-22T00:21:33Z|00606|binding|INFO|2de3f942-6922-4800-9d3a-d06aa1263f44: Claiming fa:16:3e:13:64:5d 10.100.0.4
Jan 22 00:21:33 compute-1 ovn_controller[94841]: 2026-01-22T00:21:33Z|00607|binding|INFO|Setting lport 2de3f942-6922-4800-9d3a-d06aa1263f44 up in Southbound
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.195 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:64:5d 10.100.0.4'], port_security=['fa:16:3e:13:64:5d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '10', 'neutron:security_group_ids': '61ee06fa-a63f-42b6-8f38-7bda03f7a2d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5715df4-0e68-4951-9b87-9601d69c7054, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=2de3f942-6922-4800-9d3a-d06aa1263f44) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.197 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 2de3f942-6922-4800-9d3a-d06aa1263f44 in datapath 7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d bound to our chassis
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.200 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.223 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf94573-c486-40f0-9c82-a5d7f7a066f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.225 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7a7a8118-c1 in ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.230 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7a7a8118-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.230 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c39b299d-b28d-4e3a-b53c-ac8a1ab37dc8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.233 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7fddfd17-9c9c-43f7-a796-c1ccb28869b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.251 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[acad22f4-5444-4b54-ad2f-c90be44b28bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.277 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[89447284-8acc-4542-bb15-f95fb215958c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.329 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[774f5ca7-801b-4cca-bde9-c9f55ce10eb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 systemd-udevd[234680]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.337 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[97a0e88f-6f01-473f-b6d2-a2648426692a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 NetworkManager[54952]: <info>  [1769041293.3391] manager: (tap7a7a8118-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/275)
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.388 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c3863f9e-c861-4388-ad6a-ce422777df8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.394 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[60325311-648a-4a45-97b6-1390d3f3eb43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 NetworkManager[54952]: <info>  [1769041293.4286] device (tap7a7a8118-c0): carrier: link connected
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.436 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[38f92287-2545-42ea-842a-d6bb030f5fed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.460 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[560fd993-c701-468c-89ab-5244f152942c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a7a8118-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:58:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 586608, 'reachable_time': 27873, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234726, 'error': None, 'target': 'ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 nova_compute[182713]: 2026-01-22 00:21:33.475 182717 INFO nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Post operation of migration started
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.479 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3d7fafeb-fbdd-4cb5-811b-2ec2b22b52ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1c:58f4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 586608, 'tstamp': 586608}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234727, 'error': None, 'target': 'ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.505 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb1594d-5fd1-4587-891b-0c496d4ba2f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a7a8118-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:58:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 586608, 'reachable_time': 27873, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234728, 'error': None, 'target': 'ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.540 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[37541231-9f77-4356-aed7-6ab8428bac2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.617 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[facacdac-c376-425d-9aa6-22eb31e169a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.618 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a7a8118-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.619 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.619 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a7a8118-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:33 compute-1 NetworkManager[54952]: <info>  [1769041293.6662] manager: (tap7a7a8118-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Jan 22 00:21:33 compute-1 kernel: tap7a7a8118-c0: entered promiscuous mode
Jan 22 00:21:33 compute-1 nova_compute[182713]: 2026-01-22 00:21:33.665 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.669 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7a7a8118-c0, col_values=(('external_ids', {'iface-id': 'd1e02cd9-8126-49f7-b1af-0e9e0399bf45'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:33 compute-1 ovn_controller[94841]: 2026-01-22T00:21:33Z|00608|binding|INFO|Releasing lport d1e02cd9-8126-49f7-b1af-0e9e0399bf45 from this chassis (sb_readonly=0)
Jan 22 00:21:33 compute-1 nova_compute[182713]: 2026-01-22 00:21:33.671 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:33 compute-1 nova_compute[182713]: 2026-01-22 00:21:33.685 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.686 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.687 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[22e0d572-b62a-4485-9d59-46446feb07e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.688 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d.pid.haproxy
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:21:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:33.689 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'env', 'PROCESS_TAG=haproxy-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:21:33 compute-1 nova_compute[182713]: 2026-01-22 00:21:33.900 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:21:33 compute-1 nova_compute[182713]: 2026-01-22 00:21:33.901 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquired lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:21:33 compute-1 nova_compute[182713]: 2026-01-22 00:21:33.901 182717 DEBUG nova.network.neutron [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:21:33 compute-1 nova_compute[182713]: 2026-01-22 00:21:33.954 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:34 compute-1 nova_compute[182713]: 2026-01-22 00:21:34.008 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:34 compute-1 podman[234761]: 2026-01-22 00:21:34.143661709 +0000 UTC m=+0.064679559 container create ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:21:34 compute-1 systemd[1]: Started libpod-conmon-ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261.scope.
Jan 22 00:21:34 compute-1 podman[234761]: 2026-01-22 00:21:34.111868192 +0000 UTC m=+0.032886082 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:21:34 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:21:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d6b4a88a54ff4106c728a19753c3af1a5c7232ae4b7466c1738d9f858fe3330/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:21:34 compute-1 podman[234761]: 2026-01-22 00:21:34.230433153 +0000 UTC m=+0.151451013 container init ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 00:21:34 compute-1 podman[234761]: 2026-01-22 00:21:34.23838781 +0000 UTC m=+0.159405650 container start ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:21:34 compute-1 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[234781]: [NOTICE]   (234795) : New worker (234802) forked
Jan 22 00:21:34 compute-1 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[234781]: [NOTICE]   (234795) : Loading success.
Jan 22 00:21:34 compute-1 podman[234774]: 2026-01-22 00:21:34.277863575 +0000 UTC m=+0.093559126 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:21:35 compute-1 nova_compute[182713]: 2026-01-22 00:21:35.040 182717 DEBUG nova.network.neutron [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updating instance_info_cache with network_info: [{"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:35 compute-1 nova_compute[182713]: 2026-01-22 00:21:35.062 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Releasing lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:21:35 compute-1 nova_compute[182713]: 2026-01-22 00:21:35.109 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:35 compute-1 nova_compute[182713]: 2026-01-22 00:21:35.110 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:35 compute-1 nova_compute[182713]: 2026-01-22 00:21:35.110 182717 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:35 compute-1 nova_compute[182713]: 2026-01-22 00:21:35.118 182717 INFO nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 22 00:21:35 compute-1 virtqemud[182235]: Domain id=66 name='instance-00000098' uuid=dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 is tainted: custom-monitor
Jan 22 00:21:35 compute-1 nova_compute[182713]: 2026-01-22 00:21:35.646 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:36 compute-1 nova_compute[182713]: 2026-01-22 00:21:36.129 182717 INFO nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 22 00:21:37 compute-1 nova_compute[182713]: 2026-01-22 00:21:37.137 182717 INFO nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 22 00:21:37 compute-1 nova_compute[182713]: 2026-01-22 00:21:37.142 182717 DEBUG nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:37 compute-1 nova_compute[182713]: 2026-01-22 00:21:37.319 182717 DEBUG nova.objects.instance [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:21:37 compute-1 podman[234812]: 2026-01-22 00:21:37.61430417 +0000 UTC m=+0.100052157 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Jan 22 00:21:38 compute-1 nova_compute[182713]: 2026-01-22 00:21:38.956 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:39 compute-1 nova_compute[182713]: 2026-01-22 00:21:39.011 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:41 compute-1 nova_compute[182713]: 2026-01-22 00:21:41.395 182717 INFO nova.compute.manager [None req-59e2948e-0a30-4f33-a13a-948640734aa1 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Get console output
Jan 22 00:21:41 compute-1 nova_compute[182713]: 2026-01-22 00:21:41.402 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.172 182717 DEBUG nova.compute.manager [req-cf34dfe1-3796-496f-b6cc-449980d32399 req-b00cd0e6-e3ed-4413-9446-0e12f67a3dab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-changed-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.173 182717 DEBUG nova.compute.manager [req-cf34dfe1-3796-496f-b6cc-449980d32399 req-b00cd0e6-e3ed-4413-9446-0e12f67a3dab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Refreshing instance network info cache due to event network-changed-2de3f942-6922-4800-9d3a-d06aa1263f44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.173 182717 DEBUG oslo_concurrency.lockutils [req-cf34dfe1-3796-496f-b6cc-449980d32399 req-b00cd0e6-e3ed-4413-9446-0e12f67a3dab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.174 182717 DEBUG oslo_concurrency.lockutils [req-cf34dfe1-3796-496f-b6cc-449980d32399 req-b00cd0e6-e3ed-4413-9446-0e12f67a3dab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.174 182717 DEBUG nova.network.neutron [req-cf34dfe1-3796-496f-b6cc-449980d32399 req-b00cd0e6-e3ed-4413-9446-0e12f67a3dab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Refreshing network info cache for port 2de3f942-6922-4800-9d3a-d06aa1263f44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.287 182717 DEBUG oslo_concurrency.lockutils [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.288 182717 DEBUG oslo_concurrency.lockutils [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.288 182717 DEBUG oslo_concurrency.lockutils [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.289 182717 DEBUG oslo_concurrency.lockutils [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.290 182717 DEBUG oslo_concurrency.lockutils [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.306 182717 INFO nova.compute.manager [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Terminating instance
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.322 182717 DEBUG nova.compute.manager [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:21:43 compute-1 kernel: tap2de3f942-69 (unregistering): left promiscuous mode
Jan 22 00:21:43 compute-1 NetworkManager[54952]: <info>  [1769041303.3489] device (tap2de3f942-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:21:43 compute-1 ovn_controller[94841]: 2026-01-22T00:21:43Z|00609|binding|INFO|Releasing lport 2de3f942-6922-4800-9d3a-d06aa1263f44 from this chassis (sb_readonly=0)
Jan 22 00:21:43 compute-1 ovn_controller[94841]: 2026-01-22T00:21:43Z|00610|binding|INFO|Setting lport 2de3f942-6922-4800-9d3a-d06aa1263f44 down in Southbound
Jan 22 00:21:43 compute-1 ovn_controller[94841]: 2026-01-22T00:21:43Z|00611|binding|INFO|Removing iface tap2de3f942-69 ovn-installed in OVS
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.375 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.394 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.394 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:64:5d 10.100.0.4'], port_security=['fa:16:3e:13:64:5d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '12', 'neutron:security_group_ids': '61ee06fa-a63f-42b6-8f38-7bda03f7a2d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5715df4-0e68-4951-9b87-9601d69c7054, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=2de3f942-6922-4800-9d3a-d06aa1263f44) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.397 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 2de3f942-6922-4800-9d3a-d06aa1263f44 in datapath 7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d unbound from our chassis
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.401 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.402 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b81d741e-fe26-49a1-b53b-516653052c4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.403 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d namespace which is not needed anymore
Jan 22 00:21:43 compute-1 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000098.scope: Deactivated successfully.
Jan 22 00:21:43 compute-1 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000098.scope: Consumed 1.594s CPU time.
Jan 22 00:21:43 compute-1 systemd-machined[153970]: Machine qemu-66-instance-00000098 terminated.
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.551 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.558 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:43 compute-1 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[234781]: [NOTICE]   (234795) : haproxy version is 2.8.14-c23fe91
Jan 22 00:21:43 compute-1 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[234781]: [NOTICE]   (234795) : path to executable is /usr/sbin/haproxy
Jan 22 00:21:43 compute-1 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[234781]: [WARNING]  (234795) : Exiting Master process...
Jan 22 00:21:43 compute-1 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[234781]: [ALERT]    (234795) : Current worker (234802) exited with code 143 (Terminated)
Jan 22 00:21:43 compute-1 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[234781]: [WARNING]  (234795) : All workers exited. Exiting... (0)
Jan 22 00:21:43 compute-1 systemd[1]: libpod-ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261.scope: Deactivated successfully.
Jan 22 00:21:43 compute-1 podman[234858]: 2026-01-22 00:21:43.5842724 +0000 UTC m=+0.070398797 container died ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.597 182717 INFO nova.virt.libvirt.driver [-] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Instance destroyed successfully.
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.598 182717 DEBUG nova.objects.instance [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:21:43 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261-userdata-shm.mount: Deactivated successfully.
Jan 22 00:21:43 compute-1 systemd[1]: var-lib-containers-storage-overlay-9d6b4a88a54ff4106c728a19753c3af1a5c7232ae4b7466c1738d9f858fe3330-merged.mount: Deactivated successfully.
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.623 182717 DEBUG nova.virt.libvirt.vif [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:20:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-7270034',display_name='tempest-TestNetworkAdvancedServerOps-server-7270034',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-7270034',id=152,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKCbkoaVuA62O+ECqX+7Ohn7GbIbEVQCxvPvCXrKqpOjrukjt8m0tS2UNeW9SghkNu53IZT4aL6S7PqVShWjvxQooRpiJSuxHQi4r5UNidyhtoE0twes7RsZVTczYedVHA==',key_name='tempest-TestNetworkAdvancedServerOps-2075791651',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:20:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1trsnn3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:21:37Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.623 182717 DEBUG nova.network.os_vif_util [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.624 182717 DEBUG nova.network.os_vif_util [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.624 182717 DEBUG os_vif [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.626 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.626 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2de3f942-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.628 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:43 compute-1 podman[234858]: 2026-01-22 00:21:43.629985408 +0000 UTC m=+0.116111825 container cleanup ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.630 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.633 182717 INFO os_vif [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69')
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.634 182717 INFO nova.virt.libvirt.driver [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Deleting instance files /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2_del
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.635 182717 INFO nova.virt.libvirt.driver [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Deletion of /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2_del complete
Jan 22 00:21:43 compute-1 systemd[1]: libpod-conmon-ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261.scope: Deactivated successfully.
Jan 22 00:21:43 compute-1 podman[234905]: 2026-01-22 00:21:43.700963252 +0000 UTC m=+0.045400460 container remove ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.706 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d2eb6f36-cf19-4bc2-9e25-d38487730517]: (4, ('Thu Jan 22 12:21:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d (ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261)\nff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261\nThu Jan 22 12:21:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d (ff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261)\nff2d01904e3af0778828817b10fb90ca64b117a4426ad33786e97d29e0950261\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.708 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5f87c117-2edc-4824-9368-db05cebc287e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.710 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a7a8118-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.712 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:43 compute-1 kernel: tap7a7a8118-c0: left promiscuous mode
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.725 182717 INFO nova.compute.manager [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.726 182717 DEBUG oslo.service.loopingcall [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.726 182717 DEBUG nova.compute.manager [-] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.727 182717 DEBUG nova.network.neutron [-] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.731 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9875d9d3-635b-4ee4-b16b-edbd212bd852]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.742 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.749 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[14dfb08d-5507-49ea-b34d-22ed9cfca662]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.751 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[62717165-fe70-4dcd-8392-af5291c0b919]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.774 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4eff54-2a5a-4419-8b8f-6d4ce851412d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 586596, 'reachable_time': 38988, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234920, 'error': None, 'target': 'ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.777 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:21:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:43.778 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[c39fe2c6-f45f-4e22-b180-88bbedbc2b68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:43 compute-1 systemd[1]: run-netns-ovnmeta\x2d7a7a8118\x2dcbb9\x2d421a\x2db461\x2d2b8b9d4dfc6d.mount: Deactivated successfully.
Jan 22 00:21:43 compute-1 nova_compute[182713]: 2026-01-22 00:21:43.959 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:44 compute-1 nova_compute[182713]: 2026-01-22 00:21:44.496 182717 DEBUG nova.compute.manager [req-1a900d3c-118f-4ea7-aad4-10a35558e33e req-3bba0463-7023-4621-b940-311c99d3a7d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-unplugged-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:44 compute-1 nova_compute[182713]: 2026-01-22 00:21:44.496 182717 DEBUG oslo_concurrency.lockutils [req-1a900d3c-118f-4ea7-aad4-10a35558e33e req-3bba0463-7023-4621-b940-311c99d3a7d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:44 compute-1 nova_compute[182713]: 2026-01-22 00:21:44.497 182717 DEBUG oslo_concurrency.lockutils [req-1a900d3c-118f-4ea7-aad4-10a35558e33e req-3bba0463-7023-4621-b940-311c99d3a7d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:44 compute-1 nova_compute[182713]: 2026-01-22 00:21:44.497 182717 DEBUG oslo_concurrency.lockutils [req-1a900d3c-118f-4ea7-aad4-10a35558e33e req-3bba0463-7023-4621-b940-311c99d3a7d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:44 compute-1 nova_compute[182713]: 2026-01-22 00:21:44.498 182717 DEBUG nova.compute.manager [req-1a900d3c-118f-4ea7-aad4-10a35558e33e req-3bba0463-7023-4621-b940-311c99d3a7d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] No waiting events found dispatching network-vif-unplugged-2de3f942-6922-4800-9d3a-d06aa1263f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:44 compute-1 nova_compute[182713]: 2026-01-22 00:21:44.498 182717 DEBUG nova.compute.manager [req-1a900d3c-118f-4ea7-aad4-10a35558e33e req-3bba0463-7023-4621-b940-311c99d3a7d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-unplugged-2de3f942-6922-4800-9d3a-d06aa1263f44 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:21:44 compute-1 nova_compute[182713]: 2026-01-22 00:21:44.558 182717 DEBUG nova.network.neutron [req-cf34dfe1-3796-496f-b6cc-449980d32399 req-b00cd0e6-e3ed-4413-9446-0e12f67a3dab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updated VIF entry in instance network info cache for port 2de3f942-6922-4800-9d3a-d06aa1263f44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:21:44 compute-1 nova_compute[182713]: 2026-01-22 00:21:44.559 182717 DEBUG nova.network.neutron [req-cf34dfe1-3796-496f-b6cc-449980d32399 req-b00cd0e6-e3ed-4413-9446-0e12f67a3dab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updating instance_info_cache with network_info: [{"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:44 compute-1 nova_compute[182713]: 2026-01-22 00:21:44.601 182717 DEBUG oslo_concurrency.lockutils [req-cf34dfe1-3796-496f-b6cc-449980d32399 req-b00cd0e6-e3ed-4413-9446-0e12f67a3dab 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.005 182717 DEBUG nova.network.neutron [-] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.028 182717 INFO nova.compute.manager [-] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Took 2.30 seconds to deallocate network for instance.
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.189 182717 DEBUG oslo_concurrency.lockutils [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.191 182717 DEBUG oslo_concurrency.lockutils [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.198 182717 DEBUG oslo_concurrency.lockutils [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.409 182717 INFO nova.scheduler.client.report [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Deleted allocations for instance dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.664 182717 DEBUG nova.compute.manager [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.665 182717 DEBUG oslo_concurrency.lockutils [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.665 182717 DEBUG oslo_concurrency.lockutils [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.665 182717 DEBUG oslo_concurrency.lockutils [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.665 182717 DEBUG nova.compute.manager [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] No waiting events found dispatching network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.666 182717 WARNING nova.compute.manager [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received unexpected event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 for instance with vm_state deleted and task_state None.
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.666 182717 DEBUG nova.compute.manager [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-deleted-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:46 compute-1 nova_compute[182713]: 2026-01-22 00:21:46.678 182717 DEBUG oslo_concurrency.lockutils [None req-af68a5b1-a726-4631-a371-57da143d4b97 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:47 compute-1 nova_compute[182713]: 2026-01-22 00:21:47.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:47 compute-1 nova_compute[182713]: 2026-01-22 00:21:47.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:47 compute-1 nova_compute[182713]: 2026-01-22 00:21:47.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:47 compute-1 nova_compute[182713]: 2026-01-22 00:21:47.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:21:47 compute-1 nova_compute[182713]: 2026-01-22 00:21:47.943 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:48 compute-1 nova_compute[182713]: 2026-01-22 00:21:48.092 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:48 compute-1 podman[234923]: 2026-01-22 00:21:48.580842692 +0000 UTC m=+0.068858579 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:21:48 compute-1 nova_compute[182713]: 2026-01-22 00:21:48.627 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:48 compute-1 podman[234922]: 2026-01-22 00:21:48.635468069 +0000 UTC m=+0.125449118 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:21:48 compute-1 nova_compute[182713]: 2026-01-22 00:21:48.961 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:49 compute-1 nova_compute[182713]: 2026-01-22 00:21:49.853 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:50.110 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:21:50 compute-1 nova_compute[182713]: 2026-01-22 00:21:50.111 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:50.112 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:21:51 compute-1 nova_compute[182713]: 2026-01-22 00:21:51.984 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:52 compute-1 nova_compute[182713]: 2026-01-22 00:21:52.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:52 compute-1 nova_compute[182713]: 2026-01-22 00:21:52.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:53 compute-1 nova_compute[182713]: 2026-01-22 00:21:53.263 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:53 compute-1 nova_compute[182713]: 2026-01-22 00:21:53.263 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:53 compute-1 nova_compute[182713]: 2026-01-22 00:21:53.263 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:53 compute-1 nova_compute[182713]: 2026-01-22 00:21:53.264 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:21:53 compute-1 podman[234974]: 2026-01-22 00:21:53.383732921 +0000 UTC m=+0.067993111 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:21:53 compute-1 podman[234973]: 2026-01-22 00:21:53.406701604 +0000 UTC m=+0.095955909 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:21:53 compute-1 nova_compute[182713]: 2026-01-22 00:21:53.462 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:21:53 compute-1 nova_compute[182713]: 2026-01-22 00:21:53.463 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5692MB free_disk=73.26044082641602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:21:53 compute-1 nova_compute[182713]: 2026-01-22 00:21:53.463 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:53 compute-1 nova_compute[182713]: 2026-01-22 00:21:53.464 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:53 compute-1 nova_compute[182713]: 2026-01-22 00:21:53.630 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:53 compute-1 nova_compute[182713]: 2026-01-22 00:21:53.963 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:54 compute-1 nova_compute[182713]: 2026-01-22 00:21:54.163 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:21:54 compute-1 nova_compute[182713]: 2026-01-22 00:21:54.164 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:21:54 compute-1 nova_compute[182713]: 2026-01-22 00:21:54.197 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:21:54 compute-1 nova_compute[182713]: 2026-01-22 00:21:54.764 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:21:54 compute-1 nova_compute[182713]: 2026-01-22 00:21:54.952 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:21:54 compute-1 nova_compute[182713]: 2026-01-22 00:21:54.953 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.489s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:21:56.115 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:57 compute-1 nova_compute[182713]: 2026-01-22 00:21:57.953 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:57 compute-1 nova_compute[182713]: 2026-01-22 00:21:57.954 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:21:57 compute-1 nova_compute[182713]: 2026-01-22 00:21:57.955 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:21:57 compute-1 nova_compute[182713]: 2026-01-22 00:21:57.978 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:21:57 compute-1 nova_compute[182713]: 2026-01-22 00:21:57.978 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:57 compute-1 nova_compute[182713]: 2026-01-22 00:21:57.979 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:58 compute-1 nova_compute[182713]: 2026-01-22 00:21:58.596 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041303.594831, dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:21:58 compute-1 nova_compute[182713]: 2026-01-22 00:21:58.596 182717 INFO nova.compute.manager [-] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] VM Stopped (Lifecycle Event)
Jan 22 00:21:58 compute-1 nova_compute[182713]: 2026-01-22 00:21:58.709 182717 DEBUG nova.compute.manager [None req-78ae860e-47ba-4346-8ee6-3e0df4f4f116 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:58 compute-1 nova_compute[182713]: 2026-01-22 00:21:58.710 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:58 compute-1 nova_compute[182713]: 2026-01-22 00:21:58.965 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:03.030 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:03.031 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:03.031 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:03 compute-1 nova_compute[182713]: 2026-01-22 00:22:03.762 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:03 compute-1 nova_compute[182713]: 2026-01-22 00:22:03.967 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:04 compute-1 podman[235016]: 2026-01-22 00:22:04.593011769 +0000 UTC m=+0.075256978 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:22:08 compute-1 podman[235036]: 2026-01-22 00:22:08.59324133 +0000 UTC m=+0.092723470 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6)
Jan 22 00:22:08 compute-1 nova_compute[182713]: 2026-01-22 00:22:08.766 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:08 compute-1 nova_compute[182713]: 2026-01-22 00:22:08.968 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:13 compute-1 nova_compute[182713]: 2026-01-22 00:22:13.769 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:13 compute-1 nova_compute[182713]: 2026-01-22 00:22:13.970 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:18 compute-1 nova_compute[182713]: 2026-01-22 00:22:18.773 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:18 compute-1 nova_compute[182713]: 2026-01-22 00:22:18.971 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:19 compute-1 podman[235058]: 2026-01-22 00:22:19.577196271 +0000 UTC m=+0.066490645 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:22:19 compute-1 podman[235057]: 2026-01-22 00:22:19.595925193 +0000 UTC m=+0.095493796 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:22:21 compute-1 nova_compute[182713]: 2026-01-22 00:22:21.865 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "495c8d36-266d-42ae-968f-28046804dcb7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:21 compute-1 nova_compute[182713]: 2026-01-22 00:22:21.866 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:21 compute-1 nova_compute[182713]: 2026-01-22 00:22:21.890 182717 DEBUG nova.compute.manager [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.013 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.014 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.020 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.021 182717 INFO nova.compute.claims [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.483 182717 DEBUG nova.compute.provider_tree [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.503 182717 DEBUG nova.scheduler.client.report [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.545 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.531s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.546 182717 DEBUG nova.compute.manager [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.638 182717 DEBUG nova.compute.manager [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.639 182717 DEBUG nova.network.neutron [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.693 182717 INFO nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:22:22 compute-1 nova_compute[182713]: 2026-01-22 00:22:22.844 182717 DEBUG nova.compute.manager [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.882 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:22:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.094 182717 DEBUG nova.compute.manager [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.095 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.096 182717 INFO nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Creating image(s)
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.096 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.096 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.097 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.110 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.175 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.176 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.177 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.191 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.216 182717 DEBUG nova.policy [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.251 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.252 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.380 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk 1073741824" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.381 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.381 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.439 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.441 182717 DEBUG nova.virt.disk.api [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.441 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.515 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.516 182717 DEBUG nova.virt.disk.api [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.517 182717 DEBUG nova.objects.instance [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid 495c8d36-266d-42ae-968f-28046804dcb7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:23 compute-1 podman[235117]: 2026-01-22 00:22:23.548681251 +0000 UTC m=+0.044973188 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Jan 22 00:22:23 compute-1 podman[235118]: 2026-01-22 00:22:23.552648883 +0000 UTC m=+0.048010521 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.802 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.803 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Ensure instance console log exists: /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.803 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.803 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.804 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.810 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:23 compute-1 nova_compute[182713]: 2026-01-22 00:22:23.975 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:25 compute-1 nova_compute[182713]: 2026-01-22 00:22:25.177 182717 DEBUG nova.network.neutron [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Successfully created port: 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:22:27 compute-1 nova_compute[182713]: 2026-01-22 00:22:27.602 182717 DEBUG nova.network.neutron [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Successfully updated port: 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:22:27 compute-1 nova_compute[182713]: 2026-01-22 00:22:27.631 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:22:27 compute-1 nova_compute[182713]: 2026-01-22 00:22:27.632 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:22:27 compute-1 nova_compute[182713]: 2026-01-22 00:22:27.632 182717 DEBUG nova.network.neutron [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:22:27 compute-1 nova_compute[182713]: 2026-01-22 00:22:27.805 182717 DEBUG nova.compute.manager [req-ce912c3b-c124-4080-a6a9-32d848330626 req-26058e2d-e3cc-4797-b18e-f894e2e49b9c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Received event network-changed-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:27 compute-1 nova_compute[182713]: 2026-01-22 00:22:27.806 182717 DEBUG nova.compute.manager [req-ce912c3b-c124-4080-a6a9-32d848330626 req-26058e2d-e3cc-4797-b18e-f894e2e49b9c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Refreshing instance network info cache due to event network-changed-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:22:27 compute-1 nova_compute[182713]: 2026-01-22 00:22:27.806 182717 DEBUG oslo_concurrency.lockutils [req-ce912c3b-c124-4080-a6a9-32d848330626 req-26058e2d-e3cc-4797-b18e-f894e2e49b9c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:22:27 compute-1 nova_compute[182713]: 2026-01-22 00:22:27.956 182717 DEBUG nova.network.neutron [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:22:28 compute-1 nova_compute[182713]: 2026-01-22 00:22:28.813 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:28 compute-1 nova_compute[182713]: 2026-01-22 00:22:28.977 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.495 182717 DEBUG nova.network.neutron [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Updating instance_info_cache with network_info: [{"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.753 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.754 182717 DEBUG nova.compute.manager [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Instance network_info: |[{"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.754 182717 DEBUG oslo_concurrency.lockutils [req-ce912c3b-c124-4080-a6a9-32d848330626 req-26058e2d-e3cc-4797-b18e-f894e2e49b9c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.755 182717 DEBUG nova.network.neutron [req-ce912c3b-c124-4080-a6a9-32d848330626 req-26058e2d-e3cc-4797-b18e-f894e2e49b9c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Refreshing network info cache for port 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.759 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Start _get_guest_xml network_info=[{"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.764 182717 WARNING nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.771 182717 DEBUG nova.virt.libvirt.host [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.772 182717 DEBUG nova.virt.libvirt.host [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.776 182717 DEBUG nova.virt.libvirt.host [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.777 182717 DEBUG nova.virt.libvirt.host [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.779 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.779 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.780 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.780 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.780 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.781 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.781 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.781 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.782 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.782 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.782 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.782 182717 DEBUG nova.virt.hardware [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.789 182717 DEBUG nova.virt.libvirt.vif [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:22:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-884353453',display_name='tempest-TestNetworkAdvancedServerOps-server-884353453',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-884353453',id=155,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGotZTdEQGRQ7FtPF47WQv8+VDewIY4/N4nNX8FItrUZZclF+5nJcntVPNOC87Q2Kf2jm85PAaaRWchaGkCfaaFMR1OIz+ggaW1GGnvOQXdytdYH1qUy5cdJspAi5mhK2A==',key_name='tempest-TestNetworkAdvancedServerOps-1778186126',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-4uxpb5fa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:22:22Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=495c8d36-266d-42ae-968f-28046804dcb7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.789 182717 DEBUG nova.network.os_vif_util [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.791 182717 DEBUG nova.network.os_vif_util [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:d5:75,bridge_name='br-int',has_traffic_filtering=True,id=6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a,network=Network(cc568949-a996-45b6-b055-c1780ec7685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6bd0c2c0-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.792 182717 DEBUG nova.objects.instance [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid 495c8d36-266d-42ae-968f-28046804dcb7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.990 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:22:29 compute-1 nova_compute[182713]:   <uuid>495c8d36-266d-42ae-968f-28046804dcb7</uuid>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   <name>instance-0000009b</name>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-884353453</nova:name>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:22:29</nova:creationTime>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:22:29 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:22:29 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:22:29 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:22:29 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:22:29 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:22:29 compute-1 nova_compute[182713]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:22:29 compute-1 nova_compute[182713]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:22:29 compute-1 nova_compute[182713]:         <nova:port uuid="6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a">
Jan 22 00:22:29 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <system>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <entry name="serial">495c8d36-266d-42ae-968f-28046804dcb7</entry>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <entry name="uuid">495c8d36-266d-42ae-968f-28046804dcb7</entry>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     </system>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   <os>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   </os>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   <features>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   </features>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk.config"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:7d:d5:75"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <target dev="tap6bd0c2c0-d8"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/console.log" append="off"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <video>
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     </video>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:22:29 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:22:29 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:22:29 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:22:29 compute-1 nova_compute[182713]: </domain>
Jan 22 00:22:29 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.991 182717 DEBUG nova.compute.manager [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Preparing to wait for external event network-vif-plugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.991 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "495c8d36-266d-42ae-968f-28046804dcb7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.991 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.992 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.992 182717 DEBUG nova.virt.libvirt.vif [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:22:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-884353453',display_name='tempest-TestNetworkAdvancedServerOps-server-884353453',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-884353453',id=155,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGotZTdEQGRQ7FtPF47WQv8+VDewIY4/N4nNX8FItrUZZclF+5nJcntVPNOC87Q2Kf2jm85PAaaRWchaGkCfaaFMR1OIz+ggaW1GGnvOQXdytdYH1qUy5cdJspAi5mhK2A==',key_name='tempest-TestNetworkAdvancedServerOps-1778186126',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-4uxpb5fa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:22:22Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=495c8d36-266d-42ae-968f-28046804dcb7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.993 182717 DEBUG nova.network.os_vif_util [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.993 182717 DEBUG nova.network.os_vif_util [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:d5:75,bridge_name='br-int',has_traffic_filtering=True,id=6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a,network=Network(cc568949-a996-45b6-b055-c1780ec7685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6bd0c2c0-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.993 182717 DEBUG os_vif [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:d5:75,bridge_name='br-int',has_traffic_filtering=True,id=6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a,network=Network(cc568949-a996-45b6-b055-c1780ec7685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6bd0c2c0-d8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.994 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.994 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.995 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.999 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.999 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6bd0c2c0-d8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:29 compute-1 nova_compute[182713]: 2026-01-22 00:22:29.999 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6bd0c2c0-d8, col_values=(('external_ids', {'iface-id': '6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:d5:75', 'vm-uuid': '495c8d36-266d-42ae-968f-28046804dcb7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:30 compute-1 nova_compute[182713]: 2026-01-22 00:22:30.001 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:30 compute-1 NetworkManager[54952]: <info>  [1769041350.0025] manager: (tap6bd0c2c0-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Jan 22 00:22:30 compute-1 nova_compute[182713]: 2026-01-22 00:22:30.004 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:22:30 compute-1 nova_compute[182713]: 2026-01-22 00:22:30.012 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:30 compute-1 nova_compute[182713]: 2026-01-22 00:22:30.013 182717 INFO os_vif [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:d5:75,bridge_name='br-int',has_traffic_filtering=True,id=6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a,network=Network(cc568949-a996-45b6-b055-c1780ec7685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6bd0c2c0-d8')
Jan 22 00:22:30 compute-1 nova_compute[182713]: 2026-01-22 00:22:30.134 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:22:30 compute-1 nova_compute[182713]: 2026-01-22 00:22:30.134 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:22:30 compute-1 nova_compute[182713]: 2026-01-22 00:22:30.135 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No VIF found with MAC fa:16:3e:7d:d5:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:22:30 compute-1 nova_compute[182713]: 2026-01-22 00:22:30.136 182717 INFO nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Using config drive
Jan 22 00:22:30 compute-1 nova_compute[182713]: 2026-01-22 00:22:30.732 182717 INFO nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Creating config drive at /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk.config
Jan 22 00:22:30 compute-1 nova_compute[182713]: 2026-01-22 00:22:30.738 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvo9qeoyr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.377 182717 DEBUG oslo_concurrency.processutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvo9qeoyr" returned: 0 in 0.639s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:31 compute-1 kernel: tap6bd0c2c0-d8: entered promiscuous mode
Jan 22 00:22:31 compute-1 NetworkManager[54952]: <info>  [1769041351.4535] manager: (tap6bd0c2c0-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/278)
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.454 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:31 compute-1 ovn_controller[94841]: 2026-01-22T00:22:31Z|00612|binding|INFO|Claiming lport 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a for this chassis.
Jan 22 00:22:31 compute-1 ovn_controller[94841]: 2026-01-22T00:22:31Z|00613|binding|INFO|6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a: Claiming fa:16:3e:7d:d5:75 10.100.0.6
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.458 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.468 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:d5:75 10.100.0.6'], port_security=['fa:16:3e:7d:d5:75 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '495c8d36-266d-42ae-968f-28046804dcb7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '762053a0-b433-495c-a60f-4015d21965ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e45a905-ef69-47b8-b157-96af9472b990, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.469 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a in datapath cc568949-a996-45b6-b055-c1780ec7685a bound to our chassis
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.470 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc568949-a996-45b6-b055-c1780ec7685a
Jan 22 00:22:31 compute-1 systemd-udevd[235177]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.483 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[482da8fc-f3b0-4268-8ebe-4365c08c5252]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.484 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcc568949-a1 in ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.486 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcc568949-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.486 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d4bc2b6d-b29c-49da-beba-8bacb472d66d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.488 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b934994c-af1e-4bbe-a217-4e3c9497931f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 NetworkManager[54952]: <info>  [1769041351.4936] device (tap6bd0c2c0-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:22:31 compute-1 NetworkManager[54952]: <info>  [1769041351.4943] device (tap6bd0c2c0-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.498 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[0c39319d-b27a-4fc0-a2bd-5859d638dcc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 systemd-machined[153970]: New machine qemu-67-instance-0000009b.
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.515 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:31 compute-1 ovn_controller[94841]: 2026-01-22T00:22:31Z|00614|binding|INFO|Setting lport 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a ovn-installed in OVS
Jan 22 00:22:31 compute-1 ovn_controller[94841]: 2026-01-22T00:22:31Z|00615|binding|INFO|Setting lport 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a up in Southbound
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.521 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.524 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0179a15d-f267-45bc-9bb3-d98af2c3d959]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 systemd[1]: Started Virtual Machine qemu-67-instance-0000009b.
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.556 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[4990a2f5-af59-4ef8-a082-7b6a7d398893]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.561 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[47e79572-b767-4c2b-9fa4-ad82a7a57d5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 NetworkManager[54952]: <info>  [1769041351.5625] manager: (tapcc568949-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/279)
Jan 22 00:22:31 compute-1 systemd-udevd[235182]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.591 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[84eb6da5-e353-40c9-91cc-edf22ea405e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.594 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f865c89c-a1e6-48a2-a0f7-b557f5ed7165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.606 182717 DEBUG nova.network.neutron [req-ce912c3b-c124-4080-a6a9-32d848330626 req-26058e2d-e3cc-4797-b18e-f894e2e49b9c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Updated VIF entry in instance network info cache for port 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.607 182717 DEBUG nova.network.neutron [req-ce912c3b-c124-4080-a6a9-32d848330626 req-26058e2d-e3cc-4797-b18e-f894e2e49b9c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Updating instance_info_cache with network_info: [{"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:22:31 compute-1 NetworkManager[54952]: <info>  [1769041351.6153] device (tapcc568949-a0): carrier: link connected
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.620 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[34146c27-eb11-440b-8edc-1d01b856210e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.638 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[62b5d8e6-a95c-498a-9950-1f91a59eb821]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc568949-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:95:d5:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592426, 'reachable_time': 23839, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235212, 'error': None, 'target': 'ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.655 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1847f439-9c82-414d-93f5-82097b7bbbe4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe95:d5ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 592426, 'tstamp': 592426}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235213, 'error': None, 'target': 'ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.672 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5b05d962-85e9-4515-9e8b-b29a6778b3be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc568949-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:95:d5:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592426, 'reachable_time': 23839, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235214, 'error': None, 'target': 'ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.705 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[42e55ce0-7782-4f84-95dd-e3bd96af8ae2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.761 182717 DEBUG oslo_concurrency.lockutils [req-ce912c3b-c124-4080-a6a9-32d848330626 req-26058e2d-e3cc-4797-b18e-f894e2e49b9c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.772 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d124e374-993f-4680-8228-3ab8c83a537f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.774 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc568949-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.775 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.776 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc568949-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:31 compute-1 NetworkManager[54952]: <info>  [1769041351.7788] manager: (tapcc568949-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.779 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:31 compute-1 kernel: tapcc568949-a0: entered promiscuous mode
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.781 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.784 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc568949-a0, col_values=(('external_ids', {'iface-id': '7c217807-262b-45e7-a62c-ca33e3f039ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.786 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:31 compute-1 ovn_controller[94841]: 2026-01-22T00:22:31Z|00616|binding|INFO|Releasing lport 7c217807-262b-45e7-a62c-ca33e3f039ed from this chassis (sb_readonly=0)
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.791 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc568949-a996-45b6-b055-c1780ec7685a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc568949-a996-45b6-b055-c1780ec7685a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.798 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.797 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4a9af8c4-7b67-4b26-8310-7c65cf5639c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.799 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-cc568949-a996-45b6-b055-c1780ec7685a
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/cc568949-a996-45b6-b055-c1780ec7685a.pid.haproxy
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID cc568949-a996-45b6-b055-c1780ec7685a
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:22:31 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:31.801 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a', 'env', 'PROCESS_TAG=haproxy-cc568949-a996-45b6-b055-c1780ec7685a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cc568949-a996-45b6-b055-c1780ec7685a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.973 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041351.9725978, 495c8d36-266d-42ae-968f-28046804dcb7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:31 compute-1 nova_compute[182713]: 2026-01-22 00:22:31.973 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] VM Started (Lifecycle Event)
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.095 182717 DEBUG nova.compute.manager [req-467e56c4-58ac-4473-b892-42eeff077ff3 req-96158f7d-a6c9-40b3-abf6-5416f34152bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Received event network-vif-plugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.095 182717 DEBUG oslo_concurrency.lockutils [req-467e56c4-58ac-4473-b892-42eeff077ff3 req-96158f7d-a6c9-40b3-abf6-5416f34152bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "495c8d36-266d-42ae-968f-28046804dcb7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.095 182717 DEBUG oslo_concurrency.lockutils [req-467e56c4-58ac-4473-b892-42eeff077ff3 req-96158f7d-a6c9-40b3-abf6-5416f34152bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.095 182717 DEBUG oslo_concurrency.lockutils [req-467e56c4-58ac-4473-b892-42eeff077ff3 req-96158f7d-a6c9-40b3-abf6-5416f34152bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.096 182717 DEBUG nova.compute.manager [req-467e56c4-58ac-4473-b892-42eeff077ff3 req-96158f7d-a6c9-40b3-abf6-5416f34152bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Processing event network-vif-plugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.096 182717 DEBUG nova.compute.manager [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.100 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.103 182717 INFO nova.virt.libvirt.driver [-] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Instance spawned successfully.
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.104 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.116 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.121 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:32 compute-1 podman[235252]: 2026-01-22 00:22:32.292682596 +0000 UTC m=+0.116739035 container create 7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 00:22:32 compute-1 podman[235252]: 2026-01-22 00:22:32.202122805 +0000 UTC m=+0.026179254 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:22:32 compute-1 systemd[1]: Started libpod-conmon-7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50.scope.
Jan 22 00:22:32 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:22:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1b53ccffc547685850d8805160fd463aa750f9228c8f9ff6a2c594703db7adf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:22:32 compute-1 podman[235252]: 2026-01-22 00:22:32.392900907 +0000 UTC m=+0.216957326 container init 7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:22:32 compute-1 podman[235252]: 2026-01-22 00:22:32.397963965 +0000 UTC m=+0.222020364 container start 7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.402 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.402 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041351.9727752, 495c8d36-266d-42ae-968f-28046804dcb7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.403 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] VM Paused (Lifecycle Event)
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.410 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.410 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.412 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.413 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.414 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.415 182717 DEBUG nova.virt.libvirt.driver [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:32 compute-1 neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a[235267]: [NOTICE]   (235271) : New worker (235273) forked
Jan 22 00:22:32 compute-1 neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a[235267]: [NOTICE]   (235271) : Loading success.
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.461 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.464 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041352.0991297, 495c8d36-266d-42ae-968f-28046804dcb7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.464 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] VM Resumed (Lifecycle Event)
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.580 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.588 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.624 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.650 182717 INFO nova.compute.manager [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Took 9.56 seconds to spawn the instance on the hypervisor.
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.650 182717 DEBUG nova.compute.manager [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.841 182717 INFO nova.compute.manager [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Took 10.88 seconds to build instance.
Jan 22 00:22:32 compute-1 nova_compute[182713]: 2026-01-22 00:22:32.867 182717 DEBUG oslo_concurrency.lockutils [None req-e4a337f8-9a95-4d24-a242-d34e65fd7be8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:33 compute-1 nova_compute[182713]: 2026-01-22 00:22:33.980 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:34 compute-1 nova_compute[182713]: 2026-01-22 00:22:34.546 182717 DEBUG nova.compute.manager [req-e9d6e96b-dc23-4892-9463-f5f5ebfbd57e req-462a3f0a-6244-411a-b5c6-96fa151b7d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Received event network-vif-plugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:34 compute-1 nova_compute[182713]: 2026-01-22 00:22:34.546 182717 DEBUG oslo_concurrency.lockutils [req-e9d6e96b-dc23-4892-9463-f5f5ebfbd57e req-462a3f0a-6244-411a-b5c6-96fa151b7d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "495c8d36-266d-42ae-968f-28046804dcb7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:34 compute-1 nova_compute[182713]: 2026-01-22 00:22:34.547 182717 DEBUG oslo_concurrency.lockutils [req-e9d6e96b-dc23-4892-9463-f5f5ebfbd57e req-462a3f0a-6244-411a-b5c6-96fa151b7d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:34 compute-1 nova_compute[182713]: 2026-01-22 00:22:34.547 182717 DEBUG oslo_concurrency.lockutils [req-e9d6e96b-dc23-4892-9463-f5f5ebfbd57e req-462a3f0a-6244-411a-b5c6-96fa151b7d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:34 compute-1 nova_compute[182713]: 2026-01-22 00:22:34.547 182717 DEBUG nova.compute.manager [req-e9d6e96b-dc23-4892-9463-f5f5ebfbd57e req-462a3f0a-6244-411a-b5c6-96fa151b7d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] No waiting events found dispatching network-vif-plugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:34 compute-1 nova_compute[182713]: 2026-01-22 00:22:34.547 182717 WARNING nova.compute.manager [req-e9d6e96b-dc23-4892-9463-f5f5ebfbd57e req-462a3f0a-6244-411a-b5c6-96fa151b7d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Received unexpected event network-vif-plugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a for instance with vm_state active and task_state None.
Jan 22 00:22:35 compute-1 nova_compute[182713]: 2026-01-22 00:22:35.003 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:35 compute-1 podman[235282]: 2026-01-22 00:22:35.585093844 +0000 UTC m=+0.071133228 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:22:39 compute-1 nova_compute[182713]: 2026-01-22 00:22:39.005 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:39 compute-1 podman[235302]: 2026-01-22 00:22:39.590119025 +0000 UTC m=+0.081968836 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41)
Jan 22 00:22:40 compute-1 nova_compute[182713]: 2026-01-22 00:22:40.006 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:40 compute-1 NetworkManager[54952]: <info>  [1769041360.0793] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Jan 22 00:22:40 compute-1 nova_compute[182713]: 2026-01-22 00:22:40.077 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:40 compute-1 NetworkManager[54952]: <info>  [1769041360.0804] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Jan 22 00:22:40 compute-1 ovn_controller[94841]: 2026-01-22T00:22:40Z|00617|binding|INFO|Releasing lport 7c217807-262b-45e7-a62c-ca33e3f039ed from this chassis (sb_readonly=0)
Jan 22 00:22:40 compute-1 nova_compute[182713]: 2026-01-22 00:22:40.093 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:40 compute-1 nova_compute[182713]: 2026-01-22 00:22:40.099 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:40 compute-1 nova_compute[182713]: 2026-01-22 00:22:40.386 182717 DEBUG nova.compute.manager [req-19c9d920-ec83-421a-93e9-74fc03db7770 req-a06604e7-8569-4523-95f6-dc9556833c93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Received event network-changed-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:40 compute-1 nova_compute[182713]: 2026-01-22 00:22:40.386 182717 DEBUG nova.compute.manager [req-19c9d920-ec83-421a-93e9-74fc03db7770 req-a06604e7-8569-4523-95f6-dc9556833c93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Refreshing instance network info cache due to event network-changed-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:22:40 compute-1 nova_compute[182713]: 2026-01-22 00:22:40.387 182717 DEBUG oslo_concurrency.lockutils [req-19c9d920-ec83-421a-93e9-74fc03db7770 req-a06604e7-8569-4523-95f6-dc9556833c93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:22:40 compute-1 nova_compute[182713]: 2026-01-22 00:22:40.387 182717 DEBUG oslo_concurrency.lockutils [req-19c9d920-ec83-421a-93e9-74fc03db7770 req-a06604e7-8569-4523-95f6-dc9556833c93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:22:40 compute-1 nova_compute[182713]: 2026-01-22 00:22:40.387 182717 DEBUG nova.network.neutron [req-19c9d920-ec83-421a-93e9-74fc03db7770 req-a06604e7-8569-4523-95f6-dc9556833c93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Refreshing network info cache for port 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:22:40 compute-1 nova_compute[182713]: 2026-01-22 00:22:40.439 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:42 compute-1 nova_compute[182713]: 2026-01-22 00:22:42.178 182717 DEBUG nova.network.neutron [req-19c9d920-ec83-421a-93e9-74fc03db7770 req-a06604e7-8569-4523-95f6-dc9556833c93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Updated VIF entry in instance network info cache for port 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:22:42 compute-1 nova_compute[182713]: 2026-01-22 00:22:42.179 182717 DEBUG nova.network.neutron [req-19c9d920-ec83-421a-93e9-74fc03db7770 req-a06604e7-8569-4523-95f6-dc9556833c93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Updating instance_info_cache with network_info: [{"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:22:42 compute-1 nova_compute[182713]: 2026-01-22 00:22:42.242 182717 DEBUG oslo_concurrency.lockutils [req-19c9d920-ec83-421a-93e9-74fc03db7770 req-a06604e7-8569-4523-95f6-dc9556833c93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:22:43 compute-1 nova_compute[182713]: 2026-01-22 00:22:43.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:44 compute-1 nova_compute[182713]: 2026-01-22 00:22:44.007 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:45 compute-1 nova_compute[182713]: 2026-01-22 00:22:45.009 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:45 compute-1 ovn_controller[94841]: 2026-01-22T00:22:45Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7d:d5:75 10.100.0.6
Jan 22 00:22:45 compute-1 ovn_controller[94841]: 2026-01-22T00:22:45Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7d:d5:75 10.100.0.6
Jan 22 00:22:47 compute-1 nova_compute[182713]: 2026-01-22 00:22:47.882 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:47 compute-1 nova_compute[182713]: 2026-01-22 00:22:47.882 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:47 compute-1 nova_compute[182713]: 2026-01-22 00:22:47.883 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:22:47 compute-1 nova_compute[182713]: 2026-01-22 00:22:47.904 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:22:48 compute-1 nova_compute[182713]: 2026-01-22 00:22:48.877 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:48 compute-1 nova_compute[182713]: 2026-01-22 00:22:48.878 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:48 compute-1 nova_compute[182713]: 2026-01-22 00:22:48.878 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:22:49 compute-1 nova_compute[182713]: 2026-01-22 00:22:49.009 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:50 compute-1 nova_compute[182713]: 2026-01-22 00:22:50.011 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:50 compute-1 podman[235346]: 2026-01-22 00:22:50.592638315 +0000 UTC m=+0.072705028 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:22:50 compute-1 podman[235345]: 2026-01-22 00:22:50.64115566 +0000 UTC m=+0.124621339 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:22:51 compute-1 nova_compute[182713]: 2026-01-22 00:22:51.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:52 compute-1 nova_compute[182713]: 2026-01-22 00:22:52.036 182717 INFO nova.compute.manager [None req-5c9e4a7e-a6f6-49d8-93b9-04a2d6997af2 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Get console output
Jan 22 00:22:52 compute-1 nova_compute[182713]: 2026-01-22 00:22:52.044 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:22:52 compute-1 nova_compute[182713]: 2026-01-22 00:22:52.420 182717 INFO nova.compute.manager [None req-3ab5ec1e-0980-4389-988b-cf023ae0135c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Pausing
Jan 22 00:22:52 compute-1 nova_compute[182713]: 2026-01-22 00:22:52.421 182717 DEBUG nova.objects.instance [None req-3ab5ec1e-0980-4389-988b-cf023ae0135c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'flavor' on Instance uuid 495c8d36-266d-42ae-968f-28046804dcb7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:52 compute-1 nova_compute[182713]: 2026-01-22 00:22:52.472 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041372.471651, 495c8d36-266d-42ae-968f-28046804dcb7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:52 compute-1 nova_compute[182713]: 2026-01-22 00:22:52.472 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] VM Paused (Lifecycle Event)
Jan 22 00:22:52 compute-1 nova_compute[182713]: 2026-01-22 00:22:52.474 182717 DEBUG nova.compute.manager [None req-3ab5ec1e-0980-4389-988b-cf023ae0135c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:52 compute-1 nova_compute[182713]: 2026-01-22 00:22:52.507 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:52 compute-1 nova_compute[182713]: 2026-01-22 00:22:52.510 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:52 compute-1 nova_compute[182713]: 2026-01-22 00:22:52.538 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] During sync_power_state the instance has a pending task (pausing). Skip.
Jan 22 00:22:54 compute-1 nova_compute[182713]: 2026-01-22 00:22:54.011 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:54 compute-1 podman[235396]: 2026-01-22 00:22:54.580594656 +0000 UTC m=+0.063622876 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 00:22:54 compute-1 podman[235397]: 2026-01-22 00:22:54.593239109 +0000 UTC m=+0.078914261 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:22:54 compute-1 nova_compute[182713]: 2026-01-22 00:22:54.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:54 compute-1 nova_compute[182713]: 2026-01-22 00:22:54.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:54 compute-1 nova_compute[182713]: 2026-01-22 00:22:54.892 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:54 compute-1 nova_compute[182713]: 2026-01-22 00:22:54.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:54 compute-1 nova_compute[182713]: 2026-01-22 00:22:54.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:54 compute-1 nova_compute[182713]: 2026-01-22 00:22:54.893 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:22:54 compute-1 nova_compute[182713]: 2026-01-22 00:22:54.967 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.013 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.031 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.032 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.099 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.299 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.301 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5511MB free_disk=73.23186874389648GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.301 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.301 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.429 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 495c8d36-266d-42ae-968f-28046804dcb7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.430 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.430 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.472 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.492 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.516 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.517 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.836 182717 INFO nova.compute.manager [None req-765d3424-63ea-4995-8a78-6ec099f57bc0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Get console output
Jan 22 00:22:55 compute-1 nova_compute[182713]: 2026-01-22 00:22:55.841 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:22:56 compute-1 nova_compute[182713]: 2026-01-22 00:22:56.058 182717 INFO nova.compute.manager [None req-28294557-6adc-4742-9c78-2e4e5f63df40 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Unpausing
Jan 22 00:22:56 compute-1 nova_compute[182713]: 2026-01-22 00:22:56.059 182717 DEBUG nova.objects.instance [None req-28294557-6adc-4742-9c78-2e4e5f63df40 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'flavor' on Instance uuid 495c8d36-266d-42ae-968f-28046804dcb7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:56 compute-1 nova_compute[182713]: 2026-01-22 00:22:56.103 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041376.1032383, 495c8d36-266d-42ae-968f-28046804dcb7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:56 compute-1 nova_compute[182713]: 2026-01-22 00:22:56.104 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] VM Resumed (Lifecycle Event)
Jan 22 00:22:56 compute-1 virtqemud[182235]: argument unsupported: QEMU guest agent is not configured
Jan 22 00:22:56 compute-1 nova_compute[182713]: 2026-01-22 00:22:56.107 182717 DEBUG nova.virt.libvirt.guest [None req-28294557-6adc-4742-9c78-2e4e5f63df40 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 00:22:56 compute-1 nova_compute[182713]: 2026-01-22 00:22:56.108 182717 DEBUG nova.compute.manager [None req-28294557-6adc-4742-9c78-2e4e5f63df40 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:56 compute-1 nova_compute[182713]: 2026-01-22 00:22:56.134 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:56 compute-1 nova_compute[182713]: 2026-01-22 00:22:56.139 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:56 compute-1 nova_compute[182713]: 2026-01-22 00:22:56.171 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] During sync_power_state the instance has a pending task (unpausing). Skip.
Jan 22 00:22:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:56.709 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:22:56 compute-1 nova_compute[182713]: 2026-01-22 00:22:56.710 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:22:56.712 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:22:57 compute-1 nova_compute[182713]: 2026-01-22 00:22:57.518 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:57 compute-1 nova_compute[182713]: 2026-01-22 00:22:57.519 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:22:57 compute-1 nova_compute[182713]: 2026-01-22 00:22:57.519 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:22:57 compute-1 nova_compute[182713]: 2026-01-22 00:22:57.742 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:22:57 compute-1 nova_compute[182713]: 2026-01-22 00:22:57.743 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:22:57 compute-1 nova_compute[182713]: 2026-01-22 00:22:57.743 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:22:57 compute-1 nova_compute[182713]: 2026-01-22 00:22:57.744 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 495c8d36-266d-42ae-968f-28046804dcb7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:59 compute-1 nova_compute[182713]: 2026-01-22 00:22:59.015 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:59 compute-1 nova_compute[182713]: 2026-01-22 00:22:59.326 182717 INFO nova.compute.manager [None req-565187cd-03c4-4d15-a098-7642c6a3f6b8 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Get console output
Jan 22 00:22:59 compute-1 nova_compute[182713]: 2026-01-22 00:22:59.332 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:22:59 compute-1 nova_compute[182713]: 2026-01-22 00:22:59.360 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Updating instance_info_cache with network_info: [{"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:22:59 compute-1 nova_compute[182713]: 2026-01-22 00:22:59.382 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:22:59 compute-1 nova_compute[182713]: 2026-01-22 00:22:59.383 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:22:59 compute-1 nova_compute[182713]: 2026-01-22 00:22:59.384 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:59 compute-1 nova_compute[182713]: 2026-01-22 00:22:59.384 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:59 compute-1 nova_compute[182713]: 2026-01-22 00:22:59.385 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:59 compute-1 nova_compute[182713]: 2026-01-22 00:22:59.385 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.015 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.060 182717 DEBUG oslo_concurrency.lockutils [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "495c8d36-266d-42ae-968f-28046804dcb7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.061 182717 DEBUG oslo_concurrency.lockutils [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.062 182717 DEBUG oslo_concurrency.lockutils [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "495c8d36-266d-42ae-968f-28046804dcb7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.063 182717 DEBUG oslo_concurrency.lockutils [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.063 182717 DEBUG oslo_concurrency.lockutils [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.079 182717 INFO nova.compute.manager [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Terminating instance
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.093 182717 DEBUG nova.compute.manager [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:23:00 compute-1 kernel: tap6bd0c2c0-d8 (unregistering): left promiscuous mode
Jan 22 00:23:00 compute-1 NetworkManager[54952]: <info>  [1769041380.1184] device (tap6bd0c2c0-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:23:00 compute-1 ovn_controller[94841]: 2026-01-22T00:23:00Z|00618|binding|INFO|Releasing lport 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a from this chassis (sb_readonly=0)
Jan 22 00:23:00 compute-1 ovn_controller[94841]: 2026-01-22T00:23:00Z|00619|binding|INFO|Setting lport 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a down in Southbound
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.122 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:00 compute-1 ovn_controller[94841]: 2026-01-22T00:23:00Z|00620|binding|INFO|Removing iface tap6bd0c2c0-d8 ovn-installed in OVS
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.150 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:d5:75 10.100.0.6'], port_security=['fa:16:3e:7d:d5:75 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '495c8d36-266d-42ae-968f-28046804dcb7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '4', 'neutron:security_group_ids': '762053a0-b433-495c-a60f-4015d21965ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e45a905-ef69-47b8-b157-96af9472b990, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.152 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a in datapath cc568949-a996-45b6-b055-c1780ec7685a unbound from our chassis
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.155 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc568949-a996-45b6-b055-c1780ec7685a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.157 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[852a807f-75e1-4123-bc99-7153f120122e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.159 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.159 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a namespace which is not needed anymore
Jan 22 00:23:00 compute-1 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Jan 22 00:23:00 compute-1 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000009b.scope: Consumed 13.187s CPU time.
Jan 22 00:23:00 compute-1 systemd-machined[153970]: Machine qemu-67-instance-0000009b terminated.
Jan 22 00:23:00 compute-1 neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a[235267]: [NOTICE]   (235271) : haproxy version is 2.8.14-c23fe91
Jan 22 00:23:00 compute-1 neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a[235267]: [NOTICE]   (235271) : path to executable is /usr/sbin/haproxy
Jan 22 00:23:00 compute-1 neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a[235267]: [WARNING]  (235271) : Exiting Master process...
Jan 22 00:23:00 compute-1 neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a[235267]: [ALERT]    (235271) : Current worker (235273) exited with code 143 (Terminated)
Jan 22 00:23:00 compute-1 neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a[235267]: [WARNING]  (235271) : All workers exited. Exiting... (0)
Jan 22 00:23:00 compute-1 systemd[1]: libpod-7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50.scope: Deactivated successfully.
Jan 22 00:23:00 compute-1 podman[235469]: 2026-01-22 00:23:00.300866894 +0000 UTC m=+0.051423008 container died 7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:23:00 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50-userdata-shm.mount: Deactivated successfully.
Jan 22 00:23:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-a1b53ccffc547685850d8805160fd463aa750f9228c8f9ff6a2c594703db7adf-merged.mount: Deactivated successfully.
Jan 22 00:23:00 compute-1 podman[235469]: 2026-01-22 00:23:00.36099756 +0000 UTC m=+0.111553654 container cleanup 7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.381 182717 INFO nova.virt.libvirt.driver [-] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Instance destroyed successfully.
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.381 182717 DEBUG nova.objects.instance [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid 495c8d36-266d-42ae-968f-28046804dcb7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:23:00 compute-1 systemd[1]: libpod-conmon-7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50.scope: Deactivated successfully.
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.401 182717 DEBUG nova.virt.libvirt.vif [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:22:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-884353453',display_name='tempest-TestNetworkAdvancedServerOps-server-884353453',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-884353453',id=155,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGotZTdEQGRQ7FtPF47WQv8+VDewIY4/N4nNX8FItrUZZclF+5nJcntVPNOC87Q2Kf2jm85PAaaRWchaGkCfaaFMR1OIz+ggaW1GGnvOQXdytdYH1qUy5cdJspAi5mhK2A==',key_name='tempest-TestNetworkAdvancedServerOps-1778186126',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:22:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-4uxpb5fa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:22:56Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=495c8d36-266d-42ae-968f-28046804dcb7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.402 182717 DEBUG nova.network.os_vif_util [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.404 182717 DEBUG nova.network.os_vif_util [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7d:d5:75,bridge_name='br-int',has_traffic_filtering=True,id=6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a,network=Network(cc568949-a996-45b6-b055-c1780ec7685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6bd0c2c0-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.404 182717 DEBUG os_vif [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:d5:75,bridge_name='br-int',has_traffic_filtering=True,id=6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a,network=Network(cc568949-a996-45b6-b055-c1780ec7685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6bd0c2c0-d8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.407 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.408 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6bd0c2c0-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.410 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.412 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.420 182717 INFO os_vif [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:d5:75,bridge_name='br-int',has_traffic_filtering=True,id=6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a,network=Network(cc568949-a996-45b6-b055-c1780ec7685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6bd0c2c0-d8')
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.421 182717 INFO nova.virt.libvirt.driver [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Deleting instance files /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7_del
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.422 182717 INFO nova.virt.libvirt.driver [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Deletion of /var/lib/nova/instances/495c8d36-266d-42ae-968f-28046804dcb7_del complete
Jan 22 00:23:00 compute-1 podman[235513]: 2026-01-22 00:23:00.438307631 +0000 UTC m=+0.048757945 container remove 7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.444 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d6bbdd1a-95ce-468a-bf52-1e44f01b7f4b]: (4, ('Thu Jan 22 12:23:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a (7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50)\n7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50\nThu Jan 22 12:23:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a (7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50)\n7b96546fbad670fa012f0c819377bd07440f9cf8c4d3de95fec803cfeebb0d50\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.446 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b04ced28-b82a-49ed-9467-3e871e799ab0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.447 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc568949-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.450 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:00 compute-1 kernel: tapcc568949-a0: left promiscuous mode
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.462 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.466 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[da8ece0a-5848-4e65-88d1-6deab8fe236c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.488 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[887662c5-1d20-43c7-bdda-487284c791be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.490 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[289bcd89-29c6-4b7a-9530-12b4030af7ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.508 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8a040f58-8984-47c4-a10c-c07eb2039ee3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592420, 'reachable_time': 38563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235526, 'error': None, 'target': 'ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.512 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.513 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[95cc55dc-e820-4964-8726-62dbbc1fdbcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:00 compute-1 systemd[1]: run-netns-ovnmeta\x2dcc568949\x2da996\x2d45b6\x2db055\x2dc1780ec7685a.mount: Deactivated successfully.
Jan 22 00:23:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:00.714 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.770 182717 DEBUG nova.compute.manager [req-ef4722ee-4366-4e51-89fb-79e0bbeca385 req-9331f923-acaf-4c85-ad4a-4be3a4885085 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Received event network-vif-unplugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.771 182717 DEBUG oslo_concurrency.lockutils [req-ef4722ee-4366-4e51-89fb-79e0bbeca385 req-9331f923-acaf-4c85-ad4a-4be3a4885085 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "495c8d36-266d-42ae-968f-28046804dcb7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.771 182717 DEBUG oslo_concurrency.lockutils [req-ef4722ee-4366-4e51-89fb-79e0bbeca385 req-9331f923-acaf-4c85-ad4a-4be3a4885085 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.771 182717 DEBUG oslo_concurrency.lockutils [req-ef4722ee-4366-4e51-89fb-79e0bbeca385 req-9331f923-acaf-4c85-ad4a-4be3a4885085 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.771 182717 DEBUG nova.compute.manager [req-ef4722ee-4366-4e51-89fb-79e0bbeca385 req-9331f923-acaf-4c85-ad4a-4be3a4885085 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] No waiting events found dispatching network-vif-unplugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.772 182717 DEBUG nova.compute.manager [req-ef4722ee-4366-4e51-89fb-79e0bbeca385 req-9331f923-acaf-4c85-ad4a-4be3a4885085 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Received event network-vif-unplugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.781 182717 DEBUG nova.compute.manager [req-e7939d4a-3ff3-489d-bd2e-69fa35957b3c req-0b6c9a6f-3d8c-4be8-a805-85e81a4d6d3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Received event network-changed-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.781 182717 DEBUG nova.compute.manager [req-e7939d4a-3ff3-489d-bd2e-69fa35957b3c req-0b6c9a6f-3d8c-4be8-a805-85e81a4d6d3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Refreshing instance network info cache due to event network-changed-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.782 182717 DEBUG oslo_concurrency.lockutils [req-e7939d4a-3ff3-489d-bd2e-69fa35957b3c req-0b6c9a6f-3d8c-4be8-a805-85e81a4d6d3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.782 182717 DEBUG oslo_concurrency.lockutils [req-e7939d4a-3ff3-489d-bd2e-69fa35957b3c req-0b6c9a6f-3d8c-4be8-a805-85e81a4d6d3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.782 182717 DEBUG nova.network.neutron [req-e7939d4a-3ff3-489d-bd2e-69fa35957b3c req-0b6c9a6f-3d8c-4be8-a805-85e81a4d6d3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Refreshing network info cache for port 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.823 182717 INFO nova.compute.manager [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Took 0.73 seconds to destroy the instance on the hypervisor.
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.824 182717 DEBUG oslo.service.loopingcall [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.824 182717 DEBUG nova.compute.manager [-] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:23:00 compute-1 nova_compute[182713]: 2026-01-22 00:23:00.824 182717 DEBUG nova.network.neutron [-] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:23:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:03.031 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:03.032 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:03.032 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.017 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.053 182717 DEBUG nova.compute.manager [req-3efed428-279c-4c5b-a30c-81fd982c3f80 req-1cccbe2b-0552-440b-a946-87158706bd76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Received event network-vif-plugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.053 182717 DEBUG oslo_concurrency.lockutils [req-3efed428-279c-4c5b-a30c-81fd982c3f80 req-1cccbe2b-0552-440b-a946-87158706bd76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "495c8d36-266d-42ae-968f-28046804dcb7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.053 182717 DEBUG oslo_concurrency.lockutils [req-3efed428-279c-4c5b-a30c-81fd982c3f80 req-1cccbe2b-0552-440b-a946-87158706bd76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.053 182717 DEBUG oslo_concurrency.lockutils [req-3efed428-279c-4c5b-a30c-81fd982c3f80 req-1cccbe2b-0552-440b-a946-87158706bd76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.054 182717 DEBUG nova.compute.manager [req-3efed428-279c-4c5b-a30c-81fd982c3f80 req-1cccbe2b-0552-440b-a946-87158706bd76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] No waiting events found dispatching network-vif-plugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.054 182717 WARNING nova.compute.manager [req-3efed428-279c-4c5b-a30c-81fd982c3f80 req-1cccbe2b-0552-440b-a946-87158706bd76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Received unexpected event network-vif-plugged-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a for instance with vm_state active and task_state deleting.
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.599 182717 DEBUG nova.network.neutron [-] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.619 182717 INFO nova.compute.manager [-] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Took 3.80 seconds to deallocate network for instance.
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.676 182717 DEBUG nova.compute.manager [req-dc2e70ee-e1c4-4101-8c46-d59f7dc69368 req-df442df2-1ba2-4f4c-9fdc-d09d67a7d75e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Received event network-vif-deleted-6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.698 182717 DEBUG oslo_concurrency.lockutils [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.698 182717 DEBUG oslo_concurrency.lockutils [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.759 182717 DEBUG nova.compute.provider_tree [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.771 182717 DEBUG nova.scheduler.client.report [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.796 182717 DEBUG oslo_concurrency.lockutils [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.835 182717 INFO nova.scheduler.client.report [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Deleted allocations for instance 495c8d36-266d-42ae-968f-28046804dcb7
Jan 22 00:23:04 compute-1 nova_compute[182713]: 2026-01-22 00:23:04.939 182717 DEBUG oslo_concurrency.lockutils [None req-2d185cc2-d869-4bd2-971a-5352efa3b1fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "495c8d36-266d-42ae-968f-28046804dcb7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.878s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:05 compute-1 nova_compute[182713]: 2026-01-22 00:23:05.411 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:05 compute-1 nova_compute[182713]: 2026-01-22 00:23:05.713 182717 DEBUG nova.network.neutron [req-e7939d4a-3ff3-489d-bd2e-69fa35957b3c req-0b6c9a6f-3d8c-4be8-a805-85e81a4d6d3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Updated VIF entry in instance network info cache for port 6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:23:05 compute-1 nova_compute[182713]: 2026-01-22 00:23:05.714 182717 DEBUG nova.network.neutron [req-e7939d4a-3ff3-489d-bd2e-69fa35957b3c req-0b6c9a6f-3d8c-4be8-a805-85e81a4d6d3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Updating instance_info_cache with network_info: [{"id": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "address": "fa:16:3e:7d:d5:75", "network": {"id": "cc568949-a996-45b6-b055-c1780ec7685a", "bridge": "br-int", "label": "tempest-network-smoke--1194520279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd0c2c0-d8", "ovs_interfaceid": "6bd0c2c0-d87d-4e29-ae42-754b1c4a6c3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:23:05 compute-1 nova_compute[182713]: 2026-01-22 00:23:05.787 182717 DEBUG oslo_concurrency.lockutils [req-e7939d4a-3ff3-489d-bd2e-69fa35957b3c req-0b6c9a6f-3d8c-4be8-a805-85e81a4d6d3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-495c8d36-266d-42ae-968f-28046804dcb7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:23:06 compute-1 podman[235527]: 2026-01-22 00:23:06.577689319 +0000 UTC m=+0.063039859 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:23:09 compute-1 nova_compute[182713]: 2026-01-22 00:23:09.019 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:10 compute-1 nova_compute[182713]: 2026-01-22 00:23:10.414 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:10 compute-1 podman[235548]: 2026-01-22 00:23:10.596449606 +0000 UTC m=+0.094017949 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Jan 22 00:23:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:10.891 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:d5:ab'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e45a905-ef69-47b8-b157-96af9472b990, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7c217807-262b-45e7-a62c-ca33e3f039ed) old=Port_Binding(mac=['fa:16:3e:95:d5:ab 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:23:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:10.892 104184 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7c217807-262b-45e7-a62c-ca33e3f039ed in datapath cc568949-a996-45b6-b055-c1780ec7685a updated
Jan 22 00:23:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:10.893 104184 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc568949-a996-45b6-b055-c1780ec7685a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:23:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:10.894 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[aba95760-da59-4341-8fc8-2cda80e424de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:11 compute-1 nova_compute[182713]: 2026-01-22 00:23:11.164 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:11 compute-1 nova_compute[182713]: 2026-01-22 00:23:11.268 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:14 compute-1 nova_compute[182713]: 2026-01-22 00:23:14.021 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:15 compute-1 nova_compute[182713]: 2026-01-22 00:23:15.379 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041380.3772924, 495c8d36-266d-42ae-968f-28046804dcb7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:23:15 compute-1 nova_compute[182713]: 2026-01-22 00:23:15.381 182717 INFO nova.compute.manager [-] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] VM Stopped (Lifecycle Event)
Jan 22 00:23:15 compute-1 nova_compute[182713]: 2026-01-22 00:23:15.461 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:15 compute-1 nova_compute[182713]: 2026-01-22 00:23:15.729 182717 DEBUG nova.compute.manager [None req-0a7ee155-a4ed-4e3f-a60d-866e04297d97 - - - - - -] [instance: 495c8d36-266d-42ae-968f-28046804dcb7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:23:19 compute-1 nova_compute[182713]: 2026-01-22 00:23:19.023 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:20 compute-1 nova_compute[182713]: 2026-01-22 00:23:20.464 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:21 compute-1 podman[235571]: 2026-01-22 00:23:21.562366567 +0000 UTC m=+0.053157131 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:23:21 compute-1 podman[235570]: 2026-01-22 00:23:21.592908405 +0000 UTC m=+0.087538687 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:23:24 compute-1 nova_compute[182713]: 2026-01-22 00:23:24.026 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:25 compute-1 nova_compute[182713]: 2026-01-22 00:23:25.467 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:25 compute-1 podman[235621]: 2026-01-22 00:23:25.570004149 +0000 UTC m=+0.052514312 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:23:25 compute-1 podman[235620]: 2026-01-22 00:23:25.570000959 +0000 UTC m=+0.059265721 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 00:23:29 compute-1 nova_compute[182713]: 2026-01-22 00:23:29.027 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:30 compute-1 nova_compute[182713]: 2026-01-22 00:23:30.469 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:32 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 22 00:23:34 compute-1 nova_compute[182713]: 2026-01-22 00:23:34.028 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:35 compute-1 nova_compute[182713]: 2026-01-22 00:23:35.472 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:37 compute-1 podman[235665]: 2026-01-22 00:23:37.574008358 +0000 UTC m=+0.062900044 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:23:37 compute-1 nova_compute[182713]: 2026-01-22 00:23:37.950 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:37 compute-1 nova_compute[182713]: 2026-01-22 00:23:37.951 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:38 compute-1 nova_compute[182713]: 2026-01-22 00:23:38.334 182717 DEBUG nova.compute.manager [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:23:38 compute-1 nova_compute[182713]: 2026-01-22 00:23:38.819 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:38 compute-1 nova_compute[182713]: 2026-01-22 00:23:38.820 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:38 compute-1 nova_compute[182713]: 2026-01-22 00:23:38.828 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:23:38 compute-1 nova_compute[182713]: 2026-01-22 00:23:38.828 182717 INFO nova.compute.claims [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.031 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.232 182717 DEBUG nova.compute.provider_tree [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.254 182717 DEBUG nova.scheduler.client.report [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.280 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.281 182717 DEBUG nova.compute.manager [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.499 182717 DEBUG nova.compute.manager [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.500 182717 DEBUG nova.network.neutron [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.519 182717 INFO nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.538 182717 DEBUG nova.compute.manager [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.683 182717 DEBUG nova.compute.manager [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.685 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.686 182717 INFO nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Creating image(s)
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.687 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.687 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.689 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.714 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.807 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.808 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.809 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.824 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.855 182717 DEBUG nova.policy [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.898 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.899 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.933 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.934 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:39 compute-1 nova_compute[182713]: 2026-01-22 00:23:39.934 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.004 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.006 182717 DEBUG nova.virt.disk.api [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.007 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.083 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.084 182717 DEBUG nova.virt.disk.api [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.084 182717 DEBUG nova.objects.instance [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid f457f1e4-8770-4c44-a061-214acbc43199 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.189 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.190 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Ensure instance console log exists: /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.190 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.191 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.191 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:40 compute-1 nova_compute[182713]: 2026-01-22 00:23:40.475 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:41 compute-1 podman[235701]: 2026-01-22 00:23:41.606386398 +0000 UTC m=+0.087238160 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Jan 22 00:23:44 compute-1 nova_compute[182713]: 2026-01-22 00:23:44.034 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:45 compute-1 nova_compute[182713]: 2026-01-22 00:23:45.478 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:46 compute-1 nova_compute[182713]: 2026-01-22 00:23:46.461 182717 DEBUG nova.network.neutron [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Successfully created port: ccbc0d4c-cf7a-4220-a948-0aeade60dbdb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:23:47 compute-1 nova_compute[182713]: 2026-01-22 00:23:47.875 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:47 compute-1 nova_compute[182713]: 2026-01-22 00:23:47.996 182717 DEBUG nova.network.neutron [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Successfully updated port: ccbc0d4c-cf7a-4220-a948-0aeade60dbdb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:23:48 compute-1 nova_compute[182713]: 2026-01-22 00:23:48.017 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:23:48 compute-1 nova_compute[182713]: 2026-01-22 00:23:48.018 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:23:48 compute-1 nova_compute[182713]: 2026-01-22 00:23:48.018 182717 DEBUG nova.network.neutron [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:23:48 compute-1 nova_compute[182713]: 2026-01-22 00:23:48.247 182717 DEBUG nova.compute.manager [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-changed-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:48 compute-1 nova_compute[182713]: 2026-01-22 00:23:48.248 182717 DEBUG nova.compute.manager [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Refreshing instance network info cache due to event network-changed-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:23:48 compute-1 nova_compute[182713]: 2026-01-22 00:23:48.248 182717 DEBUG oslo_concurrency.lockutils [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:23:48 compute-1 nova_compute[182713]: 2026-01-22 00:23:48.449 182717 DEBUG nova.network.neutron [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.035 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.491 182717 DEBUG nova.network.neutron [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updating instance_info_cache with network_info: [{"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.533 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.534 182717 DEBUG nova.compute.manager [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Instance network_info: |[{"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.534 182717 DEBUG oslo_concurrency.lockutils [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.535 182717 DEBUG nova.network.neutron [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Refreshing network info cache for port ccbc0d4c-cf7a-4220-a948-0aeade60dbdb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.538 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Start _get_guest_xml network_info=[{"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.544 182717 WARNING nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.554 182717 DEBUG nova.virt.libvirt.host [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.555 182717 DEBUG nova.virt.libvirt.host [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.566 182717 DEBUG nova.virt.libvirt.host [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.567 182717 DEBUG nova.virt.libvirt.host [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.569 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.569 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.570 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.570 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.570 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.571 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.571 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.572 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.572 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.572 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.573 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.573 182717 DEBUG nova.virt.hardware [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.578 182717 DEBUG nova.virt.libvirt.vif [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1667524508',display_name='tempest-TestNetworkAdvancedServerOps-server-1667524508',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1667524508',id=159,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHHg+wJV0gNqY7bKALzLqoL0K2EZGAuyjbK8gKQ82X6sT1DieFqLVmwGBVxxlHgI5XE+x3/Jhn+MN25qB+bFchaETh8kb1GrgpW5jMhoanYzyulPU1uvIFw8Ac5CluludA==',key_name='tempest-TestNetworkAdvancedServerOps-913147749',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-an3dhcmb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:23:39Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=f457f1e4-8770-4c44-a061-214acbc43199,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.578 182717 DEBUG nova.network.os_vif_util [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.579 182717 DEBUG nova.network.os_vif_util [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:d8:57,bridge_name='br-int',has_traffic_filtering=True,id=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb,network=Network(c72b0076-9848-49ed-9b2e-d2fe36ac5e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccbc0d4c-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.580 182717 DEBUG nova.objects.instance [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid f457f1e4-8770-4c44-a061-214acbc43199 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.596 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:23:49 compute-1 nova_compute[182713]:   <uuid>f457f1e4-8770-4c44-a061-214acbc43199</uuid>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   <name>instance-0000009f</name>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1667524508</nova:name>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:23:49</nova:creationTime>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:23:49 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:23:49 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:23:49 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:23:49 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:23:49 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:23:49 compute-1 nova_compute[182713]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:23:49 compute-1 nova_compute[182713]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:23:49 compute-1 nova_compute[182713]:         <nova:port uuid="ccbc0d4c-cf7a-4220-a948-0aeade60dbdb">
Jan 22 00:23:49 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <system>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <entry name="serial">f457f1e4-8770-4c44-a061-214acbc43199</entry>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <entry name="uuid">f457f1e4-8770-4c44-a061-214acbc43199</entry>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     </system>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   <os>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   </os>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   <features>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   </features>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk.config"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:17:d8:57"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <target dev="tapccbc0d4c-cf"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/console.log" append="off"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <video>
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     </video>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:23:49 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:23:49 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:23:49 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:23:49 compute-1 nova_compute[182713]: </domain>
Jan 22 00:23:49 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.598 182717 DEBUG nova.compute.manager [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Preparing to wait for external event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.599 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.600 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.600 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.601 182717 DEBUG nova.virt.libvirt.vif [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1667524508',display_name='tempest-TestNetworkAdvancedServerOps-server-1667524508',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1667524508',id=159,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHHg+wJV0gNqY7bKALzLqoL0K2EZGAuyjbK8gKQ82X6sT1DieFqLVmwGBVxxlHgI5XE+x3/Jhn+MN25qB+bFchaETh8kb1GrgpW5jMhoanYzyulPU1uvIFw8Ac5CluludA==',key_name='tempest-TestNetworkAdvancedServerOps-913147749',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-an3dhcmb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:23:39Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=f457f1e4-8770-4c44-a061-214acbc43199,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.601 182717 DEBUG nova.network.os_vif_util [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.602 182717 DEBUG nova.network.os_vif_util [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:d8:57,bridge_name='br-int',has_traffic_filtering=True,id=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb,network=Network(c72b0076-9848-49ed-9b2e-d2fe36ac5e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccbc0d4c-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.602 182717 DEBUG os_vif [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:d8:57,bridge_name='br-int',has_traffic_filtering=True,id=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb,network=Network(c72b0076-9848-49ed-9b2e-d2fe36ac5e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccbc0d4c-cf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.603 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.603 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.604 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.607 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.608 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapccbc0d4c-cf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.608 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapccbc0d4c-cf, col_values=(('external_ids', {'iface-id': 'ccbc0d4c-cf7a-4220-a948-0aeade60dbdb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:d8:57', 'vm-uuid': 'f457f1e4-8770-4c44-a061-214acbc43199'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:49 compute-1 NetworkManager[54952]: <info>  [1769041429.6159] manager: (tapccbc0d4c-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/283)
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.615 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.619 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.622 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.624 182717 INFO os_vif [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:d8:57,bridge_name='br-int',has_traffic_filtering=True,id=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb,network=Network(c72b0076-9848-49ed-9b2e-d2fe36ac5e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccbc0d4c-cf')
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.694 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.694 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.695 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No VIF found with MAC fa:16:3e:17:d8:57, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:23:49 compute-1 nova_compute[182713]: 2026-01-22 00:23:49.696 182717 INFO nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Using config drive
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.072 182717 INFO nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Creating config drive at /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk.config
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.078 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3do1so5w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.210 182717 DEBUG oslo_concurrency.processutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3do1so5w" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:50 compute-1 kernel: tapccbc0d4c-cf: entered promiscuous mode
Jan 22 00:23:50 compute-1 NetworkManager[54952]: <info>  [1769041430.2826] manager: (tapccbc0d4c-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/284)
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.283 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:50 compute-1 ovn_controller[94841]: 2026-01-22T00:23:50Z|00621|binding|INFO|Claiming lport ccbc0d4c-cf7a-4220-a948-0aeade60dbdb for this chassis.
Jan 22 00:23:50 compute-1 ovn_controller[94841]: 2026-01-22T00:23:50Z|00622|binding|INFO|ccbc0d4c-cf7a-4220-a948-0aeade60dbdb: Claiming fa:16:3e:17:d8:57 10.100.0.12
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.287 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.300 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.310 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:d8:57 10.100.0.12'], port_security=['fa:16:3e:17:d8:57 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c9ca07a4-cd9c-4730-b243-d5bdfe31822a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c6698c9-b140-4a4b-89f4-0ea800814cda, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.311 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ccbc0d4c-cf7a-4220-a948-0aeade60dbdb in datapath c72b0076-9848-49ed-9b2e-d2fe36ac5e52 bound to our chassis
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.312 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c72b0076-9848-49ed-9b2e-d2fe36ac5e52
Jan 22 00:23:50 compute-1 systemd-udevd[235740]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:23:50 compute-1 NetworkManager[54952]: <info>  [1769041430.3237] device (tapccbc0d4c-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:23:50 compute-1 NetworkManager[54952]: <info>  [1769041430.3246] device (tapccbc0d4c-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:23:50 compute-1 systemd-machined[153970]: New machine qemu-68-instance-0000009f.
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.327 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[38c1abc6-6173-493e-ba65-f1fdc09afba7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.329 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc72b0076-91 in ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.331 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc72b0076-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.331 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9056ef0a-c952-452f-a9b9-13b59a819224]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.332 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ec541caf-47f5-4cac-adda-f9644ea2ede0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.344 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[062afcca-1948-4541-84d3-023e8c31978c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 systemd[1]: Started Virtual Machine qemu-68-instance-0000009f.
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.364 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.364 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0603a02f-3e7e-479c-a212-f2b5de313496]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_controller[94841]: 2026-01-22T00:23:50Z|00623|binding|INFO|Setting lport ccbc0d4c-cf7a-4220-a948-0aeade60dbdb ovn-installed in OVS
Jan 22 00:23:50 compute-1 ovn_controller[94841]: 2026-01-22T00:23:50Z|00624|binding|INFO|Setting lport ccbc0d4c-cf7a-4220-a948-0aeade60dbdb up in Southbound
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.368 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.390 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c9356958-d925-4dba-abe2-c7893a360917]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.395 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0499f141-2cef-4f1f-97bf-917c8eb26b89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 NetworkManager[54952]: <info>  [1769041430.3970] manager: (tapc72b0076-90): new Veth device (/org/freedesktop/NetworkManager/Devices/285)
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.422 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[6b993e14-79c6-4541-9c47-1ce9b8b1f38b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.425 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f996eed4-f11d-4f69-92f4-a8917cbcd52f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 NetworkManager[54952]: <info>  [1769041430.4454] device (tapc72b0076-90): carrier: link connected
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.449 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a33f9f91-57b4-436d-84ed-bb2db405bdc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.464 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae52962-2735-41bb-8323-64ea61fe55e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc72b0076-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:a4:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600309, 'reachable_time': 35442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235776, 'error': None, 'target': 'ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.478 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[cd48ddac-d6aa-4b31-9913-e994a05b6e3d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe07:a4d3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600309, 'tstamp': 600309}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235777, 'error': None, 'target': 'ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.494 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2c38eb3f-bcae-467c-9150-560627f99401]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc72b0076-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:a4:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600309, 'reachable_time': 35442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235778, 'error': None, 'target': 'ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.526 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd30123-ae25-4ce5-97c7-6579e7f2fba5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.591 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4bcae9f3-eea1-46ab-a7d2-ac3febe2974a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.594 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc72b0076-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.595 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.596 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc72b0076-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.599 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:50 compute-1 kernel: tapc72b0076-90: entered promiscuous mode
Jan 22 00:23:50 compute-1 NetworkManager[54952]: <info>  [1769041430.6007] manager: (tapc72b0076-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.604 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc72b0076-90, col_values=(('external_ids', {'iface-id': '329d205a-2611-48b0-b77c-7d020bb0a3df'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.604 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.606 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:50 compute-1 ovn_controller[94841]: 2026-01-22T00:23:50Z|00625|binding|INFO|Releasing lport 329d205a-2611-48b0-b77c-7d020bb0a3df from this chassis (sb_readonly=0)
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.625 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.626 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c72b0076-9848-49ed-9b2e-d2fe36ac5e52.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c72b0076-9848-49ed-9b2e-d2fe36ac5e52.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.627 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b624c0c6-fec1-45b1-89e3-6d5e93716520]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.628 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-c72b0076-9848-49ed-9b2e-d2fe36ac5e52
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/c72b0076-9848-49ed-9b2e-d2fe36ac5e52.pid.haproxy
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID c72b0076-9848-49ed-9b2e-d2fe36ac5e52
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:23:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:50.630 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'env', 'PROCESS_TAG=haproxy-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c72b0076-9848-49ed-9b2e-d2fe36ac5e52.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.863 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.863 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.865 182717 DEBUG nova.compute.manager [req-23f086db-d31e-44bd-9c8d-5eec6f9cb7be req-905b52ab-8aa5-4776-8dd0-a6a7b3eaf910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.865 182717 DEBUG oslo_concurrency.lockutils [req-23f086db-d31e-44bd-9c8d-5eec6f9cb7be req-905b52ab-8aa5-4776-8dd0-a6a7b3eaf910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.866 182717 DEBUG oslo_concurrency.lockutils [req-23f086db-d31e-44bd-9c8d-5eec6f9cb7be req-905b52ab-8aa5-4776-8dd0-a6a7b3eaf910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.866 182717 DEBUG oslo_concurrency.lockutils [req-23f086db-d31e-44bd-9c8d-5eec6f9cb7be req-905b52ab-8aa5-4776-8dd0-a6a7b3eaf910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:50 compute-1 nova_compute[182713]: 2026-01-22 00:23:50.866 182717 DEBUG nova.compute.manager [req-23f086db-d31e-44bd-9c8d-5eec6f9cb7be req-905b52ab-8aa5-4776-8dd0-a6a7b3eaf910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Processing event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:23:51 compute-1 podman[235812]: 2026-01-22 00:23:51.006234032 +0000 UTC m=+0.065298078 container create cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.010 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041431.009665, f457f1e4-8770-4c44-a061-214acbc43199 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.011 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] VM Started (Lifecycle Event)
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.015 182717 DEBUG nova.compute.manager [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.022 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.027 182717 INFO nova.virt.libvirt.driver [-] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Instance spawned successfully.
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.028 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.038 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.040 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:23:51 compute-1 systemd[1]: Started libpod-conmon-cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470.scope.
Jan 22 00:23:51 compute-1 podman[235812]: 2026-01-22 00:23:50.965009673 +0000 UTC m=+0.024073809 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.061 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.061 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.062 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.063 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.063 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.064 182717 DEBUG nova.virt.libvirt.driver [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.068 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.069 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041431.009967, f457f1e4-8770-4c44-a061-214acbc43199 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.069 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] VM Paused (Lifecycle Event)
Jan 22 00:23:51 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:23:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8056c9ce0a71047f425ec98124f461b802c446dcb84d58d095fdf2605efca3e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.123 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.132 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041431.019996, f457f1e4-8770-4c44-a061-214acbc43199 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.132 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] VM Resumed (Lifecycle Event)
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.158 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.161 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.193 182717 INFO nova.compute.manager [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Took 11.51 seconds to spawn the instance on the hypervisor.
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.194 182717 DEBUG nova.compute.manager [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.200 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.217 182717 DEBUG nova.network.neutron [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updated VIF entry in instance network info cache for port ccbc0d4c-cf7a-4220-a948-0aeade60dbdb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.218 182717 DEBUG nova.network.neutron [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updating instance_info_cache with network_info: [{"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.242 182717 DEBUG oslo_concurrency.lockutils [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.273 182717 INFO nova.compute.manager [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Took 12.51 seconds to build instance.
Jan 22 00:23:51 compute-1 podman[235812]: 2026-01-22 00:23:51.277253306 +0000 UTC m=+0.336317362 container init cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:23:51 compute-1 podman[235812]: 2026-01-22 00:23:51.282673334 +0000 UTC m=+0.341737380 container start cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.294 182717 DEBUG oslo_concurrency.lockutils [None req-64ea3630-5b6c-40fe-a401-c7e8701b66d4 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:51 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[235832]: [NOTICE]   (235836) : New worker (235838) forked
Jan 22 00:23:51 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[235832]: [NOTICE]   (235836) : Loading success.
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.861 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:51 compute-1 nova_compute[182713]: 2026-01-22 00:23:51.862 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:52 compute-1 podman[235848]: 2026-01-22 00:23:52.585841929 +0000 UTC m=+0.062921624 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:23:52 compute-1 podman[235847]: 2026-01-22 00:23:52.59680307 +0000 UTC m=+0.088745156 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:23:53 compute-1 nova_compute[182713]: 2026-01-22 00:23:53.082 182717 DEBUG nova.compute.manager [req-5250700c-1c4a-4ea0-b811-1aa11a289da8 req-e158c00f-f38e-4a6d-a092-f4f30d12e389 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:53 compute-1 nova_compute[182713]: 2026-01-22 00:23:53.083 182717 DEBUG oslo_concurrency.lockutils [req-5250700c-1c4a-4ea0-b811-1aa11a289da8 req-e158c00f-f38e-4a6d-a092-f4f30d12e389 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:53 compute-1 nova_compute[182713]: 2026-01-22 00:23:53.084 182717 DEBUG oslo_concurrency.lockutils [req-5250700c-1c4a-4ea0-b811-1aa11a289da8 req-e158c00f-f38e-4a6d-a092-f4f30d12e389 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:53 compute-1 nova_compute[182713]: 2026-01-22 00:23:53.084 182717 DEBUG oslo_concurrency.lockutils [req-5250700c-1c4a-4ea0-b811-1aa11a289da8 req-e158c00f-f38e-4a6d-a092-f4f30d12e389 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:53 compute-1 nova_compute[182713]: 2026-01-22 00:23:53.085 182717 DEBUG nova.compute.manager [req-5250700c-1c4a-4ea0-b811-1aa11a289da8 req-e158c00f-f38e-4a6d-a092-f4f30d12e389 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] No waiting events found dispatching network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:23:53 compute-1 nova_compute[182713]: 2026-01-22 00:23:53.085 182717 WARNING nova.compute.manager [req-5250700c-1c4a-4ea0-b811-1aa11a289da8 req-e158c00f-f38e-4a6d-a092-f4f30d12e389 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received unexpected event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb for instance with vm_state active and task_state None.
Jan 22 00:23:54 compute-1 nova_compute[182713]: 2026-01-22 00:23:54.038 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:54 compute-1 nova_compute[182713]: 2026-01-22 00:23:54.611 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:55 compute-1 nova_compute[182713]: 2026-01-22 00:23:55.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:55 compute-1 nova_compute[182713]: 2026-01-22 00:23:55.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:55 compute-1 nova_compute[182713]: 2026-01-22 00:23:55.881 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:55 compute-1 nova_compute[182713]: 2026-01-22 00:23:55.881 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:55 compute-1 nova_compute[182713]: 2026-01-22 00:23:55.881 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:55 compute-1 nova_compute[182713]: 2026-01-22 00:23:55.882 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:23:55 compute-1 nova_compute[182713]: 2026-01-22 00:23:55.958 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:55 compute-1 podman[235899]: 2026-01-22 00:23:55.97464454 +0000 UTC m=+0.049438835 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:23:55 compute-1 podman[235900]: 2026-01-22 00:23:55.980079339 +0000 UTC m=+0.049824268 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.032 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.034 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.084 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.284 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.285 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5537MB free_disk=73.25963973999023GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.285 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.285 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.556 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance f457f1e4-8770-4c44-a061-214acbc43199 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.557 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.558 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.668 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.733 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.777 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:23:56 compute-1 nova_compute[182713]: 2026-01-22 00:23:56.778 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:57 compute-1 NetworkManager[54952]: <info>  [1769041437.3662] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Jan 22 00:23:57 compute-1 NetworkManager[54952]: <info>  [1769041437.3672] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Jan 22 00:23:57 compute-1 nova_compute[182713]: 2026-01-22 00:23:57.365 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:57 compute-1 nova_compute[182713]: 2026-01-22 00:23:57.513 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:57 compute-1 ovn_controller[94841]: 2026-01-22T00:23:57Z|00626|binding|INFO|Releasing lport 329d205a-2611-48b0-b77c-7d020bb0a3df from this chassis (sb_readonly=0)
Jan 22 00:23:57 compute-1 nova_compute[182713]: 2026-01-22 00:23:57.542 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:58 compute-1 nova_compute[182713]: 2026-01-22 00:23:58.024 182717 DEBUG nova.compute.manager [req-dfb40ac0-affc-4dcb-bf27-1a195413513a req-a73bf5b8-962d-4472-aabc-7ad02c7d4337 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-changed-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:58 compute-1 nova_compute[182713]: 2026-01-22 00:23:58.025 182717 DEBUG nova.compute.manager [req-dfb40ac0-affc-4dcb-bf27-1a195413513a req-a73bf5b8-962d-4472-aabc-7ad02c7d4337 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Refreshing instance network info cache due to event network-changed-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:23:58 compute-1 nova_compute[182713]: 2026-01-22 00:23:58.025 182717 DEBUG oslo_concurrency.lockutils [req-dfb40ac0-affc-4dcb-bf27-1a195413513a req-a73bf5b8-962d-4472-aabc-7ad02c7d4337 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:23:58 compute-1 nova_compute[182713]: 2026-01-22 00:23:58.025 182717 DEBUG oslo_concurrency.lockutils [req-dfb40ac0-affc-4dcb-bf27-1a195413513a req-a73bf5b8-962d-4472-aabc-7ad02c7d4337 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:23:58 compute-1 nova_compute[182713]: 2026-01-22 00:23:58.026 182717 DEBUG nova.network.neutron [req-dfb40ac0-affc-4dcb-bf27-1a195413513a req-a73bf5b8-962d-4472-aabc-7ad02c7d4337 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Refreshing network info cache for port ccbc0d4c-cf7a-4220-a948-0aeade60dbdb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:23:58 compute-1 nova_compute[182713]: 2026-01-22 00:23:58.226 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:58.226 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:23:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:23:58.227 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:23:58 compute-1 nova_compute[182713]: 2026-01-22 00:23:58.778 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:58 compute-1 nova_compute[182713]: 2026-01-22 00:23:58.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:58 compute-1 nova_compute[182713]: 2026-01-22 00:23:58.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:23:58 compute-1 nova_compute[182713]: 2026-01-22 00:23:58.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:23:59 compute-1 nova_compute[182713]: 2026-01-22 00:23:59.040 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:59 compute-1 nova_compute[182713]: 2026-01-22 00:23:59.043 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:23:59 compute-1 nova_compute[182713]: 2026-01-22 00:23:59.613 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:01.231 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:01 compute-1 nova_compute[182713]: 2026-01-22 00:24:01.457 182717 DEBUG nova.network.neutron [req-dfb40ac0-affc-4dcb-bf27-1a195413513a req-a73bf5b8-962d-4472-aabc-7ad02c7d4337 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updated VIF entry in instance network info cache for port ccbc0d4c-cf7a-4220-a948-0aeade60dbdb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:24:01 compute-1 nova_compute[182713]: 2026-01-22 00:24:01.458 182717 DEBUG nova.network.neutron [req-dfb40ac0-affc-4dcb-bf27-1a195413513a req-a73bf5b8-962d-4472-aabc-7ad02c7d4337 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updating instance_info_cache with network_info: [{"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:24:01 compute-1 nova_compute[182713]: 2026-01-22 00:24:01.479 182717 DEBUG oslo_concurrency.lockutils [req-dfb40ac0-affc-4dcb-bf27-1a195413513a req-a73bf5b8-962d-4472-aabc-7ad02c7d4337 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:24:01 compute-1 nova_compute[182713]: 2026-01-22 00:24:01.482 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:24:01 compute-1 nova_compute[182713]: 2026-01-22 00:24:01.483 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:24:01 compute-1 nova_compute[182713]: 2026-01-22 00:24:01.483 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid f457f1e4-8770-4c44-a061-214acbc43199 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:24:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:03.033 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:03.034 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:03.035 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:03 compute-1 ovn_controller[94841]: 2026-01-22T00:24:03Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:d8:57 10.100.0.12
Jan 22 00:24:03 compute-1 ovn_controller[94841]: 2026-01-22T00:24:03Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:d8:57 10.100.0.12
Jan 22 00:24:04 compute-1 nova_compute[182713]: 2026-01-22 00:24:04.043 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:04 compute-1 nova_compute[182713]: 2026-01-22 00:24:04.614 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:04 compute-1 nova_compute[182713]: 2026-01-22 00:24:04.772 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updating instance_info_cache with network_info: [{"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:24:04 compute-1 nova_compute[182713]: 2026-01-22 00:24:04.792 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:24:04 compute-1 nova_compute[182713]: 2026-01-22 00:24:04.793 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:24:04 compute-1 nova_compute[182713]: 2026-01-22 00:24:04.794 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:08 compute-1 podman[235962]: 2026-01-22 00:24:08.60106454 +0000 UTC m=+0.080642514 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:24:09 compute-1 nova_compute[182713]: 2026-01-22 00:24:09.046 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:09 compute-1 nova_compute[182713]: 2026-01-22 00:24:09.163 182717 INFO nova.compute.manager [None req-e33ab090-db42-4927-b6bf-3908887f8373 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Get console output
Jan 22 00:24:09 compute-1 nova_compute[182713]: 2026-01-22 00:24:09.171 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:24:09 compute-1 nova_compute[182713]: 2026-01-22 00:24:09.399 182717 DEBUG oslo_concurrency.lockutils [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:09 compute-1 nova_compute[182713]: 2026-01-22 00:24:09.399 182717 DEBUG oslo_concurrency.lockutils [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:09 compute-1 nova_compute[182713]: 2026-01-22 00:24:09.400 182717 INFO nova.compute.manager [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Rebooting instance
Jan 22 00:24:09 compute-1 nova_compute[182713]: 2026-01-22 00:24:09.414 182717 DEBUG oslo_concurrency.lockutils [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:24:09 compute-1 nova_compute[182713]: 2026-01-22 00:24:09.415 182717 DEBUG oslo_concurrency.lockutils [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:24:09 compute-1 nova_compute[182713]: 2026-01-22 00:24:09.415 182717 DEBUG nova.network.neutron [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:24:09 compute-1 nova_compute[182713]: 2026-01-22 00:24:09.617 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:11 compute-1 ovn_controller[94841]: 2026-01-22T00:24:11Z|00627|binding|INFO|Releasing lport 329d205a-2611-48b0-b77c-7d020bb0a3df from this chassis (sb_readonly=0)
Jan 22 00:24:11 compute-1 nova_compute[182713]: 2026-01-22 00:24:11.128 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:11 compute-1 nova_compute[182713]: 2026-01-22 00:24:11.518 182717 DEBUG nova.network.neutron [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updating instance_info_cache with network_info: [{"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:24:11 compute-1 nova_compute[182713]: 2026-01-22 00:24:11.546 182717 DEBUG oslo_concurrency.lockutils [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:24:11 compute-1 nova_compute[182713]: 2026-01-22 00:24:11.558 182717 DEBUG nova.compute.manager [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:24:12 compute-1 podman[235983]: 2026-01-22 00:24:12.613571754 +0000 UTC m=+0.104249998 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 00:24:13 compute-1 kernel: tapccbc0d4c-cf (unregistering): left promiscuous mode
Jan 22 00:24:13 compute-1 NetworkManager[54952]: <info>  [1769041453.9094] device (tapccbc0d4c-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:24:13 compute-1 ovn_controller[94841]: 2026-01-22T00:24:13Z|00628|binding|INFO|Releasing lport ccbc0d4c-cf7a-4220-a948-0aeade60dbdb from this chassis (sb_readonly=0)
Jan 22 00:24:13 compute-1 ovn_controller[94841]: 2026-01-22T00:24:13Z|00629|binding|INFO|Setting lport ccbc0d4c-cf7a-4220-a948-0aeade60dbdb down in Southbound
Jan 22 00:24:13 compute-1 ovn_controller[94841]: 2026-01-22T00:24:13Z|00630|binding|INFO|Removing iface tapccbc0d4c-cf ovn-installed in OVS
Jan 22 00:24:13 compute-1 nova_compute[182713]: 2026-01-22 00:24:13.970 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:13 compute-1 nova_compute[182713]: 2026-01-22 00:24:13.973 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:13.979 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:d8:57 10.100.0.12'], port_security=['fa:16:3e:17:d8:57 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c9ca07a4-cd9c-4730-b243-d5bdfe31822a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c6698c9-b140-4a4b-89f4-0ea800814cda, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:24:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:13.981 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ccbc0d4c-cf7a-4220-a948-0aeade60dbdb in datapath c72b0076-9848-49ed-9b2e-d2fe36ac5e52 unbound from our chassis
Jan 22 00:24:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:13.983 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c72b0076-9848-49ed-9b2e-d2fe36ac5e52, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:24:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:13.987 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[591b2246-6fc5-4616-932c-3ed1f0ce8e47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:13.990 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52 namespace which is not needed anymore
Jan 22 00:24:13 compute-1 nova_compute[182713]: 2026-01-22 00:24:13.990 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:14 compute-1 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000009f.scope: Deactivated successfully.
Jan 22 00:24:14 compute-1 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000009f.scope: Consumed 13.599s CPU time.
Jan 22 00:24:14 compute-1 systemd-machined[153970]: Machine qemu-68-instance-0000009f terminated.
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.048 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.619 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:14 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[235832]: [NOTICE]   (235836) : haproxy version is 2.8.14-c23fe91
Jan 22 00:24:14 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[235832]: [NOTICE]   (235836) : path to executable is /usr/sbin/haproxy
Jan 22 00:24:14 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[235832]: [WARNING]  (235836) : Exiting Master process...
Jan 22 00:24:14 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[235832]: [ALERT]    (235836) : Current worker (235838) exited with code 143 (Terminated)
Jan 22 00:24:14 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[235832]: [WARNING]  (235836) : All workers exited. Exiting... (0)
Jan 22 00:24:14 compute-1 systemd[1]: libpod-cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470.scope: Deactivated successfully.
Jan 22 00:24:14 compute-1 podman[236028]: 2026-01-22 00:24:14.656544515 +0000 UTC m=+0.561654597 container died cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.710 182717 DEBUG nova.compute.manager [req-6c7dd51e-c3af-4d5f-acb9-478e30b2f5e3 req-27a43912-90cf-4bd3-a2e8-20efd5276e93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-vif-unplugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.711 182717 DEBUG oslo_concurrency.lockutils [req-6c7dd51e-c3af-4d5f-acb9-478e30b2f5e3 req-27a43912-90cf-4bd3-a2e8-20efd5276e93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.711 182717 DEBUG oslo_concurrency.lockutils [req-6c7dd51e-c3af-4d5f-acb9-478e30b2f5e3 req-27a43912-90cf-4bd3-a2e8-20efd5276e93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.712 182717 DEBUG oslo_concurrency.lockutils [req-6c7dd51e-c3af-4d5f-acb9-478e30b2f5e3 req-27a43912-90cf-4bd3-a2e8-20efd5276e93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.712 182717 DEBUG nova.compute.manager [req-6c7dd51e-c3af-4d5f-acb9-478e30b2f5e3 req-27a43912-90cf-4bd3-a2e8-20efd5276e93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] No waiting events found dispatching network-vif-unplugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.713 182717 WARNING nova.compute.manager [req-6c7dd51e-c3af-4d5f-acb9-478e30b2f5e3 req-27a43912-90cf-4bd3-a2e8-20efd5276e93 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received unexpected event network-vif-unplugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb for instance with vm_state active and task_state reboot_started.
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.744 182717 INFO nova.virt.libvirt.driver [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Instance shutdown successfully.
Jan 22 00:24:14 compute-1 kernel: tapccbc0d4c-cf: entered promiscuous mode
Jan 22 00:24:14 compute-1 NetworkManager[54952]: <info>  [1769041454.8291] manager: (tapccbc0d4c-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Jan 22 00:24:14 compute-1 systemd-udevd[236007]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:24:14 compute-1 ovn_controller[94841]: 2026-01-22T00:24:14Z|00631|binding|INFO|Claiming lport ccbc0d4c-cf7a-4220-a948-0aeade60dbdb for this chassis.
Jan 22 00:24:14 compute-1 ovn_controller[94841]: 2026-01-22T00:24:14Z|00632|binding|INFO|ccbc0d4c-cf7a-4220-a948-0aeade60dbdb: Claiming fa:16:3e:17:d8:57 10.100.0.12
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.831 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:14.842 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:d8:57 10.100.0.12'], port_security=['fa:16:3e:17:d8:57 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c9ca07a4-cd9c-4730-b243-d5bdfe31822a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c6698c9-b140-4a4b-89f4-0ea800814cda, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:24:14 compute-1 ovn_controller[94841]: 2026-01-22T00:24:14Z|00633|binding|INFO|Setting lport ccbc0d4c-cf7a-4220-a948-0aeade60dbdb ovn-installed in OVS
Jan 22 00:24:14 compute-1 ovn_controller[94841]: 2026-01-22T00:24:14Z|00634|binding|INFO|Setting lport ccbc0d4c-cf7a-4220-a948-0aeade60dbdb up in Southbound
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.846 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:14 compute-1 nova_compute[182713]: 2026-01-22 00:24:14.848 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:14 compute-1 NetworkManager[54952]: <info>  [1769041454.8524] device (tapccbc0d4c-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:24:14 compute-1 NetworkManager[54952]: <info>  [1769041454.8540] device (tapccbc0d4c-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:24:14 compute-1 systemd-machined[153970]: New machine qemu-69-instance-0000009f.
Jan 22 00:24:14 compute-1 systemd[1]: Started Virtual Machine qemu-69-instance-0000009f.
Jan 22 00:24:15 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470-userdata-shm.mount: Deactivated successfully.
Jan 22 00:24:15 compute-1 systemd[1]: var-lib-containers-storage-overlay-a8056c9ce0a71047f425ec98124f461b802c446dcb84d58d095fdf2605efca3e-merged.mount: Deactivated successfully.
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.165 182717 DEBUG nova.virt.libvirt.host [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Removed pending event for f457f1e4-8770-4c44-a061-214acbc43199 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.168 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041455.1646693, f457f1e4-8770-4c44-a061-214acbc43199 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.168 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] VM Resumed (Lifecycle Event)
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.175 182717 INFO nova.virt.libvirt.driver [-] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Instance running successfully.
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.176 182717 INFO nova.virt.libvirt.driver [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Instance soft rebooted successfully.
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.176 182717 DEBUG nova.compute.manager [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.210 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.214 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.244 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] During sync_power_state the instance has a pending task (reboot_started). Skip.
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.245 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041455.167152, f457f1e4-8770-4c44-a061-214acbc43199 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.245 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] VM Started (Lifecycle Event)
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.277 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.282 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.291 182717 DEBUG oslo_concurrency.lockutils [None req-4508f7df-6182-4d89-a2a6-4899a2e2c60b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 5.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:15 compute-1 podman[236028]: 2026-01-22 00:24:15.447567312 +0000 UTC m=+1.352677404 container cleanup cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:24:15 compute-1 systemd[1]: libpod-conmon-cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470.scope: Deactivated successfully.
Jan 22 00:24:15 compute-1 podman[236104]: 2026-01-22 00:24:15.641720138 +0000 UTC m=+0.164053053 container remove cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.648 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6137e1-8c24-4918-ba51-394b2117d862]: (4, ('Thu Jan 22 12:24:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52 (cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470)\ncec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470\nThu Jan 22 12:24:15 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52 (cec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470)\ncec1f8f9da1d517a7878b397e2dfcdac2e6a42554f693777b2806fcace373470\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.651 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[23936572-af50-4aa9-bb9a-51a34ed46552]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.654 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc72b0076-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.656 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:15 compute-1 kernel: tapc72b0076-90: left promiscuous mode
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.664 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.666 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bdbf8e16-8ed7-4d92-a955-d27ec6c961cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.689 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.705 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[03d30afa-6ff0-4806-a4de-86e704dca690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.706 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[476a72ad-a918-48a1-b0ae-d1a59e5ee710]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.727 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f9e16b-2515-4761-a536-3bafa1d2dc72]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600303, 'reachable_time': 42875, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236119, 'error': None, 'target': 'ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 systemd[1]: run-netns-ovnmeta\x2dc72b0076\x2d9848\x2d49ed\x2d9b2e\x2dd2fe36ac5e52.mount: Deactivated successfully.
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.731 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.731 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[285f594d-08f6-4704-a47e-5aea9c905704]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.734 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ccbc0d4c-cf7a-4220-a948-0aeade60dbdb in datapath c72b0076-9848-49ed-9b2e-d2fe36ac5e52 unbound from our chassis
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.736 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c72b0076-9848-49ed-9b2e-d2fe36ac5e52
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.752 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[234939d5-b809-4328-95e0-28393909dc2c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.753 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc72b0076-91 in ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.755 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc72b0076-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.755 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[dcaeb3c9-c9fc-4903-93f6-871bcb82f976]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.756 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d895a1f1-81d9-4038-a879-3c2f887448f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.775 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[5379f4eb-de78-446b-9ca7-7b9e30e37be3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.806 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[96755545-13e2-4159-9041-495dabf3798b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.858 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b65b4c8a-7c7f-49c7-adbc-8b50a8beb83c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.877 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[99ae86bf-ccbb-4060-a93a-080355c412f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 NetworkManager[54952]: <info>  [1769041455.8784] manager: (tapc72b0076-90): new Veth device (/org/freedesktop/NetworkManager/Devices/290)
Jan 22 00:24:15 compute-1 nova_compute[182713]: 2026-01-22 00:24:15.880 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.920 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[bf193339-b834-4142-903d-27405283411b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.925 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[70256e17-00ca-492e-9c11-1aa4218d48e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 NetworkManager[54952]: <info>  [1769041455.9535] device (tapc72b0076-90): carrier: link connected
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.960 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa32376-48cc-47c0-83cb-63ac35b019f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:15.982 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[108623d8-1242-4525-afe6-9787bcffa29d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc72b0076-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:a4:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602860, 'reachable_time': 25708, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236144, 'error': None, 'target': 'ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.002 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[93d6b1c2-71d9-4d2e-bee5-18c77c997dba]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe07:a4d3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602860, 'tstamp': 602860}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236145, 'error': None, 'target': 'ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.025 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[13b62c20-048e-4d95-862a-d10b4d63b36b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc72b0076-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:a4:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602860, 'reachable_time': 25708, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236146, 'error': None, 'target': 'ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.062 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[289f0563-a2bf-49bc-abfe-47dadb01257f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.145 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e43fdc5a-613d-4f47-9530-6487999166c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.147 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc72b0076-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.147 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.147 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc72b0076-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.189 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:16 compute-1 kernel: tapc72b0076-90: entered promiscuous mode
Jan 22 00:24:16 compute-1 NetworkManager[54952]: <info>  [1769041456.1917] manager: (tapc72b0076-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.191 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.193 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc72b0076-90, col_values=(('external_ids', {'iface-id': '329d205a-2611-48b0-b77c-7d020bb0a3df'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:16 compute-1 ovn_controller[94841]: 2026-01-22T00:24:16Z|00635|binding|INFO|Releasing lport 329d205a-2611-48b0-b77c-7d020bb0a3df from this chassis (sb_readonly=0)
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.194 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.216 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.218 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c72b0076-9848-49ed-9b2e-d2fe36ac5e52.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c72b0076-9848-49ed-9b2e-d2fe36ac5e52.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.218 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[74999e41-7bc8-4750-a9ac-c55185387eb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.219 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-c72b0076-9848-49ed-9b2e-d2fe36ac5e52
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/c72b0076-9848-49ed-9b2e-d2fe36ac5e52.pid.haproxy
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID c72b0076-9848-49ed-9b2e-d2fe36ac5e52
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:24:16 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:16.220 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'env', 'PROCESS_TAG=haproxy-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c72b0076-9848-49ed-9b2e-d2fe36ac5e52.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:24:16 compute-1 podman[236179]: 2026-01-22 00:24:16.572488653 +0000 UTC m=+0.058062794 container create 48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:24:16 compute-1 systemd[1]: Started libpod-conmon-48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46.scope.
Jan 22 00:24:16 compute-1 podman[236179]: 2026-01-22 00:24:16.540955934 +0000 UTC m=+0.026530095 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:24:16 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:24:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72b8be811844df3e6bb11b4e4de8b0a0f3181aa6336177228387e22a38071bcf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:24:16 compute-1 podman[236179]: 2026-01-22 00:24:16.657291375 +0000 UTC m=+0.142865536 container init 48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 00:24:16 compute-1 podman[236179]: 2026-01-22 00:24:16.662448096 +0000 UTC m=+0.148022237 container start 48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 00:24:16 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[236195]: [NOTICE]   (236199) : New worker (236201) forked
Jan 22 00:24:16 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[236195]: [NOTICE]   (236199) : Loading success.
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.833 182717 DEBUG nova.compute.manager [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.833 182717 DEBUG oslo_concurrency.lockutils [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.834 182717 DEBUG oslo_concurrency.lockutils [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.834 182717 DEBUG oslo_concurrency.lockutils [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.834 182717 DEBUG nova.compute.manager [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] No waiting events found dispatching network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.834 182717 WARNING nova.compute.manager [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received unexpected event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb for instance with vm_state active and task_state None.
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.835 182717 DEBUG nova.compute.manager [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.835 182717 DEBUG oslo_concurrency.lockutils [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.835 182717 DEBUG oslo_concurrency.lockutils [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.835 182717 DEBUG oslo_concurrency.lockutils [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.836 182717 DEBUG nova.compute.manager [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] No waiting events found dispatching network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.836 182717 WARNING nova.compute.manager [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received unexpected event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb for instance with vm_state active and task_state None.
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.836 182717 DEBUG nova.compute.manager [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.836 182717 DEBUG oslo_concurrency.lockutils [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.836 182717 DEBUG oslo_concurrency.lockutils [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.837 182717 DEBUG oslo_concurrency.lockutils [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.837 182717 DEBUG nova.compute.manager [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] No waiting events found dispatching network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:24:16 compute-1 nova_compute[182713]: 2026-01-22 00:24:16.837 182717 WARNING nova.compute.manager [req-554e293e-2860-48a8-8ec6-56ec6d219dae req-69fa6a5a-0fbe-4692-ae4b-81d89de47012 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received unexpected event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb for instance with vm_state active and task_state None.
Jan 22 00:24:19 compute-1 nova_compute[182713]: 2026-01-22 00:24:19.050 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:19 compute-1 nova_compute[182713]: 2026-01-22 00:24:19.622 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:22 compute-1 nova_compute[182713]: 2026-01-22 00:24:22.765 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.885 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'adb1305c8f874f2684e845e88fd95ffe', 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'hostId': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.891 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f457f1e4-8770-4c44-a061-214acbc43199 / tapccbc0d4c-cf inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.891 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd3ab5c0-f269-4548-a56b-194c10817034', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-0000009f-f457f1e4-8770-4c44-a061-214acbc43199-tapccbc0d4c-cf', 'timestamp': '2026-01-22T00:24:22.887585', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'tapccbc0d4c-cf', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:d8:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccbc0d4c-cf'}, 'message_id': 'b3642542-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.594883848, 'message_signature': '59890f0ff80fdf4ae2b1b51c33d02caf2cbbaad15ec3ec5b16a17736af1cc0ab'}]}, 'timestamp': '2026-01-22 00:24:22.892794', '_unique_id': '1267238f265b490abe37c949d8c82511'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.895 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:24:22 compute-1 podman[236211]: 2026-01-22 00:24:22.944458702 +0000 UTC m=+0.123297709 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.948 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.948 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22d0cdda-c128-4a4b-b5f7-3cc1c290ac09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-vda', 'timestamp': '2026-01-22T00:24:22.897741', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b36cc09e-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.604982901, 'message_signature': '6292e7388637ee455b1ae2d69cd3f7285101ed6ad9bd4f01db408867025999cf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-sda', 'timestamp': '2026-01-22T00:24:22.897741', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b36ccfc6-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.604982901, 'message_signature': 'c87b2d18278a2f80006ef71d545cf6e430270e68c10e436965e2cbd22a5a48a5'}]}, 'timestamp': '2026-01-22 00:24:22.949179', '_unique_id': 'd52208957d794e89858577e010770437'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.950 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.951 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.951 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.951 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1667524508>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1667524508>]
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.952 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:24:22 compute-1 podman[236210]: 2026-01-22 00:24:22.96178671 +0000 UTC m=+0.154879529 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.986 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/cpu volume: 7510000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4038038d-f3ee-45ed-b337-79b444d73198', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7510000000, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'timestamp': '2026-01-22T00:24:22.952267', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b3728b28-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.693132138, 'message_signature': '16798a6f86a39891c8507036090d4c54c89e66cd9131e386ef4c9476cfc28afd'}]}, 'timestamp': '2026-01-22 00:24:22.986885', '_unique_id': '0d5a566f5753471388567bc72e51da43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.987 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:22.989 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.027 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.028 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f07b1fac-bb84-47a8-923a-b183c6e2f62a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-vda', 'timestamp': '2026-01-22T00:24:22.989160', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b378d91a-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': '02a9b1b33b6031d5699e9474dbddb5f2cd3b8237ade3eea62189a72461762cc9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-sda', 'timestamp': '2026-01-22T00:24:22.989160', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b378ec52-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': 'e5b0beb3e278ca4517be2c512d3e1d62f2ed1fa920fb5094a9b5d03959e8ba2f'}]}, 'timestamp': '2026-01-22 00:24:23.028605', '_unique_id': 'b7a845f31a3e4c178c6ddd9537809df8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.030 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.032 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.033 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.033 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1667524508>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1667524508>]
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.033 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.033 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4716c195-9c47-40bd-8767-d5fe1fd07c90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-0000009f-f457f1e4-8770-4c44-a061-214acbc43199-tapccbc0d4c-cf', 'timestamp': '2026-01-22T00:24:23.033707', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'tapccbc0d4c-cf', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:d8:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccbc0d4c-cf'}, 'message_id': 'b379c6cc-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.594883848, 'message_signature': 'b4f8059d0c8eab3f7ac6bd10e76e746b0e9c3e8c177127b2dba373e96613ef70'}]}, 'timestamp': '2026-01-22 00:24:23.034244', '_unique_id': '1bf8f5ff355149c5950fa916e5c1b6cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.037 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2fa0d53a-61be-497e-b0e0-d48b219f88f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-0000009f-f457f1e4-8770-4c44-a061-214acbc43199-tapccbc0d4c-cf', 'timestamp': '2026-01-22T00:24:23.037034', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'tapccbc0d4c-cf', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:d8:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccbc0d4c-cf'}, 'message_id': 'b37a464c-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.594883848, 'message_signature': 'c9229e6ee8384afe7fb92960f67c28942eed329f8b3bf8035fa4b061a8c0c339'}]}, 'timestamp': '2026-01-22 00:24:23.037502', '_unique_id': '272b1c1220054649917ff3815970cb78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.038 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.039 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.039 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.040 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a332177-0907-4b16-8417-2e69b9a426db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-vda', 'timestamp': '2026-01-22T00:24:23.039749', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b37aaee8-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': '500f2d504131bec29864ec1000edbcec1da58417420a1b2a9cfa2167973e0c36'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-sda', 'timestamp': '2026-01-22T00:24:23.039749', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b37abb04-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': 'e6bfac08194b98299b2dc7b506d6a4c8ab49d47fac1d4465412eac1bbafd929d'}]}, 'timestamp': '2026-01-22 00:24:23.040409', '_unique_id': 'e9fd1fde7df64c1c94d37a85e0aab440'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.041 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.042 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.042 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.042 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b45f2b30-0f4e-4e49-aabb-6f61337e4561', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-vda', 'timestamp': '2026-01-22T00:24:23.042427', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b37b1658-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': 'ff1bc92ab8b2b93f5ed3285ca0e0c24ef280e7e7da95caee57cd4a7247d17399'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-sda', 'timestamp': '2026-01-22T00:24:23.042427', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b37b2210-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': '09b0e4df215edf6ec64933b3aeaf65ba56c65a3d7015932295ffaa32db26abbe'}]}, 'timestamp': '2026-01-22 00:24:23.043049', '_unique_id': '6e42f5867b524b249ef56adc01ff3743'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.043 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a9d5e21-53ff-4d96-80d4-e4e49c8c29ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-0000009f-f457f1e4-8770-4c44-a061-214acbc43199-tapccbc0d4c-cf', 'timestamp': '2026-01-22T00:24:23.045117', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'tapccbc0d4c-cf', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:d8:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccbc0d4c-cf'}, 'message_id': 'b37b7ecc-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.594883848, 'message_signature': 'f6a93c8fa2f1375bd1f26c05d06c57c4f9ac97a6c4c5d7587df50d69f8a48979'}]}, 'timestamp': '2026-01-22 00:24:23.045411', '_unique_id': '3066fd9441cc40aa84b36629a0fa5d8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.045 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.046 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.047 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.047 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance f457f1e4-8770-4c44-a061-214acbc43199: ceilometer.compute.pollsters.NoVolumeException
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.047 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.047 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cee18bb6-a446-446c-8c8c-e75fb0e9ab4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-0000009f-f457f1e4-8770-4c44-a061-214acbc43199-tapccbc0d4c-cf', 'timestamp': '2026-01-22T00:24:23.047423', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'tapccbc0d4c-cf', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:d8:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccbc0d4c-cf'}, 'message_id': 'b37bd93a-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.594883848, 'message_signature': 'bfa7f34362f5886e9aa9e62aaf4c49f7e0bb2337132f08aba11677b0db5c81eb'}]}, 'timestamp': '2026-01-22 00:24:23.047743', '_unique_id': 'dadd3b9d94a74f50ad4303bf9c56e59a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.048 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.049 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.049 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6dd6203e-77ce-4913-86a7-24e0eed8350d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-0000009f-f457f1e4-8770-4c44-a061-214acbc43199-tapccbc0d4c-cf', 'timestamp': '2026-01-22T00:24:23.049362', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'tapccbc0d4c-cf', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:d8:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccbc0d4c-cf'}, 'message_id': 'b37c2444-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.594883848, 'message_signature': 'b1d472bab2fab08ea4c32a2562e8423d69939236f91f2fc796cb628d2af7b3ae'}]}, 'timestamp': '2026-01-22 00:24:23.049647', '_unique_id': 'efb7b6be912b436b9dfeff0d3f00bad7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.050 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.051 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.051 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fa58e76-da3a-4ca5-a2a9-7a8659578d37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-vda', 'timestamp': '2026-01-22T00:24:23.051196', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b37c6cce-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.604982901, 'message_signature': '8afa9da59b03f76ebbeb2a9014f369672e18972963d9c31b17b3c43e7d8adfcf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-sda', 'timestamp': '2026-01-22T00:24:23.051196', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b37c785e-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.604982901, 'message_signature': '394cceb29669b681b5328eef5d13bd8c0891ff796ce3c7b9654f8b105cd022e4'}]}, 'timestamp': '2026-01-22 00:24:23.051809', '_unique_id': '3a9218966d2e4ed288151c634cd2610c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.052 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.053 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.053 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.read.latency volume: 193277052 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.053 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.read.latency volume: 18894616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd2fa355-e715-420e-9cd8-f33836f0adfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 193277052, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-vda', 'timestamp': '2026-01-22T00:24:23.053566', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b37cc89a-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': '76511163f7c104d7ee6ff8445d6ec88363a6a7e9b1e0a6dd4f557934d32a671a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18894616, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 
'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-sda', 'timestamp': '2026-01-22T00:24:23.053566', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b37cd33a-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': '42bae973618a1df379b49a01fc6fee549dbad3fdaa53f92fe1538c63ce6c1ebb'}]}, 'timestamp': '2026-01-22 00:24:23.054135', '_unique_id': '1580a5f672054a75932b51c2029e79d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.054 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.055 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.055 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80754e67-ffe6-4d37-af1d-b04136cc862e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-0000009f-f457f1e4-8770-4c44-a061-214acbc43199-tapccbc0d4c-cf', 'timestamp': '2026-01-22T00:24:23.055740', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'tapccbc0d4c-cf', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:d8:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccbc0d4c-cf'}, 'message_id': 'b37d1e58-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.594883848, 'message_signature': '482ff970411e0ebbf246e904f1d519bed25aa1f0742b24dc0ecf44d635c72b4a'}]}, 'timestamp': '2026-01-22 00:24:23.056045', '_unique_id': 'e9f8624255164824a008f55577403e93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.056 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.057 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.057 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.057 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7aa24f1b-8c5e-4cdb-aedd-e9e6699fe352', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-vda', 'timestamp': '2026-01-22T00:24:23.057522', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b37d6282-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.604982901, 'message_signature': 'b844c1cb4760f68ed65312a3249d29fe0edf2079f45661a4ba72f17124483c36'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-sda', 'timestamp': '2026-01-22T00:24:23.057522', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b37d6cd2-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.604982901, 'message_signature': '1d9698c9e3886e3a1ad6bf8a3cfa4afbfedaeee2ef569c7e501907e473e7e9d9'}]}, 'timestamp': '2026-01-22 00:24:23.058039', '_unique_id': '7ead4c0c203440309c6d87995e40b75c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.058 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.059 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.059 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '319c4869-28cf-4055-b1a3-3b40451d33b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-0000009f-f457f1e4-8770-4c44-a061-214acbc43199-tapccbc0d4c-cf', 'timestamp': '2026-01-22T00:24:23.059583', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'tapccbc0d4c-cf', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:d8:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccbc0d4c-cf'}, 'message_id': 'b37db4da-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.594883848, 'message_signature': '36bd23cf20f192eca4ad15e4f6b85f8485ca3788d371578a52ad3b03f9996704'}]}, 'timestamp': '2026-01-22 00:24:23.059963', '_unique_id': '66896f14cc5d4f3aad5df2a748ebc929'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.060 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.061 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.061 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.061 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1667524508>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1667524508>]
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.062 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.062 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.062 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6aaf976b-f638-4e66-969e-3d70fa084e57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-vda', 'timestamp': '2026-01-22T00:24:23.062319', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b37e1f42-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': '7b89e721993060b2641971126e9700774f34722f578a4a43854b4d1aa7699298'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-sda', 'timestamp': '2026-01-22T00:24:23.062319', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b37e2b22-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': 'f788d72f1d89e1e2e787a1138a2549b8e4834446855c74389102529eca7ab6f8'}]}, 'timestamp': '2026-01-22 00:24:23.062962', '_unique_id': '61a9a93ce4fd4f19b59789d68b4d5614'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.063 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.064 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.064 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.064 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e263e2c7-f691-471e-b874-4df1a6c5ef8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-vda', 'timestamp': '2026-01-22T00:24:23.064651', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b37e7a46-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': 'fe53a51b4856915bb5fc1099415785f30c8827fea6af9dad5a766e0eedc9e058'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f457f1e4-8770-4c44-a061-214acbc43199-sda', 'timestamp': '2026-01-22T00:24:23.064651', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'instance-0000009f', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b37e8676-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.696378479, 'message_signature': '4accecccb562a9ffda29d99b15755f6ea342f7f663683fc479f7b17b182f03df'}]}, 'timestamp': '2026-01-22 00:24:23.065253', '_unique_id': '0204eda0401a4678afba6d8aa58e3900'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.065 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.066 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed98e474-f8e7-4ac1-bf81-bad0015af057', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-0000009f-f457f1e4-8770-4c44-a061-214acbc43199-tapccbc0d4c-cf', 'timestamp': '2026-01-22T00:24:23.066741', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'tapccbc0d4c-cf', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:d8:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccbc0d4c-cf'}, 'message_id': 'b37ece06-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.594883848, 'message_signature': '2c47ae00c094f8bae9ad2254e6bf26bc361799676350fa129a06d1ac440b4eb9'}]}, 'timestamp': '2026-01-22 00:24:23.067129', '_unique_id': 'e11e1b23ca2e490bac114bc8266b0fa5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.067 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.068 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.068 12 DEBUG ceilometer.compute.pollsters [-] f457f1e4-8770-4c44-a061-214acbc43199/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b1dc3d2-8e99-4d5d-8a26-ef2cc9d16670', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-0000009f-f457f1e4-8770-4c44-a061-214acbc43199-tapccbc0d4c-cf', 'timestamp': '2026-01-22T00:24:23.068788', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1667524508', 'name': 'tapccbc0d4c-cf', 'instance_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'instance_type': 'm1.nano', 'host': 'b6528ac1376be324f8d5d15004a450d4a2f6f4430ec1e55f06f8eb97', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:d8:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccbc0d4c-cf'}, 'message_id': 'b37f1bcc-f728-11f0-a0a4-fa163e934844', 'monotonic_time': 6035.594883848, 'message_signature': 'e7d2e4ffc92352082ce2ccc81d3fadc7550be8137b0784949bd41a31fc80f78f'}]}, 'timestamp': '2026-01-22 00:24:23.069084', '_unique_id': 'f813fad377fd4dc6950b1b2de25f7b6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.069 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.070 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.070 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:24:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:24:23.070 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1667524508>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1667524508>]
Jan 22 00:24:24 compute-1 nova_compute[182713]: 2026-01-22 00:24:24.054 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:24 compute-1 nova_compute[182713]: 2026-01-22 00:24:24.625 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:26 compute-1 podman[236263]: 2026-01-22 00:24:26.589370152 +0000 UTC m=+0.066071752 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:24:26 compute-1 podman[236262]: 2026-01-22 00:24:26.603870463 +0000 UTC m=+0.076344972 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 00:24:28 compute-1 ovn_controller[94841]: 2026-01-22T00:24:28Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:d8:57 10.100.0.12
Jan 22 00:24:29 compute-1 nova_compute[182713]: 2026-01-22 00:24:29.055 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:29 compute-1 nova_compute[182713]: 2026-01-22 00:24:29.628 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:34 compute-1 nova_compute[182713]: 2026-01-22 00:24:34.058 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:34 compute-1 nova_compute[182713]: 2026-01-22 00:24:34.630 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:36 compute-1 nova_compute[182713]: 2026-01-22 00:24:36.640 182717 INFO nova.compute.manager [None req-b59fe571-1d42-4373-b9cb-ef5aefe4c062 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Get console output
Jan 22 00:24:36 compute-1 nova_compute[182713]: 2026-01-22 00:24:36.647 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:24:39 compute-1 nova_compute[182713]: 2026-01-22 00:24:39.060 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:39 compute-1 podman[236304]: 2026-01-22 00:24:39.580660587 +0000 UTC m=+0.074577166 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 00:24:39 compute-1 nova_compute[182713]: 2026-01-22 00:24:39.633 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:39 compute-1 nova_compute[182713]: 2026-01-22 00:24:39.834 182717 DEBUG nova.compute.manager [req-fc583c92-4c57-4edb-91c7-352a83ddb950 req-b6f3c394-8120-46d0-ae7f-42a89927c8cc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-changed-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:39 compute-1 nova_compute[182713]: 2026-01-22 00:24:39.835 182717 DEBUG nova.compute.manager [req-fc583c92-4c57-4edb-91c7-352a83ddb950 req-b6f3c394-8120-46d0-ae7f-42a89927c8cc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Refreshing instance network info cache due to event network-changed-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:24:39 compute-1 nova_compute[182713]: 2026-01-22 00:24:39.836 182717 DEBUG oslo_concurrency.lockutils [req-fc583c92-4c57-4edb-91c7-352a83ddb950 req-b6f3c394-8120-46d0-ae7f-42a89927c8cc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:24:39 compute-1 nova_compute[182713]: 2026-01-22 00:24:39.836 182717 DEBUG oslo_concurrency.lockutils [req-fc583c92-4c57-4edb-91c7-352a83ddb950 req-b6f3c394-8120-46d0-ae7f-42a89927c8cc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:24:39 compute-1 nova_compute[182713]: 2026-01-22 00:24:39.836 182717 DEBUG nova.network.neutron [req-fc583c92-4c57-4edb-91c7-352a83ddb950 req-b6f3c394-8120-46d0-ae7f-42a89927c8cc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Refreshing network info cache for port ccbc0d4c-cf7a-4220-a948-0aeade60dbdb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:24:41 compute-1 nova_compute[182713]: 2026-01-22 00:24:41.804 182717 DEBUG nova.network.neutron [req-fc583c92-4c57-4edb-91c7-352a83ddb950 req-b6f3c394-8120-46d0-ae7f-42a89927c8cc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updated VIF entry in instance network info cache for port ccbc0d4c-cf7a-4220-a948-0aeade60dbdb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:24:41 compute-1 nova_compute[182713]: 2026-01-22 00:24:41.805 182717 DEBUG nova.network.neutron [req-fc583c92-4c57-4edb-91c7-352a83ddb950 req-b6f3c394-8120-46d0-ae7f-42a89927c8cc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updating instance_info_cache with network_info: [{"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.370 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.373 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.402 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.414 182717 DEBUG oslo_concurrency.lockutils [req-fc583c92-4c57-4edb-91c7-352a83ddb950 req-b6f3c394-8120-46d0-ae7f-42a89927c8cc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f457f1e4-8770-4c44-a061-214acbc43199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.497 182717 DEBUG oslo_concurrency.lockutils [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.497 182717 DEBUG oslo_concurrency.lockutils [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.498 182717 DEBUG oslo_concurrency.lockutils [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.499 182717 DEBUG oslo_concurrency.lockutils [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.499 182717 DEBUG oslo_concurrency.lockutils [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.519 182717 INFO nova.compute.manager [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Terminating instance
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.534 182717 DEBUG nova.compute.manager [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:24:42 compute-1 kernel: tapccbc0d4c-cf (unregistering): left promiscuous mode
Jan 22 00:24:42 compute-1 NetworkManager[54952]: <info>  [1769041482.5604] device (tapccbc0d4c-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:24:42 compute-1 ovn_controller[94841]: 2026-01-22T00:24:42Z|00636|binding|INFO|Releasing lport ccbc0d4c-cf7a-4220-a948-0aeade60dbdb from this chassis (sb_readonly=0)
Jan 22 00:24:42 compute-1 ovn_controller[94841]: 2026-01-22T00:24:42Z|00637|binding|INFO|Setting lport ccbc0d4c-cf7a-4220-a948-0aeade60dbdb down in Southbound
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.571 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:42 compute-1 ovn_controller[94841]: 2026-01-22T00:24:42Z|00638|binding|INFO|Removing iface tapccbc0d4c-cf ovn-installed in OVS
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.579 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:d8:57 10.100.0.12'], port_security=['fa:16:3e:17:d8:57 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f457f1e4-8770-4c44-a061-214acbc43199', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c9ca07a4-cd9c-4730-b243-d5bdfe31822a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c6698c9-b140-4a4b-89f4-0ea800814cda, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.580 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ccbc0d4c-cf7a-4220-a948-0aeade60dbdb in datapath c72b0076-9848-49ed-9b2e-d2fe36ac5e52 unbound from our chassis
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.581 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c72b0076-9848-49ed-9b2e-d2fe36ac5e52, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.582 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ba562c2f-0185-435e-8fdd-d8e28d956e0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.583 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52 namespace which is not needed anymore
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.589 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:42 compute-1 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000009f.scope: Deactivated successfully.
Jan 22 00:24:42 compute-1 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000009f.scope: Consumed 12.824s CPU time.
Jan 22 00:24:42 compute-1 systemd-machined[153970]: Machine qemu-69-instance-0000009f terminated.
Jan 22 00:24:42 compute-1 podman[236336]: 2026-01-22 00:24:42.712765909 +0000 UTC m=+0.063541323 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, 
release=1755695350, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:24:42 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[236195]: [NOTICE]   (236199) : haproxy version is 2.8.14-c23fe91
Jan 22 00:24:42 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[236195]: [NOTICE]   (236199) : path to executable is /usr/sbin/haproxy
Jan 22 00:24:42 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[236195]: [WARNING]  (236199) : Exiting Master process...
Jan 22 00:24:42 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[236195]: [WARNING]  (236199) : Exiting Master process...
Jan 22 00:24:42 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[236195]: [ALERT]    (236199) : Current worker (236201) exited with code 143 (Terminated)
Jan 22 00:24:42 compute-1 neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52[236195]: [WARNING]  (236199) : All workers exited. Exiting... (0)
Jan 22 00:24:42 compute-1 systemd[1]: libpod-48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46.scope: Deactivated successfully.
Jan 22 00:24:42 compute-1 conmon[236195]: conmon 48ebf605699fdfedaac1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46.scope/container/memory.events
Jan 22 00:24:42 compute-1 podman[236363]: 2026-01-22 00:24:42.739346305 +0000 UTC m=+0.054289437 container died 48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.763 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.770 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46-userdata-shm.mount: Deactivated successfully.
Jan 22 00:24:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-72b8be811844df3e6bb11b4e4de8b0a0f3181aa6336177228387e22a38071bcf-merged.mount: Deactivated successfully.
Jan 22 00:24:42 compute-1 podman[236363]: 2026-01-22 00:24:42.794219278 +0000 UTC m=+0.109162400 container cleanup 48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:24:42 compute-1 systemd[1]: libpod-conmon-48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46.scope: Deactivated successfully.
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.804 182717 INFO nova.virt.libvirt.driver [-] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Instance destroyed successfully.
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.804 182717 DEBUG nova.objects.instance [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid f457f1e4-8770-4c44-a061-214acbc43199 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.826 182717 DEBUG nova.virt.libvirt.vif [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1667524508',display_name='tempest-TestNetworkAdvancedServerOps-server-1667524508',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1667524508',id=159,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHHg+wJV0gNqY7bKALzLqoL0K2EZGAuyjbK8gKQ82X6sT1DieFqLVmwGBVxxlHgI5XE+x3/Jhn+MN25qB+bFchaETh8kb1GrgpW5jMhoanYzyulPU1uvIFw8Ac5CluludA==',key_name='tempest-TestNetworkAdvancedServerOps-913147749',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:23:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-an3dhcmb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:24:15Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=f457f1e4-8770-4c44-a061-214acbc43199,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.826 182717 DEBUG nova.network.os_vif_util [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "address": "fa:16:3e:17:d8:57", "network": {"id": "c72b0076-9848-49ed-9b2e-d2fe36ac5e52", "bridge": "br-int", "label": "tempest-network-smoke--1029682171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccbc0d4c-cf", "ovs_interfaceid": "ccbc0d4c-cf7a-4220-a948-0aeade60dbdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.827 182717 DEBUG nova.network.os_vif_util [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:d8:57,bridge_name='br-int',has_traffic_filtering=True,id=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb,network=Network(c72b0076-9848-49ed-9b2e-d2fe36ac5e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccbc0d4c-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.828 182717 DEBUG os_vif [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:d8:57,bridge_name='br-int',has_traffic_filtering=True,id=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb,network=Network(c72b0076-9848-49ed-9b2e-d2fe36ac5e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccbc0d4c-cf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.831 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.831 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapccbc0d4c-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.833 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.836 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.843 182717 INFO os_vif [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:d8:57,bridge_name='br-int',has_traffic_filtering=True,id=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb,network=Network(c72b0076-9848-49ed-9b2e-d2fe36ac5e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccbc0d4c-cf')
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.844 182717 INFO nova.virt.libvirt.driver [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Deleting instance files /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199_del
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.845 182717 INFO nova.virt.libvirt.driver [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Deletion of /var/lib/nova/instances/f457f1e4-8770-4c44-a061-214acbc43199_del complete
Jan 22 00:24:42 compute-1 podman[236413]: 2026-01-22 00:24:42.873638584 +0000 UTC m=+0.052798550 container remove 48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.882 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3de94cff-a2f7-4bac-a779-031df54bb235]: (4, ('Thu Jan 22 12:24:42 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52 (48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46)\n48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46\nThu Jan 22 12:24:42 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52 (48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46)\n48ebf605699fdfedaac1528a2ee8d4e91fb4d51b994d6f98979a87d081ac6e46\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.883 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a43a7423-50ce-4985-a8ea-f0d581d84ce1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.884 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc72b0076-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.886 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:42 compute-1 kernel: tapc72b0076-90: left promiscuous mode
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.900 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.901 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.903 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c8210f9d-6db1-4e0d-ab3f-e121dcb56455]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.915 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4084a551-4947-435f-be1d-18ef805b99b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.917 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[92df2f7f-8d64-44b0-b011-345718982733]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.935 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fc6fd6e7-dab6-493d-a00d-94ef2d89bed9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602850, 'reachable_time': 42134, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236429, 'error': None, 'target': 'ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:42 compute-1 systemd[1]: run-netns-ovnmeta\x2dc72b0076\x2d9848\x2d49ed\x2d9b2e\x2dd2fe36ac5e52.mount: Deactivated successfully.
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.939 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c72b0076-9848-49ed-9b2e-d2fe36ac5e52 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:24:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:42.940 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[1125d2df-63a4-4f6d-9f15-3938b84c6c6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.982 182717 INFO nova.compute.manager [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.983 182717 DEBUG oslo.service.loopingcall [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.984 182717 DEBUG nova.compute.manager [-] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:24:42 compute-1 nova_compute[182713]: 2026-01-22 00:24:42.984 182717 DEBUG nova.network.neutron [-] [instance: f457f1e4-8770-4c44-a061-214acbc43199] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:24:43 compute-1 nova_compute[182713]: 2026-01-22 00:24:43.048 182717 DEBUG nova.compute.manager [req-f7f29ea2-59b2-4901-ab94-16777661e326 req-9961c380-ba11-4278-ad5c-be7e61b786ed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-vif-unplugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:43 compute-1 nova_compute[182713]: 2026-01-22 00:24:43.048 182717 DEBUG oslo_concurrency.lockutils [req-f7f29ea2-59b2-4901-ab94-16777661e326 req-9961c380-ba11-4278-ad5c-be7e61b786ed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:43 compute-1 nova_compute[182713]: 2026-01-22 00:24:43.049 182717 DEBUG oslo_concurrency.lockutils [req-f7f29ea2-59b2-4901-ab94-16777661e326 req-9961c380-ba11-4278-ad5c-be7e61b786ed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:43 compute-1 nova_compute[182713]: 2026-01-22 00:24:43.049 182717 DEBUG oslo_concurrency.lockutils [req-f7f29ea2-59b2-4901-ab94-16777661e326 req-9961c380-ba11-4278-ad5c-be7e61b786ed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:43 compute-1 nova_compute[182713]: 2026-01-22 00:24:43.050 182717 DEBUG nova.compute.manager [req-f7f29ea2-59b2-4901-ab94-16777661e326 req-9961c380-ba11-4278-ad5c-be7e61b786ed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] No waiting events found dispatching network-vif-unplugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:24:43 compute-1 nova_compute[182713]: 2026-01-22 00:24:43.050 182717 DEBUG nova.compute.manager [req-f7f29ea2-59b2-4901-ab94-16777661e326 req-9961c380-ba11-4278-ad5c-be7e61b786ed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-vif-unplugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:24:44 compute-1 nova_compute[182713]: 2026-01-22 00:24:44.062 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:44 compute-1 nova_compute[182713]: 2026-01-22 00:24:44.928 182717 DEBUG nova.network.neutron [-] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.289 182717 DEBUG nova.compute.manager [req-d6e6c40f-ba97-43fe-ab7b-3047683a6959 req-da1feaba-79c0-4eda-9fa2-51926657585e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.289 182717 DEBUG oslo_concurrency.lockutils [req-d6e6c40f-ba97-43fe-ab7b-3047683a6959 req-da1feaba-79c0-4eda-9fa2-51926657585e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f457f1e4-8770-4c44-a061-214acbc43199-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.289 182717 DEBUG oslo_concurrency.lockutils [req-d6e6c40f-ba97-43fe-ab7b-3047683a6959 req-da1feaba-79c0-4eda-9fa2-51926657585e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.290 182717 DEBUG oslo_concurrency.lockutils [req-d6e6c40f-ba97-43fe-ab7b-3047683a6959 req-da1feaba-79c0-4eda-9fa2-51926657585e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.290 182717 DEBUG nova.compute.manager [req-d6e6c40f-ba97-43fe-ab7b-3047683a6959 req-da1feaba-79c0-4eda-9fa2-51926657585e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] No waiting events found dispatching network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.290 182717 WARNING nova.compute.manager [req-d6e6c40f-ba97-43fe-ab7b-3047683a6959 req-da1feaba-79c0-4eda-9fa2-51926657585e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received unexpected event network-vif-plugged-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb for instance with vm_state active and task_state deleting.
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.291 182717 DEBUG nova.compute.manager [req-d6e6c40f-ba97-43fe-ab7b-3047683a6959 req-da1feaba-79c0-4eda-9fa2-51926657585e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Received event network-vif-deleted-ccbc0d4c-cf7a-4220-a948-0aeade60dbdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.291 182717 INFO nova.compute.manager [req-d6e6c40f-ba97-43fe-ab7b-3047683a6959 req-da1feaba-79c0-4eda-9fa2-51926657585e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Neutron deleted interface ccbc0d4c-cf7a-4220-a948-0aeade60dbdb; detaching it from the instance and deleting it from the info cache
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.292 182717 DEBUG nova.network.neutron [req-d6e6c40f-ba97-43fe-ab7b-3047683a6959 req-da1feaba-79c0-4eda-9fa2-51926657585e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.297 182717 INFO nova.compute.manager [-] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Took 2.31 seconds to deallocate network for instance.
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.318 182717 DEBUG nova.compute.manager [req-d6e6c40f-ba97-43fe-ab7b-3047683a6959 req-da1feaba-79c0-4eda-9fa2-51926657585e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Detach interface failed, port_id=ccbc0d4c-cf7a-4220-a948-0aeade60dbdb, reason: Instance f457f1e4-8770-4c44-a061-214acbc43199 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.436 182717 DEBUG oslo_concurrency.lockutils [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.437 182717 DEBUG oslo_concurrency.lockutils [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.528 182717 DEBUG nova.compute.provider_tree [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.553 182717 DEBUG nova.scheduler.client.report [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.586 182717 DEBUG oslo_concurrency.lockutils [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.828 182717 INFO nova.scheduler.client.report [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Deleted allocations for instance f457f1e4-8770-4c44-a061-214acbc43199
Jan 22 00:24:45 compute-1 nova_compute[182713]: 2026-01-22 00:24:45.985 182717 DEBUG oslo_concurrency.lockutils [None req-07844b01-b3c6-4f59-bd07-1bebf0d053bf 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f457f1e4-8770-4c44-a061-214acbc43199" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:46 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:24:46.375 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:47 compute-1 nova_compute[182713]: 2026-01-22 00:24:47.835 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:47 compute-1 nova_compute[182713]: 2026-01-22 00:24:47.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:49 compute-1 nova_compute[182713]: 2026-01-22 00:24:49.064 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:50 compute-1 nova_compute[182713]: 2026-01-22 00:24:50.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:51 compute-1 nova_compute[182713]: 2026-01-22 00:24:51.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:51 compute-1 nova_compute[182713]: 2026-01-22 00:24:51.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:51 compute-1 nova_compute[182713]: 2026-01-22 00:24:51.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:24:52 compute-1 nova_compute[182713]: 2026-01-22 00:24:52.602 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:52 compute-1 nova_compute[182713]: 2026-01-22 00:24:52.768 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:52 compute-1 nova_compute[182713]: 2026-01-22 00:24:52.837 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:53 compute-1 podman[236432]: 2026-01-22 00:24:53.585874712 +0000 UTC m=+0.065843706 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:24:53 compute-1 podman[236431]: 2026-01-22 00:24:53.635265244 +0000 UTC m=+0.115700373 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:24:54 compute-1 nova_compute[182713]: 2026-01-22 00:24:54.066 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:55 compute-1 nova_compute[182713]: 2026-01-22 00:24:55.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:55 compute-1 nova_compute[182713]: 2026-01-22 00:24:55.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:55 compute-1 nova_compute[182713]: 2026-01-22 00:24:55.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:55 compute-1 nova_compute[182713]: 2026-01-22 00:24:55.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:55 compute-1 nova_compute[182713]: 2026-01-22 00:24:55.883 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.035 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.036 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5713MB free_disk=73.26041793823242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.037 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.037 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.160 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.160 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.189 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.227 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.227 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.256 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.295 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.320 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.337 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.380 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:24:56 compute-1 nova_compute[182713]: 2026-01-22 00:24:56.381 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:57 compute-1 podman[236483]: 2026-01-22 00:24:57.588080245 +0000 UTC m=+0.073849574 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:24:57 compute-1 podman[236482]: 2026-01-22 00:24:57.606042993 +0000 UTC m=+0.087631141 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 00:24:57 compute-1 nova_compute[182713]: 2026-01-22 00:24:57.803 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041482.802418, f457f1e4-8770-4c44-a061-214acbc43199 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:24:57 compute-1 nova_compute[182713]: 2026-01-22 00:24:57.803 182717 INFO nova.compute.manager [-] [instance: f457f1e4-8770-4c44-a061-214acbc43199] VM Stopped (Lifecycle Event)
Jan 22 00:24:57 compute-1 nova_compute[182713]: 2026-01-22 00:24:57.838 182717 DEBUG nova.compute.manager [None req-8fc7a172-e649-4ace-9213-f697d64f1109 - - - - - -] [instance: f457f1e4-8770-4c44-a061-214acbc43199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:24:57 compute-1 nova_compute[182713]: 2026-01-22 00:24:57.873 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:58 compute-1 nova_compute[182713]: 2026-01-22 00:24:58.381 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:58 compute-1 nova_compute[182713]: 2026-01-22 00:24:58.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:58 compute-1 nova_compute[182713]: 2026-01-22 00:24:58.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:24:58 compute-1 nova_compute[182713]: 2026-01-22 00:24:58.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:24:58 compute-1 nova_compute[182713]: 2026-01-22 00:24:58.873 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:24:58 compute-1 nova_compute[182713]: 2026-01-22 00:24:58.873 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:58 compute-1 nova_compute[182713]: 2026-01-22 00:24:58.874 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:59 compute-1 nova_compute[182713]: 2026-01-22 00:24:59.069 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:02 compute-1 nova_compute[182713]: 2026-01-22 00:25:02.877 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:03.034 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:03.035 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:03.035 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:04 compute-1 nova_compute[182713]: 2026-01-22 00:25:04.070 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:07 compute-1 nova_compute[182713]: 2026-01-22 00:25:07.880 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:09 compute-1 nova_compute[182713]: 2026-01-22 00:25:09.072 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:10 compute-1 podman[236524]: 2026-01-22 00:25:10.589601592 +0000 UTC m=+0.071331686 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:25:12 compute-1 nova_compute[182713]: 2026-01-22 00:25:12.883 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:13 compute-1 podman[236545]: 2026-01-22 00:25:13.565931539 +0000 UTC m=+0.061944875 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6)
Jan 22 00:25:14 compute-1 nova_compute[182713]: 2026-01-22 00:25:14.074 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:17 compute-1 nova_compute[182713]: 2026-01-22 00:25:17.886 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:19 compute-1 nova_compute[182713]: 2026-01-22 00:25:19.076 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:20 compute-1 nova_compute[182713]: 2026-01-22 00:25:20.876 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:20 compute-1 nova_compute[182713]: 2026-01-22 00:25:20.876 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:20 compute-1 nova_compute[182713]: 2026-01-22 00:25:20.904 182717 DEBUG nova.compute.manager [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.061 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.062 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.069 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.069 182717 INFO nova.compute.claims [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.267 182717 DEBUG nova.compute.provider_tree [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.291 182717 DEBUG nova.scheduler.client.report [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.323 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.324 182717 DEBUG nova.compute.manager [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.404 182717 DEBUG nova.compute.manager [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.405 182717 DEBUG nova.network.neutron [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.425 182717 INFO nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.466 182717 DEBUG nova.compute.manager [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.848 182717 DEBUG nova.compute.manager [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.850 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.851 182717 INFO nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Creating image(s)
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.852 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "/var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.852 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.854 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.871 182717 DEBUG nova.policy [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.874 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.933 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.934 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.935 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:21 compute-1 nova_compute[182713]: 2026-01-22 00:25:21.955 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.011 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.012 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.061 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.062 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.063 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.131 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.132 182717 DEBUG nova.virt.disk.api [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Checking if we can resize image /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.133 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.227 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.228 182717 DEBUG nova.virt.disk.api [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Cannot resize image /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.229 182717 DEBUG nova.objects.instance [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'migration_context' on Instance uuid 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.247 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.248 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Ensure instance console log exists: /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.248 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.249 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.249 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:22.735 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.736 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:22.736 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:25:22 compute-1 nova_compute[182713]: 2026-01-22 00:25:22.888 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:24 compute-1 nova_compute[182713]: 2026-01-22 00:25:24.078 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:24 compute-1 podman[236582]: 2026-01-22 00:25:24.584659488 +0000 UTC m=+0.062926955 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:25:24 compute-1 nova_compute[182713]: 2026-01-22 00:25:24.604 182717 DEBUG nova.network.neutron [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Successfully created port: 00f97767-091d-496b-9dc0-f66f15a21142 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:25:24 compute-1 podman[236581]: 2026-01-22 00:25:24.607958172 +0000 UTC m=+0.097398495 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:25:25 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:25.738 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:26 compute-1 nova_compute[182713]: 2026-01-22 00:25:26.171 182717 DEBUG nova.network.neutron [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Successfully updated port: 00f97767-091d-496b-9dc0-f66f15a21142 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:25:26 compute-1 nova_compute[182713]: 2026-01-22 00:25:26.614 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "refresh_cache-7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:25:26 compute-1 nova_compute[182713]: 2026-01-22 00:25:26.615 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquired lock "refresh_cache-7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:25:26 compute-1 nova_compute[182713]: 2026-01-22 00:25:26.615 182717 DEBUG nova.network.neutron [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:25:26 compute-1 nova_compute[182713]: 2026-01-22 00:25:26.658 182717 DEBUG nova.compute.manager [req-9b5c3331-d3e6-4efe-a217-313a56e2d652 req-dc75c86b-f7a1-4da2-b43f-b71827f0c1e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Received event network-changed-00f97767-091d-496b-9dc0-f66f15a21142 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:26 compute-1 nova_compute[182713]: 2026-01-22 00:25:26.659 182717 DEBUG nova.compute.manager [req-9b5c3331-d3e6-4efe-a217-313a56e2d652 req-dc75c86b-f7a1-4da2-b43f-b71827f0c1e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Refreshing instance network info cache due to event network-changed-00f97767-091d-496b-9dc0-f66f15a21142. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:25:26 compute-1 nova_compute[182713]: 2026-01-22 00:25:26.659 182717 DEBUG oslo_concurrency.lockutils [req-9b5c3331-d3e6-4efe-a217-313a56e2d652 req-dc75c86b-f7a1-4da2-b43f-b71827f0c1e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:25:27 compute-1 nova_compute[182713]: 2026-01-22 00:25:27.493 182717 DEBUG nova.network.neutron [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:25:27 compute-1 nova_compute[182713]: 2026-01-22 00:25:27.891 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:28 compute-1 podman[236631]: 2026-01-22 00:25:28.566532691 +0000 UTC m=+0.060788679 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:25:28 compute-1 podman[236632]: 2026-01-22 00:25:28.599002318 +0000 UTC m=+0.080312904 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:25:29 compute-1 nova_compute[182713]: 2026-01-22 00:25:29.080 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.856 182717 DEBUG nova.network.neutron [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Updating instance_info_cache with network_info: [{"id": "00f97767-091d-496b-9dc0-f66f15a21142", "address": "fa:16:3e:3d:55:4f", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00f97767-09", "ovs_interfaceid": "00f97767-091d-496b-9dc0-f66f15a21142", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.881 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Releasing lock "refresh_cache-7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.882 182717 DEBUG nova.compute.manager [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Instance network_info: |[{"id": "00f97767-091d-496b-9dc0-f66f15a21142", "address": "fa:16:3e:3d:55:4f", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00f97767-09", "ovs_interfaceid": "00f97767-091d-496b-9dc0-f66f15a21142", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.882 182717 DEBUG oslo_concurrency.lockutils [req-9b5c3331-d3e6-4efe-a217-313a56e2d652 req-dc75c86b-f7a1-4da2-b43f-b71827f0c1e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.882 182717 DEBUG nova.network.neutron [req-9b5c3331-d3e6-4efe-a217-313a56e2d652 req-dc75c86b-f7a1-4da2-b43f-b71827f0c1e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Refreshing network info cache for port 00f97767-091d-496b-9dc0-f66f15a21142 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.886 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Start _get_guest_xml network_info=[{"id": "00f97767-091d-496b-9dc0-f66f15a21142", "address": "fa:16:3e:3d:55:4f", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00f97767-09", "ovs_interfaceid": "00f97767-091d-496b-9dc0-f66f15a21142", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.891 182717 WARNING nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.905 182717 DEBUG nova.virt.libvirt.host [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.906 182717 DEBUG nova.virt.libvirt.host [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.910 182717 DEBUG nova.virt.libvirt.host [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.911 182717 DEBUG nova.virt.libvirt.host [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.913 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.914 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.914 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.915 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.916 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.916 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.916 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.917 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.917 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.918 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.918 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.919 182717 DEBUG nova.virt.hardware [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.925 182717 DEBUG nova.virt.libvirt.vif [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:25:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1423755979',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1423755979',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ge',id=163,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA7jwxBkdWOz7w57AiaRcIngM4Y6BG0RAdXKZN1lpSf4fY7AaWS+RG49OCjvRqIpg6m9+OlWKeWKEGH6c13ztIF3i7IhSM5D4o2yMlEeDmvrwLxAQoPueCNJW1uOa0WZAw==',key_name='tempest-TestSecurityGroupsBasicOps-1448768175',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-68ni0ee7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:25:21Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=7530f1b4-dc2b-4291-b4c1-fb75b3542e2b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "00f97767-091d-496b-9dc0-f66f15a21142", "address": "fa:16:3e:3d:55:4f", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00f97767-09", "ovs_interfaceid": "00f97767-091d-496b-9dc0-f66f15a21142", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.926 182717 DEBUG nova.network.os_vif_util [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "00f97767-091d-496b-9dc0-f66f15a21142", "address": "fa:16:3e:3d:55:4f", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00f97767-09", "ovs_interfaceid": "00f97767-091d-496b-9dc0-f66f15a21142", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.927 182717 DEBUG nova.network.os_vif_util [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:55:4f,bridge_name='br-int',has_traffic_filtering=True,id=00f97767-091d-496b-9dc0-f66f15a21142,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00f97767-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.929 182717 DEBUG nova.objects.instance [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.947 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:25:31 compute-1 nova_compute[182713]:   <uuid>7530f1b4-dc2b-4291-b4c1-fb75b3542e2b</uuid>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   <name>instance-000000a3</name>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1423755979</nova:name>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:25:31</nova:creationTime>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:25:31 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:25:31 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:25:31 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:25:31 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:25:31 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:25:31 compute-1 nova_compute[182713]:         <nova:user uuid="a60ce2b7b7ae47b484de12add551b287">tempest-TestSecurityGroupsBasicOps-1492736128-project-member</nova:user>
Jan 22 00:25:31 compute-1 nova_compute[182713]:         <nova:project uuid="02bcfc5f1f1044a3856e73a5938ff011">tempest-TestSecurityGroupsBasicOps-1492736128</nova:project>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:25:31 compute-1 nova_compute[182713]:         <nova:port uuid="00f97767-091d-496b-9dc0-f66f15a21142">
Jan 22 00:25:31 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <system>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <entry name="serial">7530f1b4-dc2b-4291-b4c1-fb75b3542e2b</entry>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <entry name="uuid">7530f1b4-dc2b-4291-b4c1-fb75b3542e2b</entry>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     </system>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   <os>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   </os>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   <features>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   </features>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk.config"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:3d:55:4f"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <target dev="tap00f97767-09"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/console.log" append="off"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <video>
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     </video>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:25:31 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:25:31 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:25:31 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:25:31 compute-1 nova_compute[182713]: </domain>
Jan 22 00:25:31 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.949 182717 DEBUG nova.compute.manager [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Preparing to wait for external event network-vif-plugged-00f97767-091d-496b-9dc0-f66f15a21142 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.950 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.950 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.950 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.951 182717 DEBUG nova.virt.libvirt.vif [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:25:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1423755979',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1423755979',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ge',id=163,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA7jwxBkdWOz7w57AiaRcIngM4Y6BG0RAdXKZN1lpSf4fY7AaWS+RG49OCjvRqIpg6m9+OlWKeWKEGH6c13ztIF3i7IhSM5D4o2yMlEeDmvrwLxAQoPueCNJW1uOa0WZAw==',key_name='tempest-TestSecurityGroupsBasicOps-1448768175',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-68ni0ee7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:25:21Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=7530f1b4-dc2b-4291-b4c1-fb75b3542e2b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "00f97767-091d-496b-9dc0-f66f15a21142", "address": "fa:16:3e:3d:55:4f", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00f97767-09", "ovs_interfaceid": "00f97767-091d-496b-9dc0-f66f15a21142", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.951 182717 DEBUG nova.network.os_vif_util [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "00f97767-091d-496b-9dc0-f66f15a21142", "address": "fa:16:3e:3d:55:4f", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00f97767-09", "ovs_interfaceid": "00f97767-091d-496b-9dc0-f66f15a21142", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.952 182717 DEBUG nova.network.os_vif_util [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:55:4f,bridge_name='br-int',has_traffic_filtering=True,id=00f97767-091d-496b-9dc0-f66f15a21142,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00f97767-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.953 182717 DEBUG os_vif [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:55:4f,bridge_name='br-int',has_traffic_filtering=True,id=00f97767-091d-496b-9dc0-f66f15a21142,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00f97767-09') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.953 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.954 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.954 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.962 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.962 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00f97767-09, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.963 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap00f97767-09, col_values=(('external_ids', {'iface-id': '00f97767-091d-496b-9dc0-f66f15a21142', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3d:55:4f', 'vm-uuid': '7530f1b4-dc2b-4291-b4c1-fb75b3542e2b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.964 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:31 compute-1 NetworkManager[54952]: <info>  [1769041531.9664] manager: (tap00f97767-09): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.968 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.973 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:31 compute-1 nova_compute[182713]: 2026-01-22 00:25:31.974 182717 INFO os_vif [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:55:4f,bridge_name='br-int',has_traffic_filtering=True,id=00f97767-091d-496b-9dc0-f66f15a21142,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00f97767-09')
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.035 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.036 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.036 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No VIF found with MAC fa:16:3e:3d:55:4f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.036 182717 INFO nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Using config drive
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.576 182717 INFO nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Creating config drive at /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk.config
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.580 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpru3idqze execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.720 182717 DEBUG oslo_concurrency.processutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpru3idqze" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:32 compute-1 kernel: tap00f97767-09: entered promiscuous mode
Jan 22 00:25:32 compute-1 NetworkManager[54952]: <info>  [1769041532.8000] manager: (tap00f97767-09): new Tun device (/org/freedesktop/NetworkManager/Devices/293)
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.801 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.805 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:32 compute-1 ovn_controller[94841]: 2026-01-22T00:25:32Z|00639|binding|INFO|Claiming lport 00f97767-091d-496b-9dc0-f66f15a21142 for this chassis.
Jan 22 00:25:32 compute-1 ovn_controller[94841]: 2026-01-22T00:25:32Z|00640|binding|INFO|00f97767-091d-496b-9dc0-f66f15a21142: Claiming fa:16:3e:3d:55:4f 10.100.0.6
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.810 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.813 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.826 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:32 compute-1 NetworkManager[54952]: <info>  [1769041532.8279] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Jan 22 00:25:32 compute-1 NetworkManager[54952]: <info>  [1769041532.8286] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.835 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:55:4f 10.100.0.6'], port_security=['fa:16:3e:3d:55:4f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7530f1b4-dc2b-4291-b4c1-fb75b3542e2b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cf342e9e-efd0-4e0e-9d6f-a5a24378b540', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=48468636-833b-49e3-b1b9-d984040b8ee3, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=00f97767-091d-496b-9dc0-f66f15a21142) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.836 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 00f97767-091d-496b-9dc0-f66f15a21142 in datapath 5c39d2a7-2c89-4543-a593-0bbe9a34dfef bound to our chassis
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.838 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5c39d2a7-2c89-4543-a593-0bbe9a34dfef
Jan 22 00:25:32 compute-1 systemd-udevd[236694]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:25:32 compute-1 systemd-machined[153970]: New machine qemu-70-instance-000000a3.
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.856 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7a6730b7-e468-4314-a595-43585c1dacb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.858 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5c39d2a7-21 in ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.860 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5c39d2a7-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:25:32 compute-1 NetworkManager[54952]: <info>  [1769041532.8624] device (tap00f97767-09): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.860 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[005f397e-2bb5-49ca-b672-edc18be197ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:32 compute-1 NetworkManager[54952]: <info>  [1769041532.8632] device (tap00f97767-09): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.863 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[32cb4847-1467-4f2a-89ae-0f80323440c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:32 compute-1 systemd[1]: Started Virtual Machine qemu-70-instance-000000a3.
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.881 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[b2e027b5-688e-45a7-9b96-9c25913a38ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.912 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8b12c338-4f23-4e26-a5f5-f35288d58662]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.952 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab59d7d-5a3d-4111-a2d8-62f208f363f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:32.965 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5f7bc62a-44b8-444c-8902-ad0320360cfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:32 compute-1 NetworkManager[54952]: <info>  [1769041532.9668] manager: (tap5c39d2a7-20): new Veth device (/org/freedesktop/NetworkManager/Devices/296)
Jan 22 00:25:32 compute-1 systemd-udevd[236697]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.968 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.990 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:32 compute-1 ovn_controller[94841]: 2026-01-22T00:25:32Z|00641|binding|INFO|Setting lport 00f97767-091d-496b-9dc0-f66f15a21142 ovn-installed in OVS
Jan 22 00:25:32 compute-1 ovn_controller[94841]: 2026-01-22T00:25:32Z|00642|binding|INFO|Setting lport 00f97767-091d-496b-9dc0-f66f15a21142 up in Southbound
Jan 22 00:25:32 compute-1 nova_compute[182713]: 2026-01-22 00:25:32.997 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.012 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[9535c035-9980-4178-a12a-d31cc374dbac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.015 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[421b0717-dd36-4003-951e-e1f76c5a6445]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:33 compute-1 NetworkManager[54952]: <info>  [1769041533.0454] device (tap5c39d2a7-20): carrier: link connected
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.052 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[06b614c3-6ad0-4a0d-9a3c-eebc8bd28b91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.072 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[da439c1c-1d34-4248-840f-1301863d3e8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c39d2a7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:5b:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610569, 'reachable_time': 32183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236727, 'error': None, 'target': 'ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.089 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[024f6526-3618-423a-bef7-4aa47c99a7b0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:5bec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 610569, 'tstamp': 610569}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236728, 'error': None, 'target': 'ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.116 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0310f4ab-bb9e-4dd3-b6ed-61b47d5071a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c39d2a7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:5b:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610569, 'reachable_time': 32183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236729, 'error': None, 'target': 'ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.151 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[37c23e2b-e699-4bdb-8f52-624d5eedd29b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.208 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1139a29b-4401-4f0f-9058-0a298ec3af0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.209 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c39d2a7-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.210 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.210 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c39d2a7-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:33 compute-1 NetworkManager[54952]: <info>  [1769041533.2127] manager: (tap5c39d2a7-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.212 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:33 compute-1 kernel: tap5c39d2a7-20: entered promiscuous mode
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.215 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5c39d2a7-20, col_values=(('external_ids', {'iface-id': 'dbac63f8-5924-480d-ac2c-ed6dee0a255b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.215 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:33 compute-1 ovn_controller[94841]: 2026-01-22T00:25:33Z|00643|binding|INFO|Releasing lport dbac63f8-5924-480d-ac2c-ed6dee0a255b from this chassis (sb_readonly=0)
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.228 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.230 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5c39d2a7-2c89-4543-a593-0bbe9a34dfef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5c39d2a7-2c89-4543-a593-0bbe9a34dfef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.231 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7100f5f7-8358-447e-b465-1c341c283a8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.232 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-5c39d2a7-2c89-4543-a593-0bbe9a34dfef
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/5c39d2a7-2c89-4543-a593-0bbe9a34dfef.pid.haproxy
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 5c39d2a7-2c89-4543-a593-0bbe9a34dfef
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:25:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:33.232 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'env', 'PROCESS_TAG=haproxy-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5c39d2a7-2c89-4543-a593-0bbe9a34dfef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.252 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041533.2515545, 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.252 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] VM Started (Lifecycle Event)
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.281 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.285 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041533.2545912, 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.286 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] VM Paused (Lifecycle Event)
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.308 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.313 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.336 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.359 182717 DEBUG nova.compute.manager [req-3a167dc7-259e-4ada-b859-d61c3e773878 req-7703f2c7-2e95-4907-b9ce-b86f6c4de526 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Received event network-vif-plugged-00f97767-091d-496b-9dc0-f66f15a21142 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.360 182717 DEBUG oslo_concurrency.lockutils [req-3a167dc7-259e-4ada-b859-d61c3e773878 req-7703f2c7-2e95-4907-b9ce-b86f6c4de526 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.360 182717 DEBUG oslo_concurrency.lockutils [req-3a167dc7-259e-4ada-b859-d61c3e773878 req-7703f2c7-2e95-4907-b9ce-b86f6c4de526 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.360 182717 DEBUG oslo_concurrency.lockutils [req-3a167dc7-259e-4ada-b859-d61c3e773878 req-7703f2c7-2e95-4907-b9ce-b86f6c4de526 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.360 182717 DEBUG nova.compute.manager [req-3a167dc7-259e-4ada-b859-d61c3e773878 req-7703f2c7-2e95-4907-b9ce-b86f6c4de526 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Processing event network-vif-plugged-00f97767-091d-496b-9dc0-f66f15a21142 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.361 182717 DEBUG nova.compute.manager [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.366 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041533.3666236, 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.367 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] VM Resumed (Lifecycle Event)
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.369 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.374 182717 INFO nova.virt.libvirt.driver [-] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Instance spawned successfully.
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.375 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.395 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.399 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.400 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.400 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.401 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.401 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.402 182717 DEBUG nova.virt.libvirt.driver [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.406 182717 DEBUG nova.network.neutron [req-9b5c3331-d3e6-4efe-a217-313a56e2d652 req-dc75c86b-f7a1-4da2-b43f-b71827f0c1e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Updated VIF entry in instance network info cache for port 00f97767-091d-496b-9dc0-f66f15a21142. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.406 182717 DEBUG nova.network.neutron [req-9b5c3331-d3e6-4efe-a217-313a56e2d652 req-dc75c86b-f7a1-4da2-b43f-b71827f0c1e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Updating instance_info_cache with network_info: [{"id": "00f97767-091d-496b-9dc0-f66f15a21142", "address": "fa:16:3e:3d:55:4f", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00f97767-09", "ovs_interfaceid": "00f97767-091d-496b-9dc0-f66f15a21142", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.409 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.448 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.448 182717 DEBUG oslo_concurrency.lockutils [req-9b5c3331-d3e6-4efe-a217-313a56e2d652 req-dc75c86b-f7a1-4da2-b43f-b71827f0c1e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.491 182717 INFO nova.compute.manager [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Took 11.64 seconds to spawn the instance on the hypervisor.
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.492 182717 DEBUG nova.compute.manager [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.600 182717 INFO nova.compute.manager [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Took 12.60 seconds to build instance.
Jan 22 00:25:33 compute-1 podman[236768]: 2026-01-22 00:25:33.621339139 +0000 UTC m=+0.066043011 container create fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:25:33 compute-1 nova_compute[182713]: 2026-01-22 00:25:33.626 182717 DEBUG oslo_concurrency.lockutils [None req-76f04832-7a74-448f-b926-bd392c1665bc a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:33 compute-1 systemd[1]: Started libpod-conmon-fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483.scope.
Jan 22 00:25:33 compute-1 podman[236768]: 2026-01-22 00:25:33.576671742 +0000 UTC m=+0.021375624 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:25:33 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:25:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88ef5f22e01b3820ac8bdaf5cf91e7fb7f8fdf50bda1156b0a3ee09041451a44/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:25:33 compute-1 podman[236768]: 2026-01-22 00:25:33.717974249 +0000 UTC m=+0.162678151 container init fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:25:33 compute-1 podman[236768]: 2026-01-22 00:25:33.723388167 +0000 UTC m=+0.168092029 container start fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 00:25:33 compute-1 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[236783]: [NOTICE]   (236787) : New worker (236789) forked
Jan 22 00:25:33 compute-1 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[236783]: [NOTICE]   (236787) : Loading success.
Jan 22 00:25:34 compute-1 nova_compute[182713]: 2026-01-22 00:25:34.083 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:35 compute-1 nova_compute[182713]: 2026-01-22 00:25:35.454 182717 DEBUG nova.compute.manager [req-768d8055-d0bf-4478-90fb-b8ac1ffd07d7 req-647241ea-3b00-40cf-8092-9bff4276e5bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Received event network-vif-plugged-00f97767-091d-496b-9dc0-f66f15a21142 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:35 compute-1 nova_compute[182713]: 2026-01-22 00:25:35.454 182717 DEBUG oslo_concurrency.lockutils [req-768d8055-d0bf-4478-90fb-b8ac1ffd07d7 req-647241ea-3b00-40cf-8092-9bff4276e5bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:35 compute-1 nova_compute[182713]: 2026-01-22 00:25:35.455 182717 DEBUG oslo_concurrency.lockutils [req-768d8055-d0bf-4478-90fb-b8ac1ffd07d7 req-647241ea-3b00-40cf-8092-9bff4276e5bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:35 compute-1 nova_compute[182713]: 2026-01-22 00:25:35.455 182717 DEBUG oslo_concurrency.lockutils [req-768d8055-d0bf-4478-90fb-b8ac1ffd07d7 req-647241ea-3b00-40cf-8092-9bff4276e5bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:35 compute-1 nova_compute[182713]: 2026-01-22 00:25:35.456 182717 DEBUG nova.compute.manager [req-768d8055-d0bf-4478-90fb-b8ac1ffd07d7 req-647241ea-3b00-40cf-8092-9bff4276e5bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] No waiting events found dispatching network-vif-plugged-00f97767-091d-496b-9dc0-f66f15a21142 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:25:35 compute-1 nova_compute[182713]: 2026-01-22 00:25:35.456 182717 WARNING nova.compute.manager [req-768d8055-d0bf-4478-90fb-b8ac1ffd07d7 req-647241ea-3b00-40cf-8092-9bff4276e5bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Received unexpected event network-vif-plugged-00f97767-091d-496b-9dc0-f66f15a21142 for instance with vm_state active and task_state None.
Jan 22 00:25:36 compute-1 nova_compute[182713]: 2026-01-22 00:25:36.967 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:37 compute-1 nova_compute[182713]: 2026-01-22 00:25:37.562 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:37 compute-1 nova_compute[182713]: 2026-01-22 00:25:37.910 182717 DEBUG nova.compute.manager [req-0941d5b5-d376-4897-8bc3-e3a7d4416ebd req-4d216e17-961a-4e89-b4f5-b1128d96265f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Received event network-changed-00f97767-091d-496b-9dc0-f66f15a21142 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:37 compute-1 nova_compute[182713]: 2026-01-22 00:25:37.911 182717 DEBUG nova.compute.manager [req-0941d5b5-d376-4897-8bc3-e3a7d4416ebd req-4d216e17-961a-4e89-b4f5-b1128d96265f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Refreshing instance network info cache due to event network-changed-00f97767-091d-496b-9dc0-f66f15a21142. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:25:37 compute-1 nova_compute[182713]: 2026-01-22 00:25:37.911 182717 DEBUG oslo_concurrency.lockutils [req-0941d5b5-d376-4897-8bc3-e3a7d4416ebd req-4d216e17-961a-4e89-b4f5-b1128d96265f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:25:37 compute-1 nova_compute[182713]: 2026-01-22 00:25:37.911 182717 DEBUG oslo_concurrency.lockutils [req-0941d5b5-d376-4897-8bc3-e3a7d4416ebd req-4d216e17-961a-4e89-b4f5-b1128d96265f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:25:37 compute-1 nova_compute[182713]: 2026-01-22 00:25:37.911 182717 DEBUG nova.network.neutron [req-0941d5b5-d376-4897-8bc3-e3a7d4416ebd req-4d216e17-961a-4e89-b4f5-b1128d96265f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Refreshing network info cache for port 00f97767-091d-496b-9dc0-f66f15a21142 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:25:39 compute-1 nova_compute[182713]: 2026-01-22 00:25:39.086 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:40 compute-1 nova_compute[182713]: 2026-01-22 00:25:40.182 182717 DEBUG nova.network.neutron [req-0941d5b5-d376-4897-8bc3-e3a7d4416ebd req-4d216e17-961a-4e89-b4f5-b1128d96265f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Updated VIF entry in instance network info cache for port 00f97767-091d-496b-9dc0-f66f15a21142. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:25:40 compute-1 nova_compute[182713]: 2026-01-22 00:25:40.183 182717 DEBUG nova.network.neutron [req-0941d5b5-d376-4897-8bc3-e3a7d4416ebd req-4d216e17-961a-4e89-b4f5-b1128d96265f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Updating instance_info_cache with network_info: [{"id": "00f97767-091d-496b-9dc0-f66f15a21142", "address": "fa:16:3e:3d:55:4f", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00f97767-09", "ovs_interfaceid": "00f97767-091d-496b-9dc0-f66f15a21142", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:25:40 compute-1 nova_compute[182713]: 2026-01-22 00:25:40.216 182717 DEBUG oslo_concurrency.lockutils [req-0941d5b5-d376-4897-8bc3-e3a7d4416ebd req-4d216e17-961a-4e89-b4f5-b1128d96265f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:25:40 compute-1 nova_compute[182713]: 2026-01-22 00:25:40.789 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:41 compute-1 podman[236798]: 2026-01-22 00:25:41.625070234 +0000 UTC m=+0.097809068 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:25:41 compute-1 nova_compute[182713]: 2026-01-22 00:25:41.969 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:44 compute-1 nova_compute[182713]: 2026-01-22 00:25:44.090 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:44 compute-1 podman[236819]: 2026-01-22 00:25:44.616615632 +0000 UTC m=+0.092431970 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal)
Jan 22 00:25:46 compute-1 nova_compute[182713]: 2026-01-22 00:25:46.977 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:47 compute-1 ovn_controller[94841]: 2026-01-22T00:25:47Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3d:55:4f 10.100.0.6
Jan 22 00:25:47 compute-1 ovn_controller[94841]: 2026-01-22T00:25:47Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3d:55:4f 10.100.0.6
Jan 22 00:25:49 compute-1 nova_compute[182713]: 2026-01-22 00:25:49.093 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:49 compute-1 nova_compute[182713]: 2026-01-22 00:25:49.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:51 compute-1 nova_compute[182713]: 2026-01-22 00:25:51.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:51 compute-1 nova_compute[182713]: 2026-01-22 00:25:51.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:25:51 compute-1 nova_compute[182713]: 2026-01-22 00:25:51.982 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:52 compute-1 nova_compute[182713]: 2026-01-22 00:25:52.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:53 compute-1 nova_compute[182713]: 2026-01-22 00:25:53.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:54 compute-1 nova_compute[182713]: 2026-01-22 00:25:54.095 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:54 compute-1 nova_compute[182713]: 2026-01-22 00:25:54.876 182717 DEBUG oslo_concurrency.lockutils [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:54 compute-1 nova_compute[182713]: 2026-01-22 00:25:54.877 182717 DEBUG oslo_concurrency.lockutils [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:54 compute-1 nova_compute[182713]: 2026-01-22 00:25:54.878 182717 DEBUG oslo_concurrency.lockutils [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:54 compute-1 nova_compute[182713]: 2026-01-22 00:25:54.878 182717 DEBUG oslo_concurrency.lockutils [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:54 compute-1 nova_compute[182713]: 2026-01-22 00:25:54.878 182717 DEBUG oslo_concurrency.lockutils [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:54 compute-1 nova_compute[182713]: 2026-01-22 00:25:54.893 182717 INFO nova.compute.manager [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Terminating instance
Jan 22 00:25:54 compute-1 nova_compute[182713]: 2026-01-22 00:25:54.906 182717 DEBUG nova.compute.manager [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:25:54 compute-1 kernel: tap00f97767-09 (unregistering): left promiscuous mode
Jan 22 00:25:54 compute-1 NetworkManager[54952]: <info>  [1769041554.9413] device (tap00f97767-09): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:25:54 compute-1 ovn_controller[94841]: 2026-01-22T00:25:54Z|00644|binding|INFO|Releasing lport 00f97767-091d-496b-9dc0-f66f15a21142 from this chassis (sb_readonly=0)
Jan 22 00:25:54 compute-1 ovn_controller[94841]: 2026-01-22T00:25:54Z|00645|binding|INFO|Setting lport 00f97767-091d-496b-9dc0-f66f15a21142 down in Southbound
Jan 22 00:25:54 compute-1 nova_compute[182713]: 2026-01-22 00:25:54.947 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:54 compute-1 ovn_controller[94841]: 2026-01-22T00:25:54Z|00646|binding|INFO|Removing iface tap00f97767-09 ovn-installed in OVS
Jan 22 00:25:54 compute-1 nova_compute[182713]: 2026-01-22 00:25:54.950 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:54.957 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:55:4f 10.100.0.6'], port_security=['fa:16:3e:3d:55:4f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7530f1b4-dc2b-4291-b4c1-fb75b3542e2b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'db612486-819a-4c82-9855-5d1c291fe4bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=48468636-833b-49e3-b1b9-d984040b8ee3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=00f97767-091d-496b-9dc0-f66f15a21142) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:25:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:54.959 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 00f97767-091d-496b-9dc0-f66f15a21142 in datapath 5c39d2a7-2c89-4543-a593-0bbe9a34dfef unbound from our chassis
Jan 22 00:25:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:54.962 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5c39d2a7-2c89-4543-a593-0bbe9a34dfef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:25:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:54.965 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c131429e-171e-428a-9463-a3cf134107e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:54.967 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef namespace which is not needed anymore
Jan 22 00:25:54 compute-1 nova_compute[182713]: 2026-01-22 00:25:54.969 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:55 compute-1 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Jan 22 00:25:55 compute-1 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d000000a3.scope: Consumed 13.293s CPU time.
Jan 22 00:25:55 compute-1 systemd-machined[153970]: Machine qemu-70-instance-000000a3 terminated.
Jan 22 00:25:55 compute-1 podman[236863]: 2026-01-22 00:25:55.047981071 +0000 UTC m=+0.070082706 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:25:55 compute-1 podman[236861]: 2026-01-22 00:25:55.082829833 +0000 UTC m=+0.109843782 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.132 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.140 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:55 compute-1 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[236783]: [NOTICE]   (236787) : haproxy version is 2.8.14-c23fe91
Jan 22 00:25:55 compute-1 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[236783]: [NOTICE]   (236787) : path to executable is /usr/sbin/haproxy
Jan 22 00:25:55 compute-1 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[236783]: [ALERT]    (236787) : Current worker (236789) exited with code 143 (Terminated)
Jan 22 00:25:55 compute-1 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[236783]: [WARNING]  (236787) : All workers exited. Exiting... (0)
Jan 22 00:25:55 compute-1 systemd[1]: libpod-fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483.scope: Deactivated successfully.
Jan 22 00:25:55 compute-1 podman[236930]: 2026-01-22 00:25:55.161422913 +0000 UTC m=+0.067212518 container died fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.187 182717 INFO nova.virt.libvirt.driver [-] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Instance destroyed successfully.
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.188 182717 DEBUG nova.objects.instance [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'resources' on Instance uuid 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:55 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483-userdata-shm.mount: Deactivated successfully.
Jan 22 00:25:55 compute-1 systemd[1]: var-lib-containers-storage-overlay-88ef5f22e01b3820ac8bdaf5cf91e7fb7f8fdf50bda1156b0a3ee09041451a44-merged.mount: Deactivated successfully.
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.206 182717 DEBUG nova.virt.libvirt.vif [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:25:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1423755979',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1423755979',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ge',id=163,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA7jwxBkdWOz7w57AiaRcIngM4Y6BG0RAdXKZN1lpSf4fY7AaWS+RG49OCjvRqIpg6m9+OlWKeWKEGH6c13ztIF3i7IhSM5D4o2yMlEeDmvrwLxAQoPueCNJW1uOa0WZAw==',key_name='tempest-TestSecurityGroupsBasicOps-1448768175',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:25:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-68ni0ee7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:25:33Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=7530f1b4-dc2b-4291-b4c1-fb75b3542e2b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "00f97767-091d-496b-9dc0-f66f15a21142", "address": "fa:16:3e:3d:55:4f", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00f97767-09", "ovs_interfaceid": "00f97767-091d-496b-9dc0-f66f15a21142", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.206 182717 DEBUG nova.network.os_vif_util [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "00f97767-091d-496b-9dc0-f66f15a21142", "address": "fa:16:3e:3d:55:4f", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00f97767-09", "ovs_interfaceid": "00f97767-091d-496b-9dc0-f66f15a21142", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.207 182717 DEBUG nova.network.os_vif_util [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3d:55:4f,bridge_name='br-int',has_traffic_filtering=True,id=00f97767-091d-496b-9dc0-f66f15a21142,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00f97767-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.208 182717 DEBUG os_vif [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:55:4f,bridge_name='br-int',has_traffic_filtering=True,id=00f97767-091d-496b-9dc0-f66f15a21142,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00f97767-09') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:25:55 compute-1 podman[236930]: 2026-01-22 00:25:55.209088952 +0000 UTC m=+0.114878557 container cleanup fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.212 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.212 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00f97767-09, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.215 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.217 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.227 182717 INFO os_vif [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:55:4f,bridge_name='br-int',has_traffic_filtering=True,id=00f97767-091d-496b-9dc0-f66f15a21142,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00f97767-09')
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.228 182717 INFO nova.virt.libvirt.driver [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Deleting instance files /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b_del
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.230 182717 INFO nova.virt.libvirt.driver [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Deletion of /var/lib/nova/instances/7530f1b4-dc2b-4291-b4c1-fb75b3542e2b_del complete
Jan 22 00:25:55 compute-1 systemd[1]: libpod-conmon-fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483.scope: Deactivated successfully.
Jan 22 00:25:55 compute-1 podman[236973]: 2026-01-22 00:25:55.286273098 +0000 UTC m=+0.046284187 container remove fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:25:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:55.293 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5f9afbc5-3e2b-4a10-846b-2a74a62bb494]: (4, ('Thu Jan 22 12:25:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef (fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483)\nfd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483\nThu Jan 22 12:25:55 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef (fd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483)\nfd975280d1e5ecc9d871f27f376a54a55ecc437196dcefaa5cc946850cfd4483\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:55.295 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f1dacc39-a2b2-4a7a-9403-66a4afcdc87a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:55 compute-1 kernel: tap5c39d2a7-20: left promiscuous mode
Jan 22 00:25:55 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:25:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:55.297 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c39d2a7-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:55 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.299 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.319 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:55.323 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[47e66890-7e67-4cc0-aefa-2ae9ae0924a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:55.339 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5717c34f-1981-40ca-b4c2-034229bc7db7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:55.341 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[83959430-7855-4f53-bc7b-c740cbbbe70a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.367 182717 INFO nova.compute.manager [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Took 0.46 seconds to destroy the instance on the hypervisor.
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.368 182717 DEBUG oslo.service.loopingcall [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.368 182717 DEBUG nova.compute.manager [-] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.369 182717 DEBUG nova.network.neutron [-] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:25:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:55.369 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[599e43ff-1d29-4602-9a31-ba21fab67ad3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610559, 'reachable_time': 20689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236990, 'error': None, 'target': 'ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:55.374 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:25:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:25:55.374 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[e110dad4-4c8c-45ae-a21e-7be108206384]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:55 compute-1 systemd[1]: run-netns-ovnmeta\x2d5c39d2a7\x2d2c89\x2d4543\x2da593\x2d0bbe9a34dfef.mount: Deactivated successfully.
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.391 182717 DEBUG nova.compute.manager [req-68cea94f-0782-44ac-b2da-0b1385b96e3f req-76699e77-d2e9-4dd7-9077-90ebbe01a55e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Received event network-vif-unplugged-00f97767-091d-496b-9dc0-f66f15a21142 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.391 182717 DEBUG oslo_concurrency.lockutils [req-68cea94f-0782-44ac-b2da-0b1385b96e3f req-76699e77-d2e9-4dd7-9077-90ebbe01a55e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.391 182717 DEBUG oslo_concurrency.lockutils [req-68cea94f-0782-44ac-b2da-0b1385b96e3f req-76699e77-d2e9-4dd7-9077-90ebbe01a55e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.392 182717 DEBUG oslo_concurrency.lockutils [req-68cea94f-0782-44ac-b2da-0b1385b96e3f req-76699e77-d2e9-4dd7-9077-90ebbe01a55e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.392 182717 DEBUG nova.compute.manager [req-68cea94f-0782-44ac-b2da-0b1385b96e3f req-76699e77-d2e9-4dd7-9077-90ebbe01a55e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] No waiting events found dispatching network-vif-unplugged-00f97767-091d-496b-9dc0-f66f15a21142 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.392 182717 DEBUG nova.compute.manager [req-68cea94f-0782-44ac-b2da-0b1385b96e3f req-76699e77-d2e9-4dd7-9077-90ebbe01a55e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Received event network-vif-unplugged-00f97767-091d-496b-9dc0-f66f15a21142 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:25:55 compute-1 nova_compute[182713]: 2026-01-22 00:25:55.850 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:56 compute-1 nova_compute[182713]: 2026-01-22 00:25:56.590 182717 DEBUG nova.network.neutron [-] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:25:56 compute-1 nova_compute[182713]: 2026-01-22 00:25:56.620 182717 INFO nova.compute.manager [-] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Took 1.25 seconds to deallocate network for instance.
Jan 22 00:25:56 compute-1 nova_compute[182713]: 2026-01-22 00:25:56.734 182717 DEBUG nova.compute.manager [req-04eda0a3-7fb1-496d-8af5-4f5ed0459b0c req-a01a3acb-c99b-4137-be00-bfea647c2e24 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Received event network-vif-deleted-00f97767-091d-496b-9dc0-f66f15a21142 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:56 compute-1 nova_compute[182713]: 2026-01-22 00:25:56.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.085 182717 DEBUG oslo_concurrency.lockutils [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.086 182717 DEBUG oslo_concurrency.lockutils [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.088 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.205 182717 DEBUG nova.compute.provider_tree [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.222 182717 DEBUG nova.scheduler.client.report [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.247 182717 DEBUG oslo_concurrency.lockutils [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.249 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.250 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.250 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.289 182717 INFO nova.scheduler.client.report [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Deleted allocations for instance 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.385 182717 DEBUG oslo_concurrency.lockutils [None req-2ed6adb0-5d90-47fc-a82e-57f6619825e2 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.461 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.463 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5708MB free_disk=73.26041030883789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.463 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.463 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.548 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.549 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.575 182717 DEBUG nova.compute.manager [req-19ce6cb9-06c0-4b7f-9d83-4c03bef3f672 req-218f3866-c577-4da9-a5bc-d7787e839c05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Received event network-vif-plugged-00f97767-091d-496b-9dc0-f66f15a21142 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.575 182717 DEBUG oslo_concurrency.lockutils [req-19ce6cb9-06c0-4b7f-9d83-4c03bef3f672 req-218f3866-c577-4da9-a5bc-d7787e839c05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.575 182717 DEBUG oslo_concurrency.lockutils [req-19ce6cb9-06c0-4b7f-9d83-4c03bef3f672 req-218f3866-c577-4da9-a5bc-d7787e839c05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.576 182717 DEBUG oslo_concurrency.lockutils [req-19ce6cb9-06c0-4b7f-9d83-4c03bef3f672 req-218f3866-c577-4da9-a5bc-d7787e839c05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7530f1b4-dc2b-4291-b4c1-fb75b3542e2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.576 182717 DEBUG nova.compute.manager [req-19ce6cb9-06c0-4b7f-9d83-4c03bef3f672 req-218f3866-c577-4da9-a5bc-d7787e839c05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] No waiting events found dispatching network-vif-plugged-00f97767-091d-496b-9dc0-f66f15a21142 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.576 182717 WARNING nova.compute.manager [req-19ce6cb9-06c0-4b7f-9d83-4c03bef3f672 req-218f3866-c577-4da9-a5bc-d7787e839c05 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Received unexpected event network-vif-plugged-00f97767-091d-496b-9dc0-f66f15a21142 for instance with vm_state deleted and task_state None.
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.599 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.618 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.661 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:25:57 compute-1 nova_compute[182713]: 2026-01-22 00:25:57.661 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:58 compute-1 nova_compute[182713]: 2026-01-22 00:25:58.663 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:59 compute-1 nova_compute[182713]: 2026-01-22 00:25:59.097 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:59 compute-1 podman[236992]: 2026-01-22 00:25:59.587890915 +0000 UTC m=+0.073877945 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:25:59 compute-1 podman[236993]: 2026-01-22 00:25:59.607208104 +0000 UTC m=+0.081372397 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:25:59 compute-1 nova_compute[182713]: 2026-01-22 00:25:59.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:59 compute-1 nova_compute[182713]: 2026-01-22 00:25:59.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:00 compute-1 nova_compute[182713]: 2026-01-22 00:26:00.215 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:00 compute-1 nova_compute[182713]: 2026-01-22 00:26:00.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:00 compute-1 nova_compute[182713]: 2026-01-22 00:26:00.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:26:00 compute-1 nova_compute[182713]: 2026-01-22 00:26:00.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:26:00 compute-1 nova_compute[182713]: 2026-01-22 00:26:00.879 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:26:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:03.035 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:03.036 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:03.036 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:04 compute-1 nova_compute[182713]: 2026-01-22 00:26:04.100 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:05 compute-1 nova_compute[182713]: 2026-01-22 00:26:05.217 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:09 compute-1 nova_compute[182713]: 2026-01-22 00:26:09.101 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:09.670 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:26:09 compute-1 nova_compute[182713]: 2026-01-22 00:26:09.670 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:09 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:09.671 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:26:10 compute-1 nova_compute[182713]: 2026-01-22 00:26:10.185 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041555.1842988, 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:26:10 compute-1 nova_compute[182713]: 2026-01-22 00:26:10.186 182717 INFO nova.compute.manager [-] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] VM Stopped (Lifecycle Event)
Jan 22 00:26:10 compute-1 nova_compute[182713]: 2026-01-22 00:26:10.216 182717 DEBUG nova.compute.manager [None req-fb32f976-e4e5-49ce-8402-0f625ad983ab - - - - - -] [instance: 7530f1b4-dc2b-4291-b4c1-fb75b3542e2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:26:10 compute-1 nova_compute[182713]: 2026-01-22 00:26:10.219 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:12 compute-1 podman[237036]: 2026-01-22 00:26:12.604240679 +0000 UTC m=+0.095053422 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:26:14 compute-1 nova_compute[182713]: 2026-01-22 00:26:14.104 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:15 compute-1 nova_compute[182713]: 2026-01-22 00:26:15.221 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:15 compute-1 podman[237058]: 2026-01-22 00:26:15.590965358 +0000 UTC m=+0.072785790 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, release=1755695350, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 22 00:26:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:15.674 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:15 compute-1 nova_compute[182713]: 2026-01-22 00:26:15.837 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:16 compute-1 nova_compute[182713]: 2026-01-22 00:26:16.056 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:19 compute-1 nova_compute[182713]: 2026-01-22 00:26:19.106 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:20 compute-1 nova_compute[182713]: 2026-01-22 00:26:20.260 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.890 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.890 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:26:22.890 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:24 compute-1 nova_compute[182713]: 2026-01-22 00:26:24.110 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:25 compute-1 nova_compute[182713]: 2026-01-22 00:26:25.262 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:25 compute-1 podman[237083]: 2026-01-22 00:26:25.585122084 +0000 UTC m=+0.067055313 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:26:25 compute-1 podman[237082]: 2026-01-22 00:26:25.646177939 +0000 UTC m=+0.125811777 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 22 00:26:29 compute-1 nova_compute[182713]: 2026-01-22 00:26:29.112 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:30 compute-1 nova_compute[182713]: 2026-01-22 00:26:30.264 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:30 compute-1 podman[237132]: 2026-01-22 00:26:30.556948686 +0000 UTC m=+0.054548575 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:26:30 compute-1 podman[237133]: 2026-01-22 00:26:30.570082953 +0000 UTC m=+0.059564580 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:26:34 compute-1 nova_compute[182713]: 2026-01-22 00:26:34.113 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:35 compute-1 nova_compute[182713]: 2026-01-22 00:26:35.266 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:39 compute-1 nova_compute[182713]: 2026-01-22 00:26:39.115 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:40 compute-1 nova_compute[182713]: 2026-01-22 00:26:40.268 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:43 compute-1 podman[237177]: 2026-01-22 00:26:43.565692625 +0000 UTC m=+0.061767668 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:26:44 compute-1 nova_compute[182713]: 2026-01-22 00:26:44.118 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:45 compute-1 nova_compute[182713]: 2026-01-22 00:26:45.272 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:46 compute-1 podman[237197]: 2026-01-22 00:26:46.585083719 +0000 UTC m=+0.073450141 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Jan 22 00:26:46 compute-1 nova_compute[182713]: 2026-01-22 00:26:46.836 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:46 compute-1 nova_compute[182713]: 2026-01-22 00:26:46.836 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:46 compute-1 nova_compute[182713]: 2026-01-22 00:26:46.873 182717 DEBUG nova.compute.manager [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:26:47 compute-1 nova_compute[182713]: 2026-01-22 00:26:47.008 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:47 compute-1 nova_compute[182713]: 2026-01-22 00:26:47.009 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:47 compute-1 nova_compute[182713]: 2026-01-22 00:26:47.030 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:26:47 compute-1 nova_compute[182713]: 2026-01-22 00:26:47.031 182717 INFO nova.compute.claims [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:26:47 compute-1 nova_compute[182713]: 2026-01-22 00:26:47.640 182717 DEBUG nova.compute.provider_tree [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:26:47 compute-1 nova_compute[182713]: 2026-01-22 00:26:47.754 182717 DEBUG nova.scheduler.client.report [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.001 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.004 182717 DEBUG nova.compute.manager [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.082 182717 DEBUG nova.compute.manager [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.083 182717 DEBUG nova.network.neutron [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.127 182717 INFO nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.150 182717 DEBUG nova.compute.manager [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.284 182717 DEBUG nova.compute.manager [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.287 182717 DEBUG nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.287 182717 INFO nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Creating image(s)
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.289 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "/var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.289 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "/var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.290 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "/var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.291 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "4546bcf384626c54ce60a485a9f0fede193badcf" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.292 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "4546bcf384626c54ce60a485a9f0fede193badcf" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:48 compute-1 nova_compute[182713]: 2026-01-22 00:26:48.337 182717 DEBUG nova.policy [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:26:49 compute-1 nova_compute[182713]: 2026-01-22 00:26:49.119 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:49 compute-1 nova_compute[182713]: 2026-01-22 00:26:49.327 182717 DEBUG nova.network.neutron [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Successfully created port: 45b36889-973e-4cd7-a054-83b1843214dc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.077 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.142 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf.part --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.145 182717 DEBUG nova.virt.images [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] 1fa72c45-3744-4826-ac11-1114970a3fb7 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.146 182717 DEBUG nova.privsep.utils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.146 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf.part /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.177 182717 DEBUG nova.network.neutron [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Successfully updated port: 45b36889-973e-4cd7-a054-83b1843214dc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.209 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.210 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquired lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.210 182717 DEBUG nova.network.neutron [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.275 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.289 182717 DEBUG nova.compute.manager [req-8e9e3912-452a-41ee-b8cd-40a328945e24 req-ba6199c7-a66e-4817-be6e-b79899e1f860 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Received event network-changed-45b36889-973e-4cd7-a054-83b1843214dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.290 182717 DEBUG nova.compute.manager [req-8e9e3912-452a-41ee-b8cd-40a328945e24 req-ba6199c7-a66e-4817-be6e-b79899e1f860 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Refreshing instance network info cache due to event network-changed-45b36889-973e-4cd7-a054-83b1843214dc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.291 182717 DEBUG oslo_concurrency.lockutils [req-8e9e3912-452a-41ee-b8cd-40a328945e24 req-ba6199c7-a66e-4817-be6e-b79899e1f860 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.378 182717 DEBUG nova.network.neutron [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.535 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf.part /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf.converted" returned: 0 in 0.388s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.549 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.650 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf.converted --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.652 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "4546bcf384626c54ce60a485a9f0fede193badcf" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.674 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.743 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.746 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "4546bcf384626c54ce60a485a9f0fede193badcf" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.747 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "4546bcf384626c54ce60a485a9f0fede193badcf" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.761 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.826 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.828 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf,backing_fmt=raw /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.871 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf,backing_fmt=raw /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.872 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "4546bcf384626c54ce60a485a9f0fede193badcf" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.873 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.937 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-1 nova_compute[182713]: 2026-01-22 00:26:50.938 182717 DEBUG nova.objects.instance [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lazy-loading 'migration_context' on Instance uuid 470aa2dc-e43d-414b-8bac-208dec5bcfe2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:26:51 compute-1 nova_compute[182713]: 2026-01-22 00:26:51.010 182717 DEBUG nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:26:51 compute-1 nova_compute[182713]: 2026-01-22 00:26:51.011 182717 DEBUG nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Ensure instance console log exists: /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:26:51 compute-1 nova_compute[182713]: 2026-01-22 00:26:51.011 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:51 compute-1 nova_compute[182713]: 2026-01-22 00:26:51.011 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:51 compute-1 nova_compute[182713]: 2026-01-22 00:26:51.012 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:51 compute-1 nova_compute[182713]: 2026-01-22 00:26:51.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:52 compute-1 nova_compute[182713]: 2026-01-22 00:26:52.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:52 compute-1 nova_compute[182713]: 2026-01-22 00:26:52.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:52 compute-1 nova_compute[182713]: 2026-01-22 00:26:52.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:26:53 compute-1 nova_compute[182713]: 2026-01-22 00:26:53.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.015 182717 DEBUG nova.network.neutron [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Updating instance_info_cache with network_info: [{"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.039 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Releasing lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.039 182717 DEBUG nova.compute.manager [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Instance network_info: |[{"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.040 182717 DEBUG oslo_concurrency.lockutils [req-8e9e3912-452a-41ee-b8cd-40a328945e24 req-ba6199c7-a66e-4817-be6e-b79899e1f860 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.040 182717 DEBUG nova.network.neutron [req-8e9e3912-452a-41ee-b8cd-40a328945e24 req-ba6199c7-a66e-4817-be6e-b79899e1f860 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Refreshing network info cache for port 45b36889-973e-4cd7-a054-83b1843214dc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.046 182717 DEBUG nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Start _get_guest_xml network_info=[{"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='0c3890ac18fafec10799ed606204e906',container_format='bare',created_at=2026-01-22T00:26:36Z,direct_url=<?>,disk_format='qcow2',id=1fa72c45-3744-4826-ac11-1114970a3fb7,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-118559772',owner='c869345f15654dea91ddb775c6c3ed7d',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2026-01-22T00:26:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '1fa72c45-3744-4826-ac11-1114970a3fb7'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.053 182717 WARNING nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.059 182717 DEBUG nova.virt.libvirt.host [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.061 182717 DEBUG nova.virt.libvirt.host [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.072 182717 DEBUG nova.virt.libvirt.host [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.073 182717 DEBUG nova.virt.libvirt.host [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.075 182717 DEBUG nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.076 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='0c3890ac18fafec10799ed606204e906',container_format='bare',created_at=2026-01-22T00:26:36Z,direct_url=<?>,disk_format='qcow2',id=1fa72c45-3744-4826-ac11-1114970a3fb7,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-118559772',owner='c869345f15654dea91ddb775c6c3ed7d',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2026-01-22T00:26:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.077 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.077 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.078 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.078 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.079 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.080 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.080 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.080 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.080 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.081 182717 DEBUG nova.virt.hardware [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.086 182717 DEBUG nova.virt.libvirt.vif [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:26:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1582714945',display_name='tempest-TestSnapshotPattern-server-1582714945',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1582714945',id=165,image_ref='1fa72c45-3744-4826-ac11-1114970a3fb7',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0DY/TXOQB0M9YFyDLcqcxCKEdFgCfMpGiYt7S54G4iqWyBQXCc1Xwky+N3hTMg7xuZaO7fBEy0faktvOAVQkBCk+NHBAAtdaooYCb3c3mlb/2fG1QJ9qBFBibcnv6XRw==',key_name='tempest-TestSnapshotPattern-1957582469',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c869345f15654dea91ddb775c6c3ed7d',ramdisk_id='',reservation_id='r-sutdufti',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='ba1975bd-ca63-4cb4-afd3-fb1f077c28f0',image_min_disk='1',image_min_ram='0',image_owner_id='c869345f15654dea91ddb775c6c3ed7d',image_owner_project_name='tempest-TestSnapshotPattern-735860214',image_owner_user_name='tempest-TestSnapshotPattern-735860214-project-member',image_user_id='93f27bcf715e498cbac482f96dec39c0',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-735860214',owner_user_name='tempest-TestSnapshotPattern-735860214-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:26:48Z,user_data=None,user_id='93f27bcf715e498cbac482f96dec39c0',uuid=470aa2dc-
e43d-414b-8bac-208dec5bcfe2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.086 182717 DEBUG nova.network.os_vif_util [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converting VIF {"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.087 182717 DEBUG nova.network.os_vif_util [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:1f:ab,bridge_name='br-int',has_traffic_filtering=True,id=45b36889-973e-4cd7-a054-83b1843214dc,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b36889-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.088 182717 DEBUG nova.objects.instance [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lazy-loading 'pci_devices' on Instance uuid 470aa2dc-e43d-414b-8bac-208dec5bcfe2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.107 182717 DEBUG nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:26:54 compute-1 nova_compute[182713]:   <uuid>470aa2dc-e43d-414b-8bac-208dec5bcfe2</uuid>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   <name>instance-000000a5</name>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <nova:name>tempest-TestSnapshotPattern-server-1582714945</nova:name>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:26:54</nova:creationTime>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:26:54 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:26:54 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:26:54 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:26:54 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:26:54 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:26:54 compute-1 nova_compute[182713]:         <nova:user uuid="93f27bcf715e498cbac482f96dec39c0">tempest-TestSnapshotPattern-735860214-project-member</nova:user>
Jan 22 00:26:54 compute-1 nova_compute[182713]:         <nova:project uuid="c869345f15654dea91ddb775c6c3ed7d">tempest-TestSnapshotPattern-735860214</nova:project>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="1fa72c45-3744-4826-ac11-1114970a3fb7"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:26:54 compute-1 nova_compute[182713]:         <nova:port uuid="45b36889-973e-4cd7-a054-83b1843214dc">
Jan 22 00:26:54 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <system>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <entry name="serial">470aa2dc-e43d-414b-8bac-208dec5bcfe2</entry>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <entry name="uuid">470aa2dc-e43d-414b-8bac-208dec5bcfe2</entry>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     </system>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   <os>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   </os>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   <features>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   </features>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk.config"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:9b:1f:ab"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <target dev="tap45b36889-97"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/console.log" append="off"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <video>
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     </video>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <input type="keyboard" bus="usb"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:26:54 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:26:54 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:26:54 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:26:54 compute-1 nova_compute[182713]: </domain>
Jan 22 00:26:54 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.109 182717 DEBUG nova.compute.manager [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Preparing to wait for external event network-vif-plugged-45b36889-973e-4cd7-a054-83b1843214dc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.110 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.110 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.111 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.111 182717 DEBUG nova.virt.libvirt.vif [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:26:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1582714945',display_name='tempest-TestSnapshotPattern-server-1582714945',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1582714945',id=165,image_ref='1fa72c45-3744-4826-ac11-1114970a3fb7',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0DY/TXOQB0M9YFyDLcqcxCKEdFgCfMpGiYt7S54G4iqWyBQXCc1Xwky+N3hTMg7xuZaO7fBEy0faktvOAVQkBCk+NHBAAtdaooYCb3c3mlb/2fG1QJ9qBFBibcnv6XRw==',key_name='tempest-TestSnapshotPattern-1957582469',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c869345f15654dea91ddb775c6c3ed7d',ramdisk_id='',reservation_id='r-sutdufti',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='ba1975bd-ca63-4cb4-afd3-fb1f077c28f0',image_min_disk='1',image_min_ram='0',image_owner_id='c869345f15654dea91ddb775c6c3ed7d',image_owner_project_name='tempest-TestSnapshotPattern-735860214',image_owner_user_name='tempest-TestSnapshotPattern-735860214-project-member',image_user_id='93f27bcf715e498cbac482f96dec39c0',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-735860214',owner_user_name='tempest-TestSnapshotPattern-735860214-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:26:48Z,user_data=None,user_id='93f27bcf715e498cbac482f96dec39c0',uuid
=470aa2dc-e43d-414b-8bac-208dec5bcfe2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.112 182717 DEBUG nova.network.os_vif_util [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converting VIF {"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.113 182717 DEBUG nova.network.os_vif_util [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:1f:ab,bridge_name='br-int',has_traffic_filtering=True,id=45b36889-973e-4cd7-a054-83b1843214dc,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b36889-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.113 182717 DEBUG os_vif [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:1f:ab,bridge_name='br-int',has_traffic_filtering=True,id=45b36889-973e-4cd7-a054-83b1843214dc,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b36889-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.114 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.114 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.114 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.121 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.122 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap45b36889-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.122 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap45b36889-97, col_values=(('external_ids', {'iface-id': '45b36889-973e-4cd7-a054-83b1843214dc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:1f:ab', 'vm-uuid': '470aa2dc-e43d-414b-8bac-208dec5bcfe2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.123 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.125 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:26:54 compute-1 NetworkManager[54952]: <info>  [1769041614.1258] manager: (tap45b36889-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.130 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.131 182717 INFO os_vif [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:1f:ab,bridge_name='br-int',has_traffic_filtering=True,id=45b36889-973e-4cd7-a054-83b1843214dc,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b36889-97')
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.229 182717 DEBUG nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.230 182717 DEBUG nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.230 182717 DEBUG nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] No VIF found with MAC fa:16:3e:9b:1f:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.232 182717 INFO nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Using config drive
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.666 182717 INFO nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Creating config drive at /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk.config
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.672 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj565ex6n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.806 182717 DEBUG oslo_concurrency.processutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj565ex6n" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:54 compute-1 NetworkManager[54952]: <info>  [1769041614.8968] manager: (tap45b36889-97): new Tun device (/org/freedesktop/NetworkManager/Devices/299)
Jan 22 00:26:54 compute-1 kernel: tap45b36889-97: entered promiscuous mode
Jan 22 00:26:54 compute-1 ovn_controller[94841]: 2026-01-22T00:26:54Z|00647|binding|INFO|Claiming lport 45b36889-973e-4cd7-a054-83b1843214dc for this chassis.
Jan 22 00:26:54 compute-1 ovn_controller[94841]: 2026-01-22T00:26:54Z|00648|binding|INFO|45b36889-973e-4cd7-a054-83b1843214dc: Claiming fa:16:3e:9b:1f:ab 10.100.0.4
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.900 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.906 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.909 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.916 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:54 compute-1 nova_compute[182713]: 2026-01-22 00:26:54.918 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:54 compute-1 NetworkManager[54952]: <info>  [1769041614.9222] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Jan 22 00:26:54 compute-1 NetworkManager[54952]: <info>  [1769041614.9231] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Jan 22 00:26:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:54.923 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:1f:ab 10.100.0.4'], port_security=['fa:16:3e:9b:1f:ab 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '470aa2dc-e43d-414b-8bac-208dec5bcfe2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd99566cf-9d10-4ed9-89fe-0fedfcd05fcc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ddf0c9e-e496-4d74-b1f7-5f7f3b8a365b, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=45b36889-973e-4cd7-a054-83b1843214dc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:26:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:54.924 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 45b36889-973e-4cd7-a054-83b1843214dc in datapath ab086ee0-e007-4a86-babc-64d267c3fd5e bound to our chassis
Jan 22 00:26:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:54.926 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ab086ee0-e007-4a86-babc-64d267c3fd5e
Jan 22 00:26:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:54.944 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[33bb140d-58c3-4b02-bd34-9ec5fe7bad2b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:54.946 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapab086ee0-e1 in ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:26:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:54.948 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapab086ee0-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:26:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:54.948 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3f30513f-b383-4f6c-8149-14408582d15f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:54.950 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d71861b5-f908-4983-adc6-d8a6bfe14bf3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:54 compute-1 systemd-udevd[237263]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:26:54 compute-1 systemd-machined[153970]: New machine qemu-71-instance-000000a5.
Jan 22 00:26:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:54.966 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ee6eaf-5234-44f9-b813-2b493bbdd0d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:54 compute-1 NetworkManager[54952]: <info>  [1769041614.9737] device (tap45b36889-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:26:54 compute-1 NetworkManager[54952]: <info>  [1769041614.9746] device (tap45b36889-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.005 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e849e7-0242-455d-b232-b74a6d952560]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 systemd[1]: Started Virtual Machine qemu-71-instance-000000a5.
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.041 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[b8708aa1-c26e-4154-9412-ca7cfa4f90d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.064 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fb02d37d-d879-4ab7-8d1e-c5c3414e501f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 NetworkManager[54952]: <info>  [1769041615.0731] manager: (tapab086ee0-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/302)
Jan 22 00:26:55 compute-1 ovn_controller[94841]: 2026-01-22T00:26:55Z|00649|binding|INFO|Setting lport 45b36889-973e-4cd7-a054-83b1843214dc up in Southbound
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.105 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0008f638-6112-4a94-aafd-0559bca5f53d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.124 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a4fdc0-ce1d-41ca-8dd3-c2ad1f2f3d5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.128 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:55 compute-1 ovn_controller[94841]: 2026-01-22T00:26:55Z|00650|binding|INFO|Setting lport 45b36889-973e-4cd7-a054-83b1843214dc ovn-installed in OVS
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.136 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:55 compute-1 NetworkManager[54952]: <info>  [1769041615.1483] device (tapab086ee0-e0): carrier: link connected
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.155 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[d75eeaaa-121c-4c7a-84c9-09dc1f86f015]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.177 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fa7598a7-699c-491c-8239-058c3e94dd79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapab086ee0-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:eb:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 618780, 'reachable_time': 32282, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237295, 'error': None, 'target': 'ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.196 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9e7bffe8-9d78-4f21-9b08-27418acab7cd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1c:eb16'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 618780, 'tstamp': 618780}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237296, 'error': None, 'target': 'ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.217 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2de367-9537-44bb-8902-2749f6ac5bef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapab086ee0-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:eb:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 618780, 'reachable_time': 32282, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237297, 'error': None, 'target': 'ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.249 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[331daa66-acf9-4db1-a9e8-50fdc3879cdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.329 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[61a29435-b0af-43a9-8a79-94acc2a732c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.331 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab086ee0-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.331 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.332 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab086ee0-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.333 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:55 compute-1 NetworkManager[54952]: <info>  [1769041615.3342] manager: (tapab086ee0-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Jan 22 00:26:55 compute-1 kernel: tapab086ee0-e0: entered promiscuous mode
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.336 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.336 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapab086ee0-e0, col_values=(('external_ids', {'iface-id': '931dfb05-b9a9-4afa-88ca-d1f52289b640'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.337 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:55 compute-1 ovn_controller[94841]: 2026-01-22T00:26:55Z|00651|binding|INFO|Releasing lport 931dfb05-b9a9-4afa-88ca-d1f52289b640 from this chassis (sb_readonly=0)
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.349 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.350 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ab086ee0-e007-4a86-babc-64d267c3fd5e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ab086ee0-e007-4a86-babc-64d267c3fd5e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.351 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0b5ea1e3-a2b6-47b8-9813-2b995ec5d58d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.352 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-ab086ee0-e007-4a86-babc-64d267c3fd5e
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/ab086ee0-e007-4a86-babc-64d267c3fd5e.pid.haproxy
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID ab086ee0-e007-4a86-babc-64d267c3fd5e
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:26:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:26:55.353 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'env', 'PROCESS_TAG=haproxy-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ab086ee0-e007-4a86-babc-64d267c3fd5e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.360 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041615.3594975, 470aa2dc-e43d-414b-8bac-208dec5bcfe2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.360 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] VM Started (Lifecycle Event)
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.384 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.388 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041615.3607621, 470aa2dc-e43d-414b-8bac-208dec5bcfe2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.388 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] VM Paused (Lifecycle Event)
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.405 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.407 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.428 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:26:55 compute-1 podman[237336]: 2026-01-22 00:26:55.79248004 +0000 UTC m=+0.059851860 container create 1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 00:26:55 compute-1 systemd[1]: Started libpod-conmon-1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339.scope.
Jan 22 00:26:55 compute-1 podman[237336]: 2026-01-22 00:26:55.764927305 +0000 UTC m=+0.032299185 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:26:55 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:26:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af7af4ca93236f5e8a95600ef74ed2043abbff6b358ff5fb003c7044277913e6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.870 182717 DEBUG nova.compute.manager [req-db6da505-fc65-489d-96aa-4c05c2370221 req-a66dea82-381b-48dd-93f9-adc28c372c5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Received event network-vif-plugged-45b36889-973e-4cd7-a054-83b1843214dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.872 182717 DEBUG oslo_concurrency.lockutils [req-db6da505-fc65-489d-96aa-4c05c2370221 req-a66dea82-381b-48dd-93f9-adc28c372c5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.873 182717 DEBUG oslo_concurrency.lockutils [req-db6da505-fc65-489d-96aa-4c05c2370221 req-a66dea82-381b-48dd-93f9-adc28c372c5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.873 182717 DEBUG oslo_concurrency.lockutils [req-db6da505-fc65-489d-96aa-4c05c2370221 req-a66dea82-381b-48dd-93f9-adc28c372c5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.873 182717 DEBUG nova.compute.manager [req-db6da505-fc65-489d-96aa-4c05c2370221 req-a66dea82-381b-48dd-93f9-adc28c372c5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Processing event network-vif-plugged-45b36889-973e-4cd7-a054-83b1843214dc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.874 182717 DEBUG nova.compute.manager [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.878 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041615.878213, 470aa2dc-e43d-414b-8bac-208dec5bcfe2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.878 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] VM Resumed (Lifecycle Event)
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.882 182717 DEBUG nova.virt.libvirt.driver [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.888 182717 INFO nova.virt.libvirt.driver [-] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Instance spawned successfully.
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.889 182717 INFO nova.compute.manager [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Took 7.60 seconds to spawn the instance on the hypervisor.
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.890 182717 DEBUG nova.compute.manager [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.904 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:26:55 compute-1 podman[237336]: 2026-01-22 00:26:55.907981985 +0000 UTC m=+0.175353825 container init 1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.908 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:26:55 compute-1 podman[237336]: 2026-01-22 00:26:55.915287912 +0000 UTC m=+0.182659732 container start 1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 00:26:55 compute-1 nova_compute[182713]: 2026-01-22 00:26:55.951 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:26:55 compute-1 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237353]: [NOTICE]   (237382) : New worker (237392) forked
Jan 22 00:26:55 compute-1 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237353]: [NOTICE]   (237382) : Loading success.
Jan 22 00:26:55 compute-1 podman[237349]: 2026-01-22 00:26:55.974421148 +0000 UTC m=+0.137718287 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:26:55 compute-1 podman[237352]: 2026-01-22 00:26:55.99961889 +0000 UTC m=+0.153257449 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:26:56 compute-1 nova_compute[182713]: 2026-01-22 00:26:56.031 182717 INFO nova.compute.manager [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Took 9.08 seconds to build instance.
Jan 22 00:26:56 compute-1 nova_compute[182713]: 2026-01-22 00:26:56.068 182717 DEBUG oslo_concurrency.lockutils [None req-af792dc1-44e2-4e66-b644-3cb6ea394021 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:56 compute-1 nova_compute[182713]: 2026-01-22 00:26:56.284 182717 DEBUG nova.network.neutron [req-8e9e3912-452a-41ee-b8cd-40a328945e24 req-ba6199c7-a66e-4817-be6e-b79899e1f860 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Updated VIF entry in instance network info cache for port 45b36889-973e-4cd7-a054-83b1843214dc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:26:56 compute-1 nova_compute[182713]: 2026-01-22 00:26:56.284 182717 DEBUG nova.network.neutron [req-8e9e3912-452a-41ee-b8cd-40a328945e24 req-ba6199c7-a66e-4817-be6e-b79899e1f860 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Updating instance_info_cache with network_info: [{"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:26:56 compute-1 nova_compute[182713]: 2026-01-22 00:26:56.471 182717 DEBUG oslo_concurrency.lockutils [req-8e9e3912-452a-41ee-b8cd-40a328945e24 req-ba6199c7-a66e-4817-be6e-b79899e1f860 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.020 182717 DEBUG nova.compute.manager [req-f306989c-819a-4c38-aa03-347005cd1872 req-e81c5ba4-0901-4f9a-9dde-6039129071f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Received event network-vif-plugged-45b36889-973e-4cd7-a054-83b1843214dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.021 182717 DEBUG oslo_concurrency.lockutils [req-f306989c-819a-4c38-aa03-347005cd1872 req-e81c5ba4-0901-4f9a-9dde-6039129071f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.021 182717 DEBUG oslo_concurrency.lockutils [req-f306989c-819a-4c38-aa03-347005cd1872 req-e81c5ba4-0901-4f9a-9dde-6039129071f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.021 182717 DEBUG oslo_concurrency.lockutils [req-f306989c-819a-4c38-aa03-347005cd1872 req-e81c5ba4-0901-4f9a-9dde-6039129071f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.022 182717 DEBUG nova.compute.manager [req-f306989c-819a-4c38-aa03-347005cd1872 req-e81c5ba4-0901-4f9a-9dde-6039129071f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] No waiting events found dispatching network-vif-plugged-45b36889-973e-4cd7-a054-83b1843214dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.022 182717 WARNING nova.compute.manager [req-f306989c-819a-4c38-aa03-347005cd1872 req-e81c5ba4-0901-4f9a-9dde-6039129071f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Received unexpected event network-vif-plugged-45b36889-973e-4cd7-a054-83b1843214dc for instance with vm_state active and task_state None.
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.883 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:26:58 compute-1 nova_compute[182713]: 2026-01-22 00:26:58.967 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.053 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.055 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.113 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.122 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.302 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.303 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5539MB free_disk=73.19226455688477GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.304 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.304 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.400 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 470aa2dc-e43d-414b-8bac-208dec5bcfe2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.401 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.401 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.441 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.695 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.720 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:26:59 compute-1 nova_compute[182713]: 2026-01-22 00:26:59.720 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:01 compute-1 podman[237422]: 2026-01-22 00:27:01.572013297 +0000 UTC m=+0.065542315 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:27:01 compute-1 podman[237423]: 2026-01-22 00:27:01.597417226 +0000 UTC m=+0.090361146 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:27:02 compute-1 nova_compute[182713]: 2026-01-22 00:27:02.722 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:02 compute-1 nova_compute[182713]: 2026-01-22 00:27:02.723 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:02 compute-1 nova_compute[182713]: 2026-01-22 00:27:02.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:02 compute-1 nova_compute[182713]: 2026-01-22 00:27:02.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:27:02 compute-1 nova_compute[182713]: 2026-01-22 00:27:02.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:27:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:03.036 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:03.037 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:03.038 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:03 compute-1 nova_compute[182713]: 2026-01-22 00:27:03.069 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:27:03 compute-1 nova_compute[182713]: 2026-01-22 00:27:03.069 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:27:03 compute-1 nova_compute[182713]: 2026-01-22 00:27:03.070 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:27:03 compute-1 nova_compute[182713]: 2026-01-22 00:27:03.070 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 470aa2dc-e43d-414b-8bac-208dec5bcfe2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:27:04 compute-1 nova_compute[182713]: 2026-01-22 00:27:04.127 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:04 compute-1 nova_compute[182713]: 2026-01-22 00:27:04.234 182717 DEBUG nova.compute.manager [req-a5233041-390a-4ba8-9e80-f0b8ca0168e6 req-77bf6ef7-46d6-4875-8950-3671256b7ce9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Received event network-changed-45b36889-973e-4cd7-a054-83b1843214dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:04 compute-1 nova_compute[182713]: 2026-01-22 00:27:04.234 182717 DEBUG nova.compute.manager [req-a5233041-390a-4ba8-9e80-f0b8ca0168e6 req-77bf6ef7-46d6-4875-8950-3671256b7ce9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Refreshing instance network info cache due to event network-changed-45b36889-973e-4cd7-a054-83b1843214dc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:27:04 compute-1 nova_compute[182713]: 2026-01-22 00:27:04.235 182717 DEBUG oslo_concurrency.lockutils [req-a5233041-390a-4ba8-9e80-f0b8ca0168e6 req-77bf6ef7-46d6-4875-8950-3671256b7ce9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:27:04 compute-1 nova_compute[182713]: 2026-01-22 00:27:04.847 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Updating instance_info_cache with network_info: [{"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:27:04 compute-1 nova_compute[182713]: 2026-01-22 00:27:04.873 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:27:04 compute-1 nova_compute[182713]: 2026-01-22 00:27:04.875 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:27:04 compute-1 nova_compute[182713]: 2026-01-22 00:27:04.876 182717 DEBUG oslo_concurrency.lockutils [req-a5233041-390a-4ba8-9e80-f0b8ca0168e6 req-77bf6ef7-46d6-4875-8950-3671256b7ce9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:27:04 compute-1 nova_compute[182713]: 2026-01-22 00:27:04.877 182717 DEBUG nova.network.neutron [req-a5233041-390a-4ba8-9e80-f0b8ca0168e6 req-77bf6ef7-46d6-4875-8950-3671256b7ce9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Refreshing network info cache for port 45b36889-973e-4cd7-a054-83b1843214dc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:27:08 compute-1 nova_compute[182713]: 2026-01-22 00:27:08.225 182717 DEBUG nova.network.neutron [req-a5233041-390a-4ba8-9e80-f0b8ca0168e6 req-77bf6ef7-46d6-4875-8950-3671256b7ce9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Updated VIF entry in instance network info cache for port 45b36889-973e-4cd7-a054-83b1843214dc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:27:08 compute-1 nova_compute[182713]: 2026-01-22 00:27:08.226 182717 DEBUG nova.network.neutron [req-a5233041-390a-4ba8-9e80-f0b8ca0168e6 req-77bf6ef7-46d6-4875-8950-3671256b7ce9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Updating instance_info_cache with network_info: [{"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:27:08 compute-1 nova_compute[182713]: 2026-01-22 00:27:08.345 182717 DEBUG oslo_concurrency.lockutils [req-a5233041-390a-4ba8-9e80-f0b8ca0168e6 req-77bf6ef7-46d6-4875-8950-3671256b7ce9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:27:09 compute-1 nova_compute[182713]: 2026-01-22 00:27:09.127 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:10 compute-1 ovn_controller[94841]: 2026-01-22T00:27:10Z|00081|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.6 does not match offer 10.100.0.4
Jan 22 00:27:10 compute-1 ovn_controller[94841]: 2026-01-22T00:27:10Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:9b:1f:ab 10.100.0.4
Jan 22 00:27:14 compute-1 nova_compute[182713]: 2026-01-22 00:27:14.129 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:14 compute-1 podman[237477]: 2026-01-22 00:27:14.610660504 +0000 UTC m=+0.088776176 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:27:15 compute-1 ovn_controller[94841]: 2026-01-22T00:27:15Z|00083|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.6 does not match offer 10.100.0.4
Jan 22 00:27:15 compute-1 ovn_controller[94841]: 2026-01-22T00:27:15Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:9b:1f:ab 10.100.0.4
Jan 22 00:27:15 compute-1 ovn_controller[94841]: 2026-01-22T00:27:15Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9b:1f:ab 10.100.0.4
Jan 22 00:27:15 compute-1 ovn_controller[94841]: 2026-01-22T00:27:15Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:1f:ab 10.100.0.4
Jan 22 00:27:17 compute-1 podman[237497]: 2026-01-22 00:27:17.613328949 +0000 UTC m=+0.098945802 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:27:19 compute-1 nova_compute[182713]: 2026-01-22 00:27:19.131 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:24 compute-1 nova_compute[182713]: 2026-01-22 00:27:24.133 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:25 compute-1 nova_compute[182713]: 2026-01-22 00:27:25.965 182717 DEBUG nova.compute.manager [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.134 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.135 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.182 182717 DEBUG nova.objects.instance [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_requests' on Instance uuid 53686c08-86df-445a-b433-6a2c7c590fdb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.206 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.207 182717 INFO nova.compute.claims [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.207 182717 DEBUG nova.objects.instance [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid 53686c08-86df-445a-b433-6a2c7c590fdb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.222 182717 DEBUG nova.objects.instance [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid 53686c08-86df-445a-b433-6a2c7c590fdb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.267 182717 INFO nova.compute.resource_tracker [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating resource usage from migration f701be3c-7bb1-4932-accc-5d8672c233bc
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.268 182717 DEBUG nova.compute.resource_tracker [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Starting to track incoming migration f701be3c-7bb1-4932-accc-5d8672c233bc with flavor ff01ccba-ad51-439f-9037-926190d6dc0f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.382 182717 DEBUG nova.compute.provider_tree [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.405 182717 DEBUG nova.scheduler.client.report [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.478 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:26 compute-1 nova_compute[182713]: 2026-01-22 00:27:26.478 182717 INFO nova.compute.manager [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Migrating
Jan 22 00:27:26 compute-1 podman[237521]: 2026-01-22 00:27:26.612150064 +0000 UTC m=+0.092637227 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:27:26 compute-1 podman[237520]: 2026-01-22 00:27:26.63328522 +0000 UTC m=+0.114387662 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:27:29 compute-1 nova_compute[182713]: 2026-01-22 00:27:29.135 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:31 compute-1 sshd-session[237568]: Accepted publickey for nova from 192.168.122.100 port 46876 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:27:31 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 00:27:31 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 00:27:31 compute-1 systemd-logind[796]: New session 53 of user nova.
Jan 22 00:27:31 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 00:27:31 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 22 00:27:31 compute-1 systemd[237572]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:27:31 compute-1 systemd[237572]: Queued start job for default target Main User Target.
Jan 22 00:27:31 compute-1 systemd[237572]: Created slice User Application Slice.
Jan 22 00:27:31 compute-1 systemd[237572]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:27:31 compute-1 systemd[237572]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 00:27:31 compute-1 systemd[237572]: Reached target Paths.
Jan 22 00:27:31 compute-1 systemd[237572]: Reached target Timers.
Jan 22 00:27:31 compute-1 systemd[237572]: Starting D-Bus User Message Bus Socket...
Jan 22 00:27:31 compute-1 systemd[237572]: Starting Create User's Volatile Files and Directories...
Jan 22 00:27:31 compute-1 systemd[237572]: Listening on D-Bus User Message Bus Socket.
Jan 22 00:27:31 compute-1 systemd[237572]: Reached target Sockets.
Jan 22 00:27:31 compute-1 systemd[237572]: Finished Create User's Volatile Files and Directories.
Jan 22 00:27:31 compute-1 systemd[237572]: Reached target Basic System.
Jan 22 00:27:31 compute-1 systemd[237572]: Reached target Main User Target.
Jan 22 00:27:31 compute-1 systemd[237572]: Startup finished in 153ms.
Jan 22 00:27:31 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 22 00:27:31 compute-1 systemd[1]: Started Session 53 of User nova.
Jan 22 00:27:31 compute-1 sshd-session[237568]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:27:31 compute-1 sshd-session[237587]: Received disconnect from 192.168.122.100 port 46876:11: disconnected by user
Jan 22 00:27:31 compute-1 sshd-session[237587]: Disconnected from user nova 192.168.122.100 port 46876
Jan 22 00:27:31 compute-1 sshd-session[237568]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:27:31 compute-1 systemd[1]: session-53.scope: Deactivated successfully.
Jan 22 00:27:31 compute-1 systemd-logind[796]: Session 53 logged out. Waiting for processes to exit.
Jan 22 00:27:31 compute-1 systemd-logind[796]: Removed session 53.
Jan 22 00:27:31 compute-1 sshd-session[237589]: Accepted publickey for nova from 192.168.122.100 port 46884 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:27:31 compute-1 systemd-logind[796]: New session 55 of user nova.
Jan 22 00:27:31 compute-1 systemd[1]: Started Session 55 of User nova.
Jan 22 00:27:31 compute-1 sshd-session[237589]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:27:31 compute-1 podman[237591]: 2026-01-22 00:27:31.742297383 +0000 UTC m=+0.080839072 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 00:27:31 compute-1 podman[237593]: 2026-01-22 00:27:31.743468449 +0000 UTC m=+0.073809573 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:27:31 compute-1 sshd-session[237614]: Received disconnect from 192.168.122.100 port 46884:11: disconnected by user
Jan 22 00:27:31 compute-1 sshd-session[237614]: Disconnected from user nova 192.168.122.100 port 46884
Jan 22 00:27:31 compute-1 sshd-session[237589]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:27:31 compute-1 systemd[1]: session-55.scope: Deactivated successfully.
Jan 22 00:27:31 compute-1 systemd-logind[796]: Session 55 logged out. Waiting for processes to exit.
Jan 22 00:27:31 compute-1 systemd-logind[796]: Removed session 55.
Jan 22 00:27:34 compute-1 nova_compute[182713]: 2026-01-22 00:27:34.138 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:35 compute-1 sshd-session[237638]: Accepted publickey for nova from 192.168.122.100 port 46892 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:27:35 compute-1 systemd-logind[796]: New session 56 of user nova.
Jan 22 00:27:35 compute-1 ovn_controller[94841]: 2026-01-22T00:27:35Z|00652|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 00:27:35 compute-1 systemd[1]: Started Session 56 of User nova.
Jan 22 00:27:35 compute-1 sshd-session[237638]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:27:35 compute-1 sshd-session[237641]: Received disconnect from 192.168.122.100 port 46892:11: disconnected by user
Jan 22 00:27:35 compute-1 sshd-session[237641]: Disconnected from user nova 192.168.122.100 port 46892
Jan 22 00:27:35 compute-1 sshd-session[237638]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:27:35 compute-1 systemd[1]: session-56.scope: Deactivated successfully.
Jan 22 00:27:35 compute-1 systemd-logind[796]: Session 56 logged out. Waiting for processes to exit.
Jan 22 00:27:35 compute-1 systemd-logind[796]: Removed session 56.
Jan 22 00:27:35 compute-1 sshd-session[237643]: Accepted publickey for nova from 192.168.122.100 port 46898 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:27:35 compute-1 systemd-logind[796]: New session 57 of user nova.
Jan 22 00:27:35 compute-1 systemd[1]: Started Session 57 of User nova.
Jan 22 00:27:35 compute-1 sshd-session[237643]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:27:35 compute-1 sshd-session[237646]: Received disconnect from 192.168.122.100 port 46898:11: disconnected by user
Jan 22 00:27:35 compute-1 sshd-session[237646]: Disconnected from user nova 192.168.122.100 port 46898
Jan 22 00:27:35 compute-1 sshd-session[237643]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:27:35 compute-1 systemd[1]: session-57.scope: Deactivated successfully.
Jan 22 00:27:35 compute-1 systemd-logind[796]: Session 57 logged out. Waiting for processes to exit.
Jan 22 00:27:35 compute-1 systemd-logind[796]: Removed session 57.
Jan 22 00:27:35 compute-1 sshd-session[237648]: Accepted publickey for nova from 192.168.122.100 port 46906 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:27:35 compute-1 systemd-logind[796]: New session 58 of user nova.
Jan 22 00:27:35 compute-1 systemd[1]: Started Session 58 of User nova.
Jan 22 00:27:35 compute-1 sshd-session[237648]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:27:35 compute-1 nova_compute[182713]: 2026-01-22 00:27:35.982 182717 DEBUG nova.compute.manager [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-unplugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:35 compute-1 nova_compute[182713]: 2026-01-22 00:27:35.982 182717 DEBUG oslo_concurrency.lockutils [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:35 compute-1 nova_compute[182713]: 2026-01-22 00:27:35.983 182717 DEBUG oslo_concurrency.lockutils [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:35 compute-1 nova_compute[182713]: 2026-01-22 00:27:35.983 182717 DEBUG oslo_concurrency.lockutils [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:35 compute-1 nova_compute[182713]: 2026-01-22 00:27:35.983 182717 DEBUG nova.compute.manager [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] No waiting events found dispatching network-vif-unplugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:27:35 compute-1 nova_compute[182713]: 2026-01-22 00:27:35.984 182717 WARNING nova.compute.manager [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received unexpected event network-vif-unplugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for instance with vm_state active and task_state resize_migrating.
Jan 22 00:27:35 compute-1 sshd-session[237651]: Received disconnect from 192.168.122.100 port 46906:11: disconnected by user
Jan 22 00:27:35 compute-1 sshd-session[237651]: Disconnected from user nova 192.168.122.100 port 46906
Jan 22 00:27:35 compute-1 sshd-session[237648]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:27:35 compute-1 systemd[1]: session-58.scope: Deactivated successfully.
Jan 22 00:27:35 compute-1 systemd-logind[796]: Session 58 logged out. Waiting for processes to exit.
Jan 22 00:27:35 compute-1 systemd-logind[796]: Removed session 58.
Jan 22 00:27:37 compute-1 nova_compute[182713]: 2026-01-22 00:27:37.226 182717 INFO nova.network.neutron [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 22 00:27:37 compute-1 nova_compute[182713]: 2026-01-22 00:27:37.233 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:37.235 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:27:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:37.238 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:27:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:37.239 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:37 compute-1 nova_compute[182713]: 2026-01-22 00:27:37.617 182717 DEBUG nova.compute.manager [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:27:38 compute-1 nova_compute[182713]: 2026-01-22 00:27:38.118 182717 INFO nova.compute.manager [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] instance snapshotting
Jan 22 00:27:38 compute-1 nova_compute[182713]: 2026-01-22 00:27:38.161 182717 DEBUG nova.compute.manager [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:38 compute-1 nova_compute[182713]: 2026-01-22 00:27:38.161 182717 DEBUG oslo_concurrency.lockutils [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:38 compute-1 nova_compute[182713]: 2026-01-22 00:27:38.162 182717 DEBUG oslo_concurrency.lockutils [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:38 compute-1 nova_compute[182713]: 2026-01-22 00:27:38.162 182717 DEBUG oslo_concurrency.lockutils [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:38 compute-1 nova_compute[182713]: 2026-01-22 00:27:38.162 182717 DEBUG nova.compute.manager [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] No waiting events found dispatching network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:27:38 compute-1 nova_compute[182713]: 2026-01-22 00:27:38.163 182717 WARNING nova.compute.manager [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received unexpected event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for instance with vm_state active and task_state resize_migrated.
Jan 22 00:27:38 compute-1 nova_compute[182713]: 2026-01-22 00:27:38.582 182717 INFO nova.virt.libvirt.driver [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Beginning live snapshot process
Jan 22 00:27:39 compute-1 nova_compute[182713]: 2026-01-22 00:27:39.142 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:39 compute-1 virtqemud[182235]: invalid argument: disk vda does not have an active block job
Jan 22 00:27:39 compute-1 nova_compute[182713]: 2026-01-22 00:27:39.758 182717 DEBUG oslo_concurrency.processutils [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:39 compute-1 nova_compute[182713]: 2026-01-22 00:27:39.791 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:27:39 compute-1 nova_compute[182713]: 2026-01-22 00:27:39.792 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:27:39 compute-1 nova_compute[182713]: 2026-01-22 00:27:39.792 182717 DEBUG nova.network.neutron [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:27:39 compute-1 nova_compute[182713]: 2026-01-22 00:27:39.824 182717 DEBUG oslo_concurrency.processutils [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk --force-share --output=json -f qcow2" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:39 compute-1 nova_compute[182713]: 2026-01-22 00:27:39.825 182717 DEBUG oslo_concurrency.processutils [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:39 compute-1 nova_compute[182713]: 2026-01-22 00:27:39.882 182717 DEBUG oslo_concurrency.processutils [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2/disk --force-share --output=json -f qcow2" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:39 compute-1 nova_compute[182713]: 2026-01-22 00:27:39.899 182717 DEBUG oslo_concurrency.processutils [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:39 compute-1 nova_compute[182713]: 2026-01-22 00:27:39.954 182717 DEBUG oslo_concurrency.processutils [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:39 compute-1 nova_compute[182713]: 2026-01-22 00:27:39.956 182717 DEBUG oslo_concurrency.processutils [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpdp5nxx61/764d4b4e9021413b83c6210fb62a86bb.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:40 compute-1 nova_compute[182713]: 2026-01-22 00:27:40.005 182717 DEBUG oslo_concurrency.processutils [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpdp5nxx61/764d4b4e9021413b83c6210fb62a86bb.delta 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:40 compute-1 nova_compute[182713]: 2026-01-22 00:27:40.006 182717 INFO nova.virt.libvirt.driver [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 22 00:27:40 compute-1 nova_compute[182713]: 2026-01-22 00:27:40.067 182717 DEBUG nova.virt.libvirt.guest [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] COPY block job progress, current cursor: 0 final cursor: 1114112 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 00:27:40 compute-1 nova_compute[182713]: 2026-01-22 00:27:40.453 182717 DEBUG nova.compute.manager [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-changed-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:40 compute-1 nova_compute[182713]: 2026-01-22 00:27:40.454 182717 DEBUG nova.compute.manager [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Refreshing instance network info cache due to event network-changed-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:27:40 compute-1 nova_compute[182713]: 2026-01-22 00:27:40.454 182717 DEBUG oslo_concurrency.lockutils [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:27:40 compute-1 nova_compute[182713]: 2026-01-22 00:27:40.572 182717 DEBUG nova.virt.libvirt.guest [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] COPY block job progress, current cursor: 1114112 final cursor: 1114112 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 00:27:40 compute-1 nova_compute[182713]: 2026-01-22 00:27:40.577 182717 INFO nova.virt.libvirt.driver [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 22 00:27:40 compute-1 nova_compute[182713]: 2026-01-22 00:27:40.622 182717 DEBUG nova.privsep.utils [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 00:27:40 compute-1 nova_compute[182713]: 2026-01-22 00:27:40.623 182717 DEBUG oslo_concurrency.processutils [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpdp5nxx61/764d4b4e9021413b83c6210fb62a86bb.delta /var/lib/nova/instances/snapshots/tmpdp5nxx61/764d4b4e9021413b83c6210fb62a86bb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:41 compute-1 nova_compute[182713]: 2026-01-22 00:27:41.005 182717 DEBUG oslo_concurrency.processutils [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpdp5nxx61/764d4b4e9021413b83c6210fb62a86bb.delta /var/lib/nova/instances/snapshots/tmpdp5nxx61/764d4b4e9021413b83c6210fb62a86bb" returned: 0 in 0.382s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:41 compute-1 nova_compute[182713]: 2026-01-22 00:27:41.008 182717 INFO nova.virt.libvirt.driver [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Snapshot extracted, beginning image upload
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.332 182717 DEBUG nova.network.neutron [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating instance_info_cache with network_info: [{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.464 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.469 182717 DEBUG oslo_concurrency.lockutils [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.470 182717 DEBUG nova.network.neutron [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Refreshing network info cache for port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.678 182717 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.681 182717 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.681 182717 INFO nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Creating image(s)
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.682 182717 DEBUG nova.objects.instance [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'trusted_certs' on Instance uuid 53686c08-86df-445a-b433-6a2c7c590fdb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.701 182717 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.794 182717 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.796 182717 DEBUG nova.virt.disk.api [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.797 182717 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.856 182717 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.857 182717 DEBUG nova.virt.disk.api [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.977 182717 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.978 182717 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Ensure instance console log exists: /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.979 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.979 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.980 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.984 182717 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Start _get_guest_xml network_info=[{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1985720748", "vif_mac": "fa:16:3e:5b:20:cd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:27:42 compute-1 nova_compute[182713]: 2026-01-22 00:27:42.990 182717 WARNING nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.000 182717 DEBUG nova.virt.libvirt.host [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.000 182717 DEBUG nova.virt.libvirt.host [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.016 182717 DEBUG nova.virt.libvirt.host [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.017 182717 DEBUG nova.virt.libvirt.host [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.019 182717 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.019 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ff01ccba-ad51-439f-9037-926190d6dc0f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.020 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.020 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.020 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.020 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.021 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.021 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.022 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.022 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.022 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.022 182717 DEBUG nova.virt.hardware [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.023 182717 DEBUG nova.objects.instance [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'vcpu_model' on Instance uuid 53686c08-86df-445a-b433-6a2c7c590fdb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.305 182717 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.403 182717 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.config --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.406 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.406 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.408 182717 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.409 182717 DEBUG nova.virt.libvirt.vif [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:26:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1419654419',display_name='tempest-TestNetworkAdvancedServerOps-server-1419654419',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1419654419',id=166,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5yIuZRAnyr0fY4MG0JJtrl2YmC7LkxFjTLIDSY0MneCjEwMPb+R0i/C3i76549W+tX7/jAJYcJ/Zhm6OZjTa9donlIVfIM40NClFZ/uOy/0cpEFxgyZR4Q96O10ulXCA==',key_name='tempest-TestNetworkAdvancedServerOps-623011376',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:27:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1itpmhcp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:27:36Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=53686c08-86df-445a-b433-6a2c7c590fdb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1985720748", "vif_mac": "fa:16:3e:5b:20:cd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.410 182717 DEBUG nova.network.os_vif_util [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1985720748", "vif_mac": "fa:16:3e:5b:20:cd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.411 182717 DEBUG nova.network.os_vif_util [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.414 182717 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:27:43 compute-1 nova_compute[182713]:   <uuid>53686c08-86df-445a-b433-6a2c7c590fdb</uuid>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   <name>instance-000000a6</name>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   <memory>196608</memory>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1419654419</nova:name>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:27:42</nova:creationTime>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <nova:flavor name="m1.micro">
Jan 22 00:27:43 compute-1 nova_compute[182713]:         <nova:memory>192</nova:memory>
Jan 22 00:27:43 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:27:43 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:27:43 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:27:43 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:27:43 compute-1 nova_compute[182713]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:27:43 compute-1 nova_compute[182713]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:27:43 compute-1 nova_compute[182713]:         <nova:port uuid="ac62ef89-aec4-41c9-83dd-366bdfc1c0bd">
Jan 22 00:27:43 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <system>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <entry name="serial">53686c08-86df-445a-b433-6a2c7c590fdb</entry>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <entry name="uuid">53686c08-86df-445a-b433-6a2c7c590fdb</entry>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     </system>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   <os>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   </os>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   <features>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   </features>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.config"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:5b:20:cd"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <target dev="tapac62ef89-ae"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/console.log" append="off"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <video>
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     </video>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:27:43 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:27:43 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:27:43 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:27:43 compute-1 nova_compute[182713]: </domain>
Jan 22 00:27:43 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.417 182717 DEBUG nova.virt.libvirt.vif [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:26:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1419654419',display_name='tempest-TestNetworkAdvancedServerOps-server-1419654419',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1419654419',id=166,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5yIuZRAnyr0fY4MG0JJtrl2YmC7LkxFjTLIDSY0MneCjEwMPb+R0i/C3i76549W+tX7/jAJYcJ/Zhm6OZjTa9donlIVfIM40NClFZ/uOy/0cpEFxgyZR4Q96O10ulXCA==',key_name='tempest-TestNetworkAdvancedServerOps-623011376',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:27:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1itpmhcp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:27:36Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=53686c08-86df-445a-b433-6a2c7c590fdb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1985720748", "vif_mac": "fa:16:3e:5b:20:cd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.418 182717 DEBUG nova.network.os_vif_util [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1985720748", "vif_mac": "fa:16:3e:5b:20:cd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.419 182717 DEBUG nova.network.os_vif_util [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.419 182717 DEBUG os_vif [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.420 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.421 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.421 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.427 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.427 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac62ef89-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.428 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapac62ef89-ae, col_values=(('external_ids', {'iface-id': 'ac62ef89-aec4-41c9-83dd-366bdfc1c0bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:20:cd', 'vm-uuid': '53686c08-86df-445a-b433-6a2c7c590fdb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.429 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:43 compute-1 NetworkManager[54952]: <info>  [1769041663.4306] manager: (tapac62ef89-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.432 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.440 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.441 182717 INFO os_vif [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae')
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.534 182717 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.535 182717 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.536 182717 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No VIF found with MAC fa:16:3e:5b:20:cd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.537 182717 INFO nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Using config drive
Jan 22 00:27:43 compute-1 kernel: tapac62ef89-ae: entered promiscuous mode
Jan 22 00:27:43 compute-1 NetworkManager[54952]: <info>  [1769041663.6221] manager: (tapac62ef89-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/305)
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.623 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:43 compute-1 ovn_controller[94841]: 2026-01-22T00:27:43Z|00653|binding|INFO|Claiming lport ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for this chassis.
Jan 22 00:27:43 compute-1 ovn_controller[94841]: 2026-01-22T00:27:43Z|00654|binding|INFO|ac62ef89-aec4-41c9-83dd-366bdfc1c0bd: Claiming fa:16:3e:5b:20:cd 10.100.0.10
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.635 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:20:cd 10.100.0.10'], port_security=['fa:16:3e:5b:20:cd 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '53686c08-86df-445a-b433-6a2c7c590fdb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b1ad694-cd0e-4047-b840-b090066a26f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ddb2905c-b7d9-4e7e-b5f3-61f1bd651115', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b94cbf-fe21-425f-b9b0-192a8a6fba61, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.637 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd in datapath 3b1ad694-cd0e-4047-b840-b090066a26f4 bound to our chassis
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.639 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3b1ad694-cd0e-4047-b840-b090066a26f4
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.651 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:43 compute-1 systemd-udevd[237702]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:27:43 compute-1 ovn_controller[94841]: 2026-01-22T00:27:43Z|00655|binding|INFO|Setting lport ac62ef89-aec4-41c9-83dd-366bdfc1c0bd up in Southbound
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.654 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:43 compute-1 ovn_controller[94841]: 2026-01-22T00:27:43Z|00656|binding|INFO|Setting lport ac62ef89-aec4-41c9-83dd-366bdfc1c0bd ovn-installed in OVS
Jan 22 00:27:43 compute-1 nova_compute[182713]: 2026-01-22 00:27:43.655 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.656 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[39ba3c95-75fa-4723-b3a8-cf306c2bf2ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.657 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3b1ad694-c1 in ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.659 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3b1ad694-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.659 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fca49131-c7ec-49aa-a9cb-cedf3fcf71b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.660 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[866c3728-a847-410f-93c5-8a3732a400b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 NetworkManager[54952]: <info>  [1769041663.6677] device (tapac62ef89-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:27:43 compute-1 NetworkManager[54952]: <info>  [1769041663.6684] device (tapac62ef89-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.676 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[d564ed09-cd3b-42c4-af6f-b2769fdb0549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 systemd-machined[153970]: New machine qemu-72-instance-000000a6.
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.691 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[09f59a5d-ef63-4990-84ee-c921481d5159]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 systemd[1]: Started Virtual Machine qemu-72-instance-000000a6.
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.733 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[d3f32376-74a0-4bd8-97d1-13bb1ff45238]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 NetworkManager[54952]: <info>  [1769041663.7407] manager: (tap3b1ad694-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/306)
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.739 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c3e4de68-ba1a-4215-a3b5-a1c1263eb082]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.782 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[40a23196-c16d-471e-847e-5d29ad3425f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.786 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[baf4f3bc-69e4-491c-93e1-5ceb1d604d5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 NetworkManager[54952]: <info>  [1769041663.8134] device (tap3b1ad694-c0): carrier: link connected
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.820 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e44cde3e-771a-4cb0-b4e5-496c2edce2c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.842 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d993d8-e150-447c-81bc-5a5803b03263]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3b1ad694-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:3a:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 188], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623646, 'reachable_time': 16398, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237737, 'error': None, 'target': 'ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.859 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7770af3b-e80c-4705-a4df-9d3f73dc38dc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9c:3a17'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623646, 'tstamp': 623646}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237738, 'error': None, 'target': 'ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.887 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d7399cce-f17e-4a0b-ab25-c45e5b88c95f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3b1ad694-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:3a:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 188], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623646, 'reachable_time': 16398, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237739, 'error': None, 'target': 'ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:43 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:43.934 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5b2cae61-393e-4a73-8eac-763c91fc9ecd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:44.018 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[64a4c7b7-199d-486a-982e-12d98f5ef6e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:44.020 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b1ad694-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:44.020 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:44.021 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b1ad694-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:44 compute-1 kernel: tap3b1ad694-c0: entered promiscuous mode
Jan 22 00:27:44 compute-1 NetworkManager[54952]: <info>  [1769041664.0246] manager: (tap3b1ad694-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Jan 22 00:27:44 compute-1 nova_compute[182713]: 2026-01-22 00:27:44.022 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:44.026 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3b1ad694-c0, col_values=(('external_ids', {'iface-id': '9b691818-8ed3-4906-ae03-85650608d26c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:44 compute-1 ovn_controller[94841]: 2026-01-22T00:27:44Z|00657|binding|INFO|Releasing lport 9b691818-8ed3-4906-ae03-85650608d26c from this chassis (sb_readonly=0)
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:44.029 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3b1ad694-cd0e-4047-b840-b090066a26f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3b1ad694-cd0e-4047-b840-b090066a26f4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:44.039 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[336ac8a3-3159-4e11-8015-749032ba17be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:44.040 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-3b1ad694-cd0e-4047-b840-b090066a26f4
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/3b1ad694-cd0e-4047-b840-b090066a26f4.pid.haproxy
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 3b1ad694-cd0e-4047-b840-b090066a26f4
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:27:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:44.041 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4', 'env', 'PROCESS_TAG=haproxy-3b1ad694-cd0e-4047-b840-b090066a26f4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3b1ad694-cd0e-4047-b840-b090066a26f4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:27:44 compute-1 nova_compute[182713]: 2026-01-22 00:27:44.041 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:44 compute-1 nova_compute[182713]: 2026-01-22 00:27:44.145 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:44 compute-1 podman[237777]: 2026-01-22 00:27:44.458410375 +0000 UTC m=+0.066446974 container create 0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:27:44 compute-1 nova_compute[182713]: 2026-01-22 00:27:44.458 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041664.458096, 53686c08-86df-445a-b433-6a2c7c590fdb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:27:44 compute-1 nova_compute[182713]: 2026-01-22 00:27:44.460 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] VM Resumed (Lifecycle Event)
Jan 22 00:27:44 compute-1 nova_compute[182713]: 2026-01-22 00:27:44.464 182717 DEBUG nova.compute.manager [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:27:44 compute-1 virtqemud[182235]: argument unsupported: QEMU guest agent is not configured
Jan 22 00:27:44 compute-1 nova_compute[182713]: 2026-01-22 00:27:44.470 182717 INFO nova.virt.libvirt.driver [-] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Instance running successfully.
Jan 22 00:27:44 compute-1 nova_compute[182713]: 2026-01-22 00:27:44.473 182717 DEBUG nova.virt.libvirt.guest [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 00:27:44 compute-1 nova_compute[182713]: 2026-01-22 00:27:44.474 182717 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 22 00:27:44 compute-1 systemd[1]: Started libpod-conmon-0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626.scope.
Jan 22 00:27:44 compute-1 podman[237777]: 2026-01-22 00:27:44.420982734 +0000 UTC m=+0.029019313 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:27:44 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:27:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04ee774a16f0d0f1042fd31d17591e628a95e3cfde0cb59147b6229217b73e4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:27:44 compute-1 podman[237777]: 2026-01-22 00:27:44.568251225 +0000 UTC m=+0.176287794 container init 0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:27:44 compute-1 podman[237777]: 2026-01-22 00:27:44.574900882 +0000 UTC m=+0.182937431 container start 0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 00:27:44 compute-1 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[237793]: [NOTICE]   (237797) : New worker (237799) forked
Jan 22 00:27:44 compute-1 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[237793]: [NOTICE]   (237797) : Loading success.
Jan 22 00:27:44 compute-1 nova_compute[182713]: 2026-01-22 00:27:44.801 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:27:44 compute-1 nova_compute[182713]: 2026-01-22 00:27:44.806 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:27:45 compute-1 nova_compute[182713]: 2026-01-22 00:27:45.381 182717 DEBUG nova.compute.manager [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:45 compute-1 nova_compute[182713]: 2026-01-22 00:27:45.381 182717 DEBUG oslo_concurrency.lockutils [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:45 compute-1 nova_compute[182713]: 2026-01-22 00:27:45.382 182717 DEBUG oslo_concurrency.lockutils [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:45 compute-1 nova_compute[182713]: 2026-01-22 00:27:45.382 182717 DEBUG oslo_concurrency.lockutils [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:45 compute-1 nova_compute[182713]: 2026-01-22 00:27:45.382 182717 DEBUG nova.compute.manager [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] No waiting events found dispatching network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:27:45 compute-1 nova_compute[182713]: 2026-01-22 00:27:45.382 182717 WARNING nova.compute.manager [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received unexpected event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for instance with vm_state active and task_state resize_finish.
Jan 22 00:27:45 compute-1 nova_compute[182713]: 2026-01-22 00:27:45.415 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 00:27:45 compute-1 nova_compute[182713]: 2026-01-22 00:27:45.416 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041664.459525, 53686c08-86df-445a-b433-6a2c7c590fdb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:27:45 compute-1 nova_compute[182713]: 2026-01-22 00:27:45.416 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] VM Started (Lifecycle Event)
Jan 22 00:27:45 compute-1 podman[237809]: 2026-01-22 00:27:45.586440063 +0000 UTC m=+0.081174271 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 00:27:45 compute-1 nova_compute[182713]: 2026-01-22 00:27:45.802 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:27:45 compute-1 nova_compute[182713]: 2026-01-22 00:27:45.809 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:27:46 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 00:27:46 compute-1 systemd[237572]: Activating special unit Exit the Session...
Jan 22 00:27:46 compute-1 systemd[237572]: Stopped target Main User Target.
Jan 22 00:27:46 compute-1 systemd[237572]: Stopped target Basic System.
Jan 22 00:27:46 compute-1 systemd[237572]: Stopped target Paths.
Jan 22 00:27:46 compute-1 systemd[237572]: Stopped target Sockets.
Jan 22 00:27:46 compute-1 systemd[237572]: Stopped target Timers.
Jan 22 00:27:46 compute-1 systemd[237572]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:27:46 compute-1 systemd[237572]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 00:27:46 compute-1 systemd[237572]: Closed D-Bus User Message Bus Socket.
Jan 22 00:27:46 compute-1 systemd[237572]: Stopped Create User's Volatile Files and Directories.
Jan 22 00:27:46 compute-1 systemd[237572]: Removed slice User Application Slice.
Jan 22 00:27:46 compute-1 systemd[237572]: Reached target Shutdown.
Jan 22 00:27:46 compute-1 systemd[237572]: Finished Exit the Session.
Jan 22 00:27:46 compute-1 systemd[237572]: Reached target Exit the Session.
Jan 22 00:27:46 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 00:27:46 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 00:27:46 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 00:27:46 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 00:27:46 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 00:27:46 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 00:27:46 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 00:27:46 compute-1 nova_compute[182713]: 2026-01-22 00:27:46.363 182717 DEBUG nova.network.neutron [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updated VIF entry in instance network info cache for port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:27:46 compute-1 nova_compute[182713]: 2026-01-22 00:27:46.364 182717 DEBUG nova.network.neutron [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating instance_info_cache with network_info: [{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:27:46 compute-1 nova_compute[182713]: 2026-01-22 00:27:46.580 182717 DEBUG oslo_concurrency.lockutils [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:27:46 compute-1 nova_compute[182713]: 2026-01-22 00:27:46.811 182717 INFO nova.virt.libvirt.driver [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Snapshot image upload complete
Jan 22 00:27:46 compute-1 nova_compute[182713]: 2026-01-22 00:27:46.811 182717 INFO nova.compute.manager [None req-d029da05-1bf7-4fca-af97-ee76e5a39006 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Took 8.68 seconds to snapshot the instance on the hypervisor.
Jan 22 00:27:48 compute-1 nova_compute[182713]: 2026-01-22 00:27:48.215 182717 DEBUG nova.compute.manager [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:48 compute-1 nova_compute[182713]: 2026-01-22 00:27:48.216 182717 DEBUG oslo_concurrency.lockutils [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:48 compute-1 nova_compute[182713]: 2026-01-22 00:27:48.216 182717 DEBUG oslo_concurrency.lockutils [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:48 compute-1 nova_compute[182713]: 2026-01-22 00:27:48.216 182717 DEBUG oslo_concurrency.lockutils [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:48 compute-1 nova_compute[182713]: 2026-01-22 00:27:48.216 182717 DEBUG nova.compute.manager [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] No waiting events found dispatching network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:27:48 compute-1 nova_compute[182713]: 2026-01-22 00:27:48.217 182717 WARNING nova.compute.manager [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received unexpected event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for instance with vm_state resized and task_state None.
Jan 22 00:27:48 compute-1 nova_compute[182713]: 2026-01-22 00:27:48.432 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:48 compute-1 podman[237831]: 2026-01-22 00:27:48.633491935 +0000 UTC m=+0.067910631 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41)
Jan 22 00:27:49 compute-1 nova_compute[182713]: 2026-01-22 00:27:49.148 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:51 compute-1 nova_compute[182713]: 2026-01-22 00:27:51.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:51 compute-1 nova_compute[182713]: 2026-01-22 00:27:51.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:27:51 compute-1 nova_compute[182713]: 2026-01-22 00:27:51.872 182717 DEBUG nova.compute.manager [req-2a42c7d3-265e-40b9-9210-db8335fc270f req-3cf00650-f0bd-4127-8746-87e79eebe8c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Received event network-changed-45b36889-973e-4cd7-a054-83b1843214dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:51 compute-1 nova_compute[182713]: 2026-01-22 00:27:51.873 182717 DEBUG nova.compute.manager [req-2a42c7d3-265e-40b9-9210-db8335fc270f req-3cf00650-f0bd-4127-8746-87e79eebe8c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Refreshing instance network info cache due to event network-changed-45b36889-973e-4cd7-a054-83b1843214dc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:27:51 compute-1 nova_compute[182713]: 2026-01-22 00:27:51.874 182717 DEBUG oslo_concurrency.lockutils [req-2a42c7d3-265e-40b9-9210-db8335fc270f req-3cf00650-f0bd-4127-8746-87e79eebe8c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:27:51 compute-1 nova_compute[182713]: 2026-01-22 00:27:51.874 182717 DEBUG oslo_concurrency.lockutils [req-2a42c7d3-265e-40b9-9210-db8335fc270f req-3cf00650-f0bd-4127-8746-87e79eebe8c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:27:51 compute-1 nova_compute[182713]: 2026-01-22 00:27:51.874 182717 DEBUG nova.network.neutron [req-2a42c7d3-265e-40b9-9210-db8335fc270f req-3cf00650-f0bd-4127-8746-87e79eebe8c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Refreshing network info cache for port 45b36889-973e-4cd7-a054-83b1843214dc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:27:51 compute-1 nova_compute[182713]: 2026-01-22 00:27:51.884 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.323 182717 DEBUG oslo_concurrency.lockutils [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.324 182717 DEBUG oslo_concurrency.lockutils [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.325 182717 DEBUG oslo_concurrency.lockutils [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.327 182717 DEBUG oslo_concurrency.lockutils [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.327 182717 DEBUG oslo_concurrency.lockutils [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.344 182717 INFO nova.compute.manager [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Terminating instance
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.361 182717 DEBUG nova.compute.manager [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:27:52 compute-1 kernel: tap45b36889-97 (unregistering): left promiscuous mode
Jan 22 00:27:52 compute-1 NetworkManager[54952]: <info>  [1769041672.6143] device (tap45b36889-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.616 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:52 compute-1 ovn_controller[94841]: 2026-01-22T00:27:52Z|00658|binding|INFO|Releasing lport 45b36889-973e-4cd7-a054-83b1843214dc from this chassis (sb_readonly=0)
Jan 22 00:27:52 compute-1 ovn_controller[94841]: 2026-01-22T00:27:52Z|00659|binding|INFO|Setting lport 45b36889-973e-4cd7-a054-83b1843214dc down in Southbound
Jan 22 00:27:52 compute-1 ovn_controller[94841]: 2026-01-22T00:27:52Z|00660|binding|INFO|Removing iface tap45b36889-97 ovn-installed in OVS
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.619 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:52 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:52.628 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:1f:ab 10.100.0.4'], port_security=['fa:16:3e:9b:1f:ab 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '470aa2dc-e43d-414b-8bac-208dec5bcfe2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd99566cf-9d10-4ed9-89fe-0fedfcd05fcc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ddf0c9e-e496-4d74-b1f7-5f7f3b8a365b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=45b36889-973e-4cd7-a054-83b1843214dc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:27:52 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:52.630 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 45b36889-973e-4cd7-a054-83b1843214dc in datapath ab086ee0-e007-4a86-babc-64d267c3fd5e unbound from our chassis
Jan 22 00:27:52 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:52.631 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ab086ee0-e007-4a86-babc-64d267c3fd5e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:27:52 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:52.632 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[90dd96f6-9af7-416d-99d8-42f9b0e31d74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:52 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:52.633 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e namespace which is not needed anymore
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.633 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:52 compute-1 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Jan 22 00:27:52 compute-1 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d000000a5.scope: Consumed 14.853s CPU time.
Jan 22 00:27:52 compute-1 systemd-machined[153970]: Machine qemu-71-instance-000000a5 terminated.
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.784 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.795 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.832 182717 INFO nova.virt.libvirt.driver [-] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Instance destroyed successfully.
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.833 182717 DEBUG nova.objects.instance [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lazy-loading 'resources' on Instance uuid 470aa2dc-e43d-414b-8bac-208dec5bcfe2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.849 182717 DEBUG nova.virt.libvirt.vif [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:26:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1582714945',display_name='tempest-TestSnapshotPattern-server-1582714945',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1582714945',id=165,image_ref='1fa72c45-3744-4826-ac11-1114970a3fb7',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0DY/TXOQB0M9YFyDLcqcxCKEdFgCfMpGiYt7S54G4iqWyBQXCc1Xwky+N3hTMg7xuZaO7fBEy0faktvOAVQkBCk+NHBAAtdaooYCb3c3mlb/2fG1QJ9qBFBibcnv6XRw==',key_name='tempest-TestSnapshotPattern-1957582469',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:26:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c869345f15654dea91ddb775c6c3ed7d',ramdisk_id='',reservation_id='r-sutdufti',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='ba1975bd-ca63-4cb4-afd3-fb1f077c28f0',image_min_disk='1',image_min_ram='0',image_owner_id='c869345f15654dea91ddb775c6c3ed7d',image_owner_project_name='tempest-TestSnapshotPattern-735860214',image_owner_user_name='tempest-TestSnapshotPattern-735860214-project-member',image_user_id='93f27bcf715e498cbac482f96dec39c0',image_version='8.0',owner_project_name='tempest-TestSnapshotPattern-735860214',owner_user_name='tempest-TestSnapshotPattern-735860214-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:27:46Z,user_data=None,user_id='93f27bcf715e498cbac482f96dec39c0',uuid=470aa2dc-e43d-414b-8bac-208dec5bcfe2,vcpu_model=
<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.849 182717 DEBUG nova.network.os_vif_util [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converting VIF {"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.850 182717 DEBUG nova.network.os_vif_util [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9b:1f:ab,bridge_name='br-int',has_traffic_filtering=True,id=45b36889-973e-4cd7-a054-83b1843214dc,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b36889-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.851 182717 DEBUG os_vif [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:1f:ab,bridge_name='br-int',has_traffic_filtering=True,id=45b36889-973e-4cd7-a054-83b1843214dc,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b36889-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.853 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.854 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45b36889-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.903 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.905 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.909 182717 INFO os_vif [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:1f:ab,bridge_name='br-int',has_traffic_filtering=True,id=45b36889-973e-4cd7-a054-83b1843214dc,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b36889-97')
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.910 182717 INFO nova.virt.libvirt.driver [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Deleting instance files /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2_del
Jan 22 00:27:52 compute-1 nova_compute[182713]: 2026-01-22 00:27:52.910 182717 INFO nova.virt.libvirt.driver [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Deletion of /var/lib/nova/instances/470aa2dc-e43d-414b-8bac-208dec5bcfe2_del complete
Jan 22 00:27:52 compute-1 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237353]: [NOTICE]   (237382) : haproxy version is 2.8.14-c23fe91
Jan 22 00:27:52 compute-1 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237353]: [NOTICE]   (237382) : path to executable is /usr/sbin/haproxy
Jan 22 00:27:52 compute-1 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237353]: [WARNING]  (237382) : Exiting Master process...
Jan 22 00:27:52 compute-1 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237353]: [ALERT]    (237382) : Current worker (237392) exited with code 143 (Terminated)
Jan 22 00:27:52 compute-1 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237353]: [WARNING]  (237382) : All workers exited. Exiting... (0)
Jan 22 00:27:52 compute-1 systemd[1]: libpod-1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339.scope: Deactivated successfully.
Jan 22 00:27:52 compute-1 podman[237874]: 2026-01-22 00:27:52.970009596 +0000 UTC m=+0.231759185 container died 1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 00:27:53 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339-userdata-shm.mount: Deactivated successfully.
Jan 22 00:27:53 compute-1 systemd[1]: var-lib-containers-storage-overlay-af7af4ca93236f5e8a95600ef74ed2043abbff6b358ff5fb003c7044277913e6-merged.mount: Deactivated successfully.
Jan 22 00:27:53 compute-1 podman[237874]: 2026-01-22 00:27:53.024991142 +0000 UTC m=+0.286740711 container cleanup 1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:27:53 compute-1 systemd[1]: libpod-conmon-1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339.scope: Deactivated successfully.
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.042 182717 INFO nova.compute.manager [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Took 0.68 seconds to destroy the instance on the hypervisor.
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.042 182717 DEBUG oslo.service.loopingcall [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.046 182717 DEBUG nova.compute.manager [-] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.046 182717 DEBUG nova.network.neutron [-] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.205 182717 DEBUG nova.compute.manager [req-63edb4cf-3da8-42cb-9998-d78282a1295e req-a4f24e60-c5e7-4676-91b0-568beb344e29 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Received event network-vif-unplugged-45b36889-973e-4cd7-a054-83b1843214dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.205 182717 DEBUG oslo_concurrency.lockutils [req-63edb4cf-3da8-42cb-9998-d78282a1295e req-a4f24e60-c5e7-4676-91b0-568beb344e29 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.206 182717 DEBUG oslo_concurrency.lockutils [req-63edb4cf-3da8-42cb-9998-d78282a1295e req-a4f24e60-c5e7-4676-91b0-568beb344e29 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.207 182717 DEBUG oslo_concurrency.lockutils [req-63edb4cf-3da8-42cb-9998-d78282a1295e req-a4f24e60-c5e7-4676-91b0-568beb344e29 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.208 182717 DEBUG nova.compute.manager [req-63edb4cf-3da8-42cb-9998-d78282a1295e req-a4f24e60-c5e7-4676-91b0-568beb344e29 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] No waiting events found dispatching network-vif-unplugged-45b36889-973e-4cd7-a054-83b1843214dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.210 182717 DEBUG nova.compute.manager [req-63edb4cf-3da8-42cb-9998-d78282a1295e req-a4f24e60-c5e7-4676-91b0-568beb344e29 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Received event network-vif-unplugged-45b36889-973e-4cd7-a054-83b1843214dc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:27:53 compute-1 podman[237920]: 2026-01-22 00:27:53.224690912 +0000 UTC m=+0.170283027 container remove 1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 00:27:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:53.232 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6b119f87-a88e-4d06-a847-e8e10893ff0e]: (4, ('Thu Jan 22 12:27:52 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e (1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339)\n1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339\nThu Jan 22 12:27:53 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e (1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339)\n1b5d788651856846096961b925f558ee3ca2ab0f9cdf62791fee58cb966b8339\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:53.234 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[12ebd1ee-1322-4e5f-8a70-ffa22b21f061]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:53.235 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab086ee0-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:53 compute-1 kernel: tapab086ee0-e0: left promiscuous mode
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.237 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.257 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:53.309 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[000a9804-c3f0-4a65-9e32-02f912220274]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:53.325 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac4d8aa-7e81-431e-bd1f-cf7245331355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:53.326 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe0387e-fe07-45c4-83c5-1071d327c425]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:53.344 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a4569984-cb20-4ddf-89ed-d682f0b49201]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 618768, 'reachable_time': 25124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237935, 'error': None, 'target': 'ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:53 compute-1 systemd[1]: run-netns-ovnmeta\x2dab086ee0\x2de007\x2d4a86\x2dbabc\x2d64d267c3fd5e.mount: Deactivated successfully.
Jan 22 00:27:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:53.348 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:27:53 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:27:53.348 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[9ab40705-44b1-45ea-b675-25e97c754c99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.885 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.885 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:53 compute-1 nova_compute[182713]: 2026-01-22 00:27:53.886 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:27:54 compute-1 nova_compute[182713]: 2026-01-22 00:27:54.151 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:54 compute-1 nova_compute[182713]: 2026-01-22 00:27:54.850 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:54 compute-1 nova_compute[182713]: 2026-01-22 00:27:54.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:55 compute-1 nova_compute[182713]: 2026-01-22 00:27:55.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.129 182717 DEBUG nova.compute.manager [req-2f0a6013-1be9-4640-820a-9cc882e5caf1 req-6c95b475-2b82-487e-9024-6ad3b551877f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Received event network-vif-plugged-45b36889-973e-4cd7-a054-83b1843214dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.130 182717 DEBUG oslo_concurrency.lockutils [req-2f0a6013-1be9-4640-820a-9cc882e5caf1 req-6c95b475-2b82-487e-9024-6ad3b551877f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.130 182717 DEBUG oslo_concurrency.lockutils [req-2f0a6013-1be9-4640-820a-9cc882e5caf1 req-6c95b475-2b82-487e-9024-6ad3b551877f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.131 182717 DEBUG oslo_concurrency.lockutils [req-2f0a6013-1be9-4640-820a-9cc882e5caf1 req-6c95b475-2b82-487e-9024-6ad3b551877f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.131 182717 DEBUG nova.compute.manager [req-2f0a6013-1be9-4640-820a-9cc882e5caf1 req-6c95b475-2b82-487e-9024-6ad3b551877f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] No waiting events found dispatching network-vif-plugged-45b36889-973e-4cd7-a054-83b1843214dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.131 182717 WARNING nova.compute.manager [req-2f0a6013-1be9-4640-820a-9cc882e5caf1 req-6c95b475-2b82-487e-9024-6ad3b551877f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Received unexpected event network-vif-plugged-45b36889-973e-4cd7-a054-83b1843214dc for instance with vm_state active and task_state deleting.
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.261 182717 DEBUG nova.network.neutron [-] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.295 182717 INFO nova.compute.manager [-] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Took 3.25 seconds to deallocate network for instance.
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.408 182717 DEBUG nova.compute.manager [req-59eb5815-79a1-4d38-b8ce-8e70d7bd1154 req-364f57ff-56c9-46a8-8a09-febad0f59bd8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Received event network-vif-deleted-45b36889-973e-4cd7-a054-83b1843214dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.409 182717 DEBUG nova.network.neutron [req-2a42c7d3-265e-40b9-9210-db8335fc270f req-3cf00650-f0bd-4127-8746-87e79eebe8c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Updated VIF entry in instance network info cache for port 45b36889-973e-4cd7-a054-83b1843214dc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.409 182717 DEBUG nova.network.neutron [req-2a42c7d3-265e-40b9-9210-db8335fc270f req-3cf00650-f0bd-4127-8746-87e79eebe8c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Updating instance_info_cache with network_info: [{"id": "45b36889-973e-4cd7-a054-83b1843214dc", "address": "fa:16:3e:9b:1f:ab", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b36889-97", "ovs_interfaceid": "45b36889-973e-4cd7-a054-83b1843214dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:27:56 compute-1 ovn_controller[94841]: 2026-01-22T00:27:56Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:20:cd 10.100.0.10
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.447 182717 DEBUG oslo_concurrency.lockutils [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.447 182717 DEBUG oslo_concurrency.lockutils [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.448 182717 DEBUG oslo_concurrency.lockutils [req-2a42c7d3-265e-40b9-9210-db8335fc270f req-3cf00650-f0bd-4127-8746-87e79eebe8c9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-470aa2dc-e43d-414b-8bac-208dec5bcfe2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.600 182717 DEBUG nova.compute.provider_tree [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.632 182717 DEBUG nova.scheduler.client.report [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.677 182717 DEBUG oslo_concurrency.lockutils [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.748 182717 INFO nova.scheduler.client.report [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Deleted allocations for instance 470aa2dc-e43d-414b-8bac-208dec5bcfe2
Jan 22 00:27:56 compute-1 nova_compute[182713]: 2026-01-22 00:27:56.895 182717 DEBUG oslo_concurrency.lockutils [None req-89b1ff45-09c1-4b56-ac68-26c722e0fc2c 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "470aa2dc-e43d-414b-8bac-208dec5bcfe2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.571s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:57 compute-1 podman[237948]: 2026-01-22 00:27:57.583460584 +0000 UTC m=+0.070578702 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:27:57 compute-1 podman[237947]: 2026-01-22 00:27:57.629986219 +0000 UTC m=+0.117453828 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:27:57 compute-1 nova_compute[182713]: 2026-01-22 00:27:57.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:57 compute-1 nova_compute[182713]: 2026-01-22 00:27:57.904 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:59 compute-1 nova_compute[182713]: 2026-01-22 00:27:59.154 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:00 compute-1 nova_compute[182713]: 2026-01-22 00:28:00.885 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:00 compute-1 nova_compute[182713]: 2026-01-22 00:28:00.886 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:00 compute-1 nova_compute[182713]: 2026-01-22 00:28:00.945 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:00 compute-1 nova_compute[182713]: 2026-01-22 00:28:00.945 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:00 compute-1 nova_compute[182713]: 2026-01-22 00:28:00.945 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:00 compute-1 nova_compute[182713]: 2026-01-22 00:28:00.946 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.050 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.105 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.106 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.172 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.370 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.371 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5453MB free_disk=73.16451644897461GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.372 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.372 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.478 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 53686c08-86df-445a-b433-6a2c7c590fdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.478 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.479 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.573 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.618 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.682 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.683 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:01 compute-1 nova_compute[182713]: 2026-01-22 00:28:01.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:28:02 compute-1 podman[238003]: 2026-01-22 00:28:02.56112769 +0000 UTC m=+0.054308427 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:28:02 compute-1 podman[238002]: 2026-01-22 00:28:02.56274847 +0000 UTC m=+0.058770645 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:28:02 compute-1 nova_compute[182713]: 2026-01-22 00:28:02.723 182717 INFO nova.compute.manager [None req-83bea23c-0fdc-45d2-a509-7d68327a90c7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Get console output
Jan 22 00:28:02 compute-1 nova_compute[182713]: 2026-01-22 00:28:02.731 211417 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:28:02 compute-1 nova_compute[182713]: 2026-01-22 00:28:02.914 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:02 compute-1 nova_compute[182713]: 2026-01-22 00:28:02.914 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:28:02 compute-1 nova_compute[182713]: 2026-01-22 00:28:02.915 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:28:02 compute-1 nova_compute[182713]: 2026-01-22 00:28:02.945 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:03.037 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:03.038 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:03.040 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:03 compute-1 nova_compute[182713]: 2026-01-22 00:28:03.626 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:28:03 compute-1 nova_compute[182713]: 2026-01-22 00:28:03.627 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:28:03 compute-1 nova_compute[182713]: 2026-01-22 00:28:03.627 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:28:03 compute-1 nova_compute[182713]: 2026-01-22 00:28:03.627 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 53686c08-86df-445a-b433-6a2c7c590fdb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:28:04 compute-1 nova_compute[182713]: 2026-01-22 00:28:04.156 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:05 compute-1 nova_compute[182713]: 2026-01-22 00:28:05.911 182717 DEBUG nova.compute.manager [req-c0e8c13e-8bef-49a0-81fd-880fbc3f5a4c req-444feb0d-7391-456b-977d-f1c0def3b7f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-changed-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:28:05 compute-1 nova_compute[182713]: 2026-01-22 00:28:05.912 182717 DEBUG nova.compute.manager [req-c0e8c13e-8bef-49a0-81fd-880fbc3f5a4c req-444feb0d-7391-456b-977d-f1c0def3b7f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Refreshing instance network info cache due to event network-changed-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:28:05 compute-1 nova_compute[182713]: 2026-01-22 00:28:05.912 182717 DEBUG oslo_concurrency.lockutils [req-c0e8c13e-8bef-49a0-81fd-880fbc3f5a4c req-444feb0d-7391-456b-977d-f1c0def3b7f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.232 182717 DEBUG oslo_concurrency.lockutils [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.233 182717 DEBUG oslo_concurrency.lockutils [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.233 182717 DEBUG oslo_concurrency.lockutils [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.234 182717 DEBUG oslo_concurrency.lockutils [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.234 182717 DEBUG oslo_concurrency.lockutils [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.252 182717 INFO nova.compute.manager [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Terminating instance
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.267 182717 DEBUG nova.compute.manager [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:28:06 compute-1 kernel: tapac62ef89-ae (unregistering): left promiscuous mode
Jan 22 00:28:06 compute-1 NetworkManager[54952]: <info>  [1769041686.2902] device (tapac62ef89-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:28:06 compute-1 ovn_controller[94841]: 2026-01-22T00:28:06Z|00661|binding|INFO|Releasing lport ac62ef89-aec4-41c9-83dd-366bdfc1c0bd from this chassis (sb_readonly=0)
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.292 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:06 compute-1 ovn_controller[94841]: 2026-01-22T00:28:06Z|00662|binding|INFO|Setting lport ac62ef89-aec4-41c9-83dd-366bdfc1c0bd down in Southbound
Jan 22 00:28:06 compute-1 ovn_controller[94841]: 2026-01-22T00:28:06Z|00663|binding|INFO|Removing iface tapac62ef89-ae ovn-installed in OVS
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.296 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.311 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.315 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:20:cd 10.100.0.10'], port_security=['fa:16:3e:5b:20:cd 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '53686c08-86df-445a-b433-6a2c7c590fdb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b1ad694-cd0e-4047-b840-b090066a26f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ddb2905c-b7d9-4e7e-b5f3-61f1bd651115', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b94cbf-fe21-425f-b9b0-192a8a6fba61, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.317 104184 INFO neutron.agent.ovn.metadata.agent [-] Port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd in datapath 3b1ad694-cd0e-4047-b840-b090066a26f4 unbound from our chassis
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.319 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3b1ad694-cd0e-4047-b840-b090066a26f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.320 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[363701be-a7d8-45a6-9c28-993a59d4ee67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.321 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4 namespace which is not needed anymore
Jan 22 00:28:06 compute-1 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d000000a6.scope: Deactivated successfully.
Jan 22 00:28:06 compute-1 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d000000a6.scope: Consumed 13.460s CPU time.
Jan 22 00:28:06 compute-1 systemd-machined[153970]: Machine qemu-72-instance-000000a6 terminated.
Jan 22 00:28:06 compute-1 kernel: tapac62ef89-ae: entered promiscuous mode
Jan 22 00:28:06 compute-1 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[237793]: [NOTICE]   (237797) : haproxy version is 2.8.14-c23fe91
Jan 22 00:28:06 compute-1 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[237793]: [NOTICE]   (237797) : path to executable is /usr/sbin/haproxy
Jan 22 00:28:06 compute-1 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[237793]: [WARNING]  (237797) : Exiting Master process...
Jan 22 00:28:06 compute-1 kernel: tapac62ef89-ae (unregistering): left promiscuous mode
Jan 22 00:28:06 compute-1 NetworkManager[54952]: <info>  [1769041686.4971] manager: (tapac62ef89-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/308)
Jan 22 00:28:06 compute-1 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[237793]: [ALERT]    (237797) : Current worker (237799) exited with code 143 (Terminated)
Jan 22 00:28:06 compute-1 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[237793]: [WARNING]  (237797) : All workers exited. Exiting... (0)
Jan 22 00:28:06 compute-1 systemd[1]: libpod-0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626.scope: Deactivated successfully.
Jan 22 00:28:06 compute-1 podman[238069]: 2026-01-22 00:28:06.503048362 +0000 UTC m=+0.057827276 container died 0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.505 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:06 compute-1 systemd[1]: var-lib-containers-storage-overlay-04ee774a16f0d0f1042fd31d17591e628a95e3cfde0cb59147b6229217b73e4a-merged.mount: Deactivated successfully.
Jan 22 00:28:06 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626-userdata-shm.mount: Deactivated successfully.
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.544 182717 INFO nova.virt.libvirt.driver [-] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Instance destroyed successfully.
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.545 182717 DEBUG nova.objects.instance [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid 53686c08-86df-445a-b433-6a2c7c590fdb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:28:06 compute-1 podman[238069]: 2026-01-22 00:28:06.545654084 +0000 UTC m=+0.100432988 container cleanup 0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:28:06 compute-1 systemd[1]: libpod-conmon-0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626.scope: Deactivated successfully.
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.572 182717 DEBUG nova.virt.libvirt.vif [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:26:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1419654419',display_name='tempest-TestNetworkAdvancedServerOps-server-1419654419',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1419654419',id=166,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5yIuZRAnyr0fY4MG0JJtrl2YmC7LkxFjTLIDSY0MneCjEwMPb+R0i/C3i76549W+tX7/jAJYcJ/Zhm6OZjTa9donlIVfIM40NClFZ/uOy/0cpEFxgyZR4Q96O10ulXCA==',key_name='tempest-TestNetworkAdvancedServerOps-623011376',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:27:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1itpmhcp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:27:56Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=53686c08-86df-445a-b433-6a2c7c590fdb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.572 182717 DEBUG nova.network.os_vif_util [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.573 182717 DEBUG nova.network.os_vif_util [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.573 182717 DEBUG os_vif [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.575 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.575 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac62ef89-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.577 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.580 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.582 182717 INFO os_vif [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae')
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.583 182717 INFO nova.virt.libvirt.driver [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Deleting instance files /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb_del
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.590 182717 INFO nova.virt.libvirt.driver [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Deletion of /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb_del complete
Jan 22 00:28:06 compute-1 podman[238113]: 2026-01-22 00:28:06.613576183 +0000 UTC m=+0.049415095 container remove 0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.623 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6d1184cd-aa6c-4318-8928-878bbc67331c]: (4, ('Thu Jan 22 12:28:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4 (0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626)\n0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626\nThu Jan 22 12:28:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4 (0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626)\n0e76741c6cac05e70e13d84762effc9a7d28f208ff3a1cf5b04fcb0ed000a626\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.625 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d496a06f-ec84-474a-8bdc-7e8008820d18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.626 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b1ad694-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.629 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:06 compute-1 kernel: tap3b1ad694-c0: left promiscuous mode
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.647 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.650 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1b81cc03-e1d2-474c-95dc-df3843f21a7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.664 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d9a025cb-16c7-4a11-8c0f-b68d5e522882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.665 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[20bed6ca-c8f5-4664-b10d-b404b30844a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.689 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b096a5e1-fa15-4c64-bbb2-4cb24d5a0718]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623637, 'reachable_time': 30502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238130, 'error': None, 'target': 'ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:06 compute-1 systemd[1]: run-netns-ovnmeta\x2d3b1ad694\x2dcd0e\x2d4047\x2db840\x2db090066a26f4.mount: Deactivated successfully.
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.695 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:28:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:06.696 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[94a11a54-71a8-4a6a-a196-06cbffec088f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.702 182717 INFO nova.compute.manager [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.703 182717 DEBUG oslo.service.loopingcall [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.703 182717 DEBUG nova.compute.manager [-] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:28:06 compute-1 nova_compute[182713]: 2026-01-22 00:28:06.703 182717 DEBUG nova.network.neutron [-] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.516 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating instance_info_cache with network_info: [{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.548 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.548 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.548 182717 DEBUG oslo_concurrency.lockutils [req-c0e8c13e-8bef-49a0-81fd-880fbc3f5a4c req-444feb0d-7391-456b-977d-f1c0def3b7f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.549 182717 DEBUG nova.network.neutron [req-c0e8c13e-8bef-49a0-81fd-880fbc3f5a4c req-444feb0d-7391-456b-977d-f1c0def3b7f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Refreshing network info cache for port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.549 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.830 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041672.829701, 470aa2dc-e43d-414b-8bac-208dec5bcfe2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.830 182717 INFO nova.compute.manager [-] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] VM Stopped (Lifecycle Event)
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.833 182717 INFO nova.network.neutron [req-c0e8c13e-8bef-49a0-81fd-880fbc3f5a4c req-444feb0d-7391-456b-977d-f1c0def3b7f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.835 182717 DEBUG nova.network.neutron [req-c0e8c13e-8bef-49a0-81fd-880fbc3f5a4c req-444feb0d-7391-456b-977d-f1c0def3b7f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.879 182717 DEBUG nova.compute.manager [None req-cb44ff5f-f5bf-4dca-b499-bc467e346254 - - - - - -] [instance: 470aa2dc-e43d-414b-8bac-208dec5bcfe2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.885 182717 DEBUG oslo_concurrency.lockutils [req-c0e8c13e-8bef-49a0-81fd-880fbc3f5a4c req-444feb0d-7391-456b-977d-f1c0def3b7f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.939 182717 DEBUG nova.network.neutron [-] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:28:07 compute-1 nova_compute[182713]: 2026-01-22 00:28:07.956 182717 INFO nova.compute.manager [-] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Took 1.25 seconds to deallocate network for instance.
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.153 182717 DEBUG nova.compute.manager [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-unplugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.154 182717 DEBUG oslo_concurrency.lockutils [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.154 182717 DEBUG oslo_concurrency.lockutils [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.155 182717 DEBUG oslo_concurrency.lockutils [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.155 182717 DEBUG nova.compute.manager [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] No waiting events found dispatching network-vif-unplugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.155 182717 WARNING nova.compute.manager [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received unexpected event network-vif-unplugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for instance with vm_state deleted and task_state None.
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.155 182717 DEBUG nova.compute.manager [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.156 182717 DEBUG oslo_concurrency.lockutils [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.156 182717 DEBUG oslo_concurrency.lockutils [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.156 182717 DEBUG oslo_concurrency.lockutils [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.157 182717 DEBUG nova.compute.manager [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] No waiting events found dispatching network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.157 182717 WARNING nova.compute.manager [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received unexpected event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for instance with vm_state deleted and task_state None.
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.157 182717 DEBUG nova.compute.manager [req-3dc047f2-bc82-4f00-afeb-09a7b1906105 req-f694a981-5adc-4273-a045-56a5766b925f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-deleted-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.182 182717 DEBUG oslo_concurrency.lockutils [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.183 182717 DEBUG oslo_concurrency.lockutils [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.264 182717 DEBUG nova.compute.provider_tree [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.286 182717 DEBUG nova.scheduler.client.report [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.315 182717 DEBUG oslo_concurrency.lockutils [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.363 182717 INFO nova.scheduler.client.report [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Deleted allocations for instance 53686c08-86df-445a-b433-6a2c7c590fdb
Jan 22 00:28:08 compute-1 nova_compute[182713]: 2026-01-22 00:28:08.541 182717 DEBUG oslo_concurrency.lockutils [None req-50d8132e-7b21-4c72-8620-2ef3c67abd13 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:09 compute-1 nova_compute[182713]: 2026-01-22 00:28:09.158 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:11 compute-1 nova_compute[182713]: 2026-01-22 00:28:11.578 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:11 compute-1 nova_compute[182713]: 2026-01-22 00:28:11.701 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:11 compute-1 nova_compute[182713]: 2026-01-22 00:28:11.852 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:14 compute-1 nova_compute[182713]: 2026-01-22 00:28:14.161 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:16 compute-1 nova_compute[182713]: 2026-01-22 00:28:16.581 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:16 compute-1 podman[238132]: 2026-01-22 00:28:16.634569091 +0000 UTC m=+0.114112364 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:28:19 compute-1 nova_compute[182713]: 2026-01-22 00:28:19.162 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:19 compute-1 podman[238153]: 2026-01-22 00:28:19.597649286 +0000 UTC m=+0.084171654 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Jan 22 00:28:21 compute-1 nova_compute[182713]: 2026-01-22 00:28:21.544 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041686.5432754, 53686c08-86df-445a-b433-6a2c7c590fdb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:28:21 compute-1 nova_compute[182713]: 2026-01-22 00:28:21.544 182717 INFO nova.compute.manager [-] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] VM Stopped (Lifecycle Event)
Jan 22 00:28:21 compute-1 nova_compute[182713]: 2026-01-22 00:28:21.568 182717 DEBUG nova.compute.manager [None req-ecd331fe-211c-4646-8076-223cd4407854 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:28:21 compute-1 nova_compute[182713]: 2026-01-22 00:28:21.583 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.890 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.890 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.890 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.890 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.890 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.890 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.890 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:28:22.891 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:24 compute-1 nova_compute[182713]: 2026-01-22 00:28:24.175 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:26 compute-1 nova_compute[182713]: 2026-01-22 00:28:26.584 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:28 compute-1 podman[238175]: 2026-01-22 00:28:28.575810791 +0000 UTC m=+0.066725922 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:28:28 compute-1 podman[238174]: 2026-01-22 00:28:28.623901564 +0000 UTC m=+0.110847583 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:28:29 compute-1 nova_compute[182713]: 2026-01-22 00:28:29.177 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:31 compute-1 nova_compute[182713]: 2026-01-22 00:28:31.588 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:33 compute-1 podman[238220]: 2026-01-22 00:28:33.550046729 +0000 UTC m=+0.044006427 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 22 00:28:33 compute-1 podman[238221]: 2026-01-22 00:28:33.558578194 +0000 UTC m=+0.048719124 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:28:34 compute-1 nova_compute[182713]: 2026-01-22 00:28:34.177 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:36 compute-1 nova_compute[182713]: 2026-01-22 00:28:36.590 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:38 compute-1 nova_compute[182713]: 2026-01-22 00:28:38.256 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:38.256 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:28:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:38.258 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:28:39 compute-1 nova_compute[182713]: 2026-01-22 00:28:39.179 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:41 compute-1 nova_compute[182713]: 2026-01-22 00:28:41.630 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:44 compute-1 nova_compute[182713]: 2026-01-22 00:28:44.182 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:45 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:28:45.261 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:28:46 compute-1 nova_compute[182713]: 2026-01-22 00:28:46.632 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:47 compute-1 podman[238261]: 2026-01-22 00:28:47.593926832 +0000 UTC m=+0.070226031 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:28:49 compute-1 nova_compute[182713]: 2026-01-22 00:28:49.221 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:49 compute-1 nova_compute[182713]: 2026-01-22 00:28:49.233 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:50 compute-1 podman[238281]: 2026-01-22 00:28:50.594033196 +0000 UTC m=+0.082835883 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:28:51 compute-1 nova_compute[182713]: 2026-01-22 00:28:51.668 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:54 compute-1 nova_compute[182713]: 2026-01-22 00:28:54.224 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:54 compute-1 nova_compute[182713]: 2026-01-22 00:28:54.881 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:54 compute-1 nova_compute[182713]: 2026-01-22 00:28:54.881 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:55 compute-1 nova_compute[182713]: 2026-01-22 00:28:55.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:55 compute-1 nova_compute[182713]: 2026-01-22 00:28:55.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:28:56 compute-1 nova_compute[182713]: 2026-01-22 00:28:56.669 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:56 compute-1 nova_compute[182713]: 2026-01-22 00:28:56.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:59 compute-1 nova_compute[182713]: 2026-01-22 00:28:59.226 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:59 compute-1 podman[238305]: 2026-01-22 00:28:59.614478076 +0000 UTC m=+0.085640952 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:28:59 compute-1 podman[238304]: 2026-01-22 00:28:59.691746373 +0000 UTC m=+0.175003948 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:29:00 compute-1 ovn_controller[94841]: 2026-01-22T00:29:00Z|00664|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 22 00:29:01 compute-1 nova_compute[182713]: 2026-01-22 00:29:01.672 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:01 compute-1 nova_compute[182713]: 2026-01-22 00:29:01.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:02 compute-1 nova_compute[182713]: 2026-01-22 00:29:02.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:02 compute-1 nova_compute[182713]: 2026-01-22 00:29:02.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:02 compute-1 nova_compute[182713]: 2026-01-22 00:29:02.941 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:02 compute-1 nova_compute[182713]: 2026-01-22 00:29:02.942 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:02 compute-1 nova_compute[182713]: 2026-01-22 00:29:02.943 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:02 compute-1 nova_compute[182713]: 2026-01-22 00:29:02.943 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:29:03.039 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:29:03.039 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:29:03.039 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:03 compute-1 nova_compute[182713]: 2026-01-22 00:29:03.164 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:29:03 compute-1 nova_compute[182713]: 2026-01-22 00:29:03.166 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5697MB free_disk=73.19317245483398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:29:03 compute-1 nova_compute[182713]: 2026-01-22 00:29:03.166 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:03 compute-1 nova_compute[182713]: 2026-01-22 00:29:03.167 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:03 compute-1 nova_compute[182713]: 2026-01-22 00:29:03.644 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:29:03 compute-1 nova_compute[182713]: 2026-01-22 00:29:03.644 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:29:03 compute-1 nova_compute[182713]: 2026-01-22 00:29:03.715 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:29:03 compute-1 nova_compute[182713]: 2026-01-22 00:29:03.912 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:29:04 compute-1 nova_compute[182713]: 2026-01-22 00:29:04.218 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:29:04 compute-1 nova_compute[182713]: 2026-01-22 00:29:04.218 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:04 compute-1 nova_compute[182713]: 2026-01-22 00:29:04.273 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:04 compute-1 podman[238353]: 2026-01-22 00:29:04.579823833 +0000 UTC m=+0.059008104 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:29:04 compute-1 podman[238354]: 2026-01-22 00:29:04.587979039 +0000 UTC m=+0.063170624 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:29:05 compute-1 nova_compute[182713]: 2026-01-22 00:29:05.219 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:05 compute-1 nova_compute[182713]: 2026-01-22 00:29:05.220 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:29:05 compute-1 nova_compute[182713]: 2026-01-22 00:29:05.220 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:29:05 compute-1 nova_compute[182713]: 2026-01-22 00:29:05.586 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:29:05 compute-1 nova_compute[182713]: 2026-01-22 00:29:05.587 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:06 compute-1 nova_compute[182713]: 2026-01-22 00:29:06.673 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:09 compute-1 nova_compute[182713]: 2026-01-22 00:29:09.276 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:11 compute-1 nova_compute[182713]: 2026-01-22 00:29:11.707 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:14 compute-1 nova_compute[182713]: 2026-01-22 00:29:14.278 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:16 compute-1 nova_compute[182713]: 2026-01-22 00:29:16.709 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:18 compute-1 podman[238396]: 2026-01-22 00:29:18.598647753 +0000 UTC m=+0.084392232 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:29:19 compute-1 nova_compute[182713]: 2026-01-22 00:29:19.280 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:21 compute-1 podman[238416]: 2026-01-22 00:29:21.586495705 +0000 UTC m=+0.081618254 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Jan 22 00:29:21 compute-1 nova_compute[182713]: 2026-01-22 00:29:21.711 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:24 compute-1 nova_compute[182713]: 2026-01-22 00:29:24.282 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:26 compute-1 nova_compute[182713]: 2026-01-22 00:29:26.713 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:29 compute-1 nova_compute[182713]: 2026-01-22 00:29:29.284 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:30 compute-1 podman[238440]: 2026-01-22 00:29:30.584923117 +0000 UTC m=+0.073043015 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:29:30 compute-1 podman[238439]: 2026-01-22 00:29:30.609608902 +0000 UTC m=+0.098975359 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:29:31 compute-1 nova_compute[182713]: 2026-01-22 00:29:31.716 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:34 compute-1 nova_compute[182713]: 2026-01-22 00:29:34.327 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:35 compute-1 podman[238491]: 2026-01-22 00:29:35.583157568 +0000 UTC m=+0.078176156 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:29:35 compute-1 podman[238492]: 2026-01-22 00:29:35.592048998 +0000 UTC m=+0.077365822 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:29:36 compute-1 nova_compute[182713]: 2026-01-22 00:29:36.770 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:39 compute-1 nova_compute[182713]: 2026-01-22 00:29:39.328 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:29:40.681 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:29:40 compute-1 nova_compute[182713]: 2026-01-22 00:29:40.682 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:29:40.682 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:29:41 compute-1 nova_compute[182713]: 2026-01-22 00:29:41.771 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:29:42.684 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:44 compute-1 nova_compute[182713]: 2026-01-22 00:29:44.329 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:46 compute-1 nova_compute[182713]: 2026-01-22 00:29:46.773 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:49 compute-1 nova_compute[182713]: 2026-01-22 00:29:49.330 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:49 compute-1 podman[238533]: 2026-01-22 00:29:49.578793131 +0000 UTC m=+0.071569180 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 22 00:29:51 compute-1 nova_compute[182713]: 2026-01-22 00:29:51.774 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:52 compute-1 podman[238553]: 2026-01-22 00:29:52.608154377 +0000 UTC m=+0.095986017 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, config_id=openstack_network_exporter, distribution-scope=public, version=9.6, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Jan 22 00:29:54 compute-1 nova_compute[182713]: 2026-01-22 00:29:54.332 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:55 compute-1 nova_compute[182713]: 2026-01-22 00:29:55.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:55 compute-1 nova_compute[182713]: 2026-01-22 00:29:55.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:55 compute-1 nova_compute[182713]: 2026-01-22 00:29:55.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:55 compute-1 nova_compute[182713]: 2026-01-22 00:29:55.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:29:56 compute-1 nova_compute[182713]: 2026-01-22 00:29:56.776 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:57 compute-1 nova_compute[182713]: 2026-01-22 00:29:57.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:58 compute-1 nova_compute[182713]: 2026-01-22 00:29:58.884 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:59 compute-1 nova_compute[182713]: 2026-01-22 00:29:59.333 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:01 compute-1 podman[238575]: 2026-01-22 00:30:01.576042938 +0000 UTC m=+0.064853538 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:30:01 compute-1 podman[238574]: 2026-01-22 00:30:01.602320324 +0000 UTC m=+0.100293861 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:30:01 compute-1 nova_compute[182713]: 2026-01-22 00:30:01.778 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:01 compute-1 nova_compute[182713]: 2026-01-22 00:30:01.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:02 compute-1 nova_compute[182713]: 2026-01-22 00:30:02.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:02 compute-1 nova_compute[182713]: 2026-01-22 00:30:02.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.008 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.009 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.010 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.010 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:30:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:30:03.039 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:30:03.040 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:30:03.040 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.198 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.199 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5697MB free_disk=73.19319915771484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.199 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.200 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.614 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.615 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.634 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.653 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.653 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.672 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.693 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.718 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.736 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.738 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:30:03 compute-1 nova_compute[182713]: 2026-01-22 00:30:03.739 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:04 compute-1 nova_compute[182713]: 2026-01-22 00:30:04.379 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:05 compute-1 nova_compute[182713]: 2026-01-22 00:30:05.740 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:05 compute-1 nova_compute[182713]: 2026-01-22 00:30:05.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:05 compute-1 nova_compute[182713]: 2026-01-22 00:30:05.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:30:05 compute-1 nova_compute[182713]: 2026-01-22 00:30:05.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:30:06 compute-1 nova_compute[182713]: 2026-01-22 00:30:06.047 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:30:06 compute-1 podman[238624]: 2026-01-22 00:30:06.573263188 +0000 UTC m=+0.065588941 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:30:06 compute-1 podman[238623]: 2026-01-22 00:30:06.574368883 +0000 UTC m=+0.064113025 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:30:06 compute-1 nova_compute[182713]: 2026-01-22 00:30:06.812 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:09 compute-1 nova_compute[182713]: 2026-01-22 00:30:09.381 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:11 compute-1 nova_compute[182713]: 2026-01-22 00:30:11.814 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:14 compute-1 nova_compute[182713]: 2026-01-22 00:30:14.383 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:16 compute-1 nova_compute[182713]: 2026-01-22 00:30:16.818 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:19 compute-1 nova_compute[182713]: 2026-01-22 00:30:19.410 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:20 compute-1 podman[238665]: 2026-01-22 00:30:20.590414705 +0000 UTC m=+0.077425183 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 00:30:21 compute-1 nova_compute[182713]: 2026-01-22 00:30:21.820 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.884 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:30:22.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:30:22 compute-1 podman[238685]: 2026-01-22 00:30:22.964994634 +0000 UTC m=+0.059738547 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 00:30:24 compute-1 nova_compute[182713]: 2026-01-22 00:30:24.414 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:24 compute-1 sshd-session[238707]: Connection closed by 92.118.39.95 port 44546
Jan 22 00:30:26 compute-1 nova_compute[182713]: 2026-01-22 00:30:26.822 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:29 compute-1 nova_compute[182713]: 2026-01-22 00:30:29.509 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:31 compute-1 nova_compute[182713]: 2026-01-22 00:30:31.823 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:32 compute-1 podman[238709]: 2026-01-22 00:30:32.59869606 +0000 UTC m=+0.085380613 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:30:32 compute-1 podman[238708]: 2026-01-22 00:30:32.643173637 +0000 UTC m=+0.131111710 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:30:34 compute-1 nova_compute[182713]: 2026-01-22 00:30:34.540 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:36 compute-1 nova_compute[182713]: 2026-01-22 00:30:36.865 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:37 compute-1 podman[238760]: 2026-01-22 00:30:37.568679283 +0000 UTC m=+0.055480754 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 00:30:37 compute-1 podman[238761]: 2026-01-22 00:30:37.577866592 +0000 UTC m=+0.058176979 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:30:39 compute-1 nova_compute[182713]: 2026-01-22 00:30:39.541 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:41 compute-1 nova_compute[182713]: 2026-01-22 00:30:41.866 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:30:44.115 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:30:44 compute-1 nova_compute[182713]: 2026-01-22 00:30:44.117 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:44 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:30:44.117 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:30:44 compute-1 nova_compute[182713]: 2026-01-22 00:30:44.543 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:45 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:30:45.119 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:46 compute-1 nova_compute[182713]: 2026-01-22 00:30:46.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:49 compute-1 nova_compute[182713]: 2026-01-22 00:30:49.589 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:51 compute-1 podman[238803]: 2026-01-22 00:30:51.620337264 +0000 UTC m=+0.112067051 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:30:51 compute-1 nova_compute[182713]: 2026-01-22 00:30:51.869 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:53 compute-1 podman[238823]: 2026-01-22 00:30:53.602812575 +0000 UTC m=+0.098602567 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc.)
Jan 22 00:30:54 compute-1 nova_compute[182713]: 2026-01-22 00:30:54.591 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:56 compute-1 nova_compute[182713]: 2026-01-22 00:30:56.871 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:57 compute-1 nova_compute[182713]: 2026-01-22 00:30:57.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:57 compute-1 nova_compute[182713]: 2026-01-22 00:30:57.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:57 compute-1 nova_compute[182713]: 2026-01-22 00:30:57.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:57 compute-1 nova_compute[182713]: 2026-01-22 00:30:57.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:30:59 compute-1 nova_compute[182713]: 2026-01-22 00:30:59.592 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:59 compute-1 nova_compute[182713]: 2026-01-22 00:30:59.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:01 compute-1 nova_compute[182713]: 2026-01-22 00:31:01.872 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:02 compute-1 nova_compute[182713]: 2026-01-22 00:31:02.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:02 compute-1 nova_compute[182713]: 2026-01-22 00:31:02.899 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:02 compute-1 nova_compute[182713]: 2026-01-22 00:31:02.900 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:02 compute-1 nova_compute[182713]: 2026-01-22 00:31:02.900 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:02 compute-1 nova_compute[182713]: 2026-01-22 00:31:02.901 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:31:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:03.040 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:03.040 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:03.041 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:03 compute-1 nova_compute[182713]: 2026-01-22 00:31:03.094 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:31:03 compute-1 nova_compute[182713]: 2026-01-22 00:31:03.096 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5712MB free_disk=73.19329833984375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:31:03 compute-1 nova_compute[182713]: 2026-01-22 00:31:03.096 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:03 compute-1 nova_compute[182713]: 2026-01-22 00:31:03.096 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:03 compute-1 nova_compute[182713]: 2026-01-22 00:31:03.166 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:31:03 compute-1 nova_compute[182713]: 2026-01-22 00:31:03.166 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:31:03 compute-1 nova_compute[182713]: 2026-01-22 00:31:03.194 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:31:03 compute-1 nova_compute[182713]: 2026-01-22 00:31:03.212 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:31:03 compute-1 nova_compute[182713]: 2026-01-22 00:31:03.213 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:31:03 compute-1 nova_compute[182713]: 2026-01-22 00:31:03.213 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:03 compute-1 podman[238847]: 2026-01-22 00:31:03.58533168 +0000 UTC m=+0.062607728 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:31:03 compute-1 podman[238846]: 2026-01-22 00:31:03.61399523 +0000 UTC m=+0.100269041 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:31:04 compute-1 nova_compute[182713]: 2026-01-22 00:31:04.214 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:04 compute-1 nova_compute[182713]: 2026-01-22 00:31:04.595 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:04 compute-1 nova_compute[182713]: 2026-01-22 00:31:04.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:06 compute-1 nova_compute[182713]: 2026-01-22 00:31:06.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:06 compute-1 nova_compute[182713]: 2026-01-22 00:31:06.874 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:07 compute-1 nova_compute[182713]: 2026-01-22 00:31:07.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:07 compute-1 nova_compute[182713]: 2026-01-22 00:31:07.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:31:07 compute-1 nova_compute[182713]: 2026-01-22 00:31:07.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:31:08 compute-1 nova_compute[182713]: 2026-01-22 00:31:08.244 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:31:08 compute-1 podman[238897]: 2026-01-22 00:31:08.576048035 +0000 UTC m=+0.067739279 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:31:08 compute-1 podman[238898]: 2026-01-22 00:31:08.616473965 +0000 UTC m=+0.096019077 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:31:09 compute-1 nova_compute[182713]: 2026-01-22 00:31:09.598 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:11.727 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:47:fc 10.100.0.2 2001:db8::f816:3eff:fe99:47fc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe99:47fc/64', 'neutron:device_id': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb52301b-689d-4e28-a6fb-c23352694dd4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b38c45f8-f983-4d04-9b7c-db4cbbad86b5) old=Port_Binding(mac=['fa:16:3e:99:47:fc 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:31:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:11.730 104184 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b38c45f8-f983-4d04-9b7c-db4cbbad86b5 in datapath 895033ac-5f91-4350-ad1a-b5c5d0ff13a2 updated
Jan 22 00:31:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:11.731 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 895033ac-5f91-4350-ad1a-b5c5d0ff13a2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:31:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:11.734 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b510c91d-841f-4413-aff1-c9d39f57583f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:11 compute-1 nova_compute[182713]: 2026-01-22 00:31:11.921 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:14 compute-1 nova_compute[182713]: 2026-01-22 00:31:14.601 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:16 compute-1 nova_compute[182713]: 2026-01-22 00:31:16.923 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:19 compute-1 nova_compute[182713]: 2026-01-22 00:31:19.602 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.581 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.581 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.598 182717 DEBUG nova.compute.manager [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.727 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.728 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.738 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.738 182717 INFO nova.compute.claims [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.869 182717 DEBUG nova.compute.provider_tree [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.886 182717 DEBUG nova.scheduler.client.report [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.914 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.915 182717 DEBUG nova.compute.manager [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.972 182717 DEBUG nova.compute.manager [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.973 182717 DEBUG nova.network.neutron [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:31:20 compute-1 nova_compute[182713]: 2026-01-22 00:31:20.990 182717 INFO nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.010 182717 DEBUG nova.compute.manager [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.129 182717 DEBUG nova.compute.manager [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.132 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.133 182717 INFO nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Creating image(s)
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.134 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.134 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.136 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.163 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.245 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.247 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.248 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.273 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.333 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.335 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.389 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.391 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.392 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.466 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.468 182717 DEBUG nova.virt.disk.api [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.468 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.536 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.538 182717 DEBUG nova.virt.disk.api [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.538 182717 DEBUG nova.objects.instance [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.555 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.556 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Ensure instance console log exists: /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.557 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.558 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.558 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.735 182717 DEBUG nova.policy [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:31:21 compute-1 nova_compute[182713]: 2026-01-22 00:31:21.926 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:22 compute-1 podman[238957]: 2026-01-22 00:31:22.59279776 +0000 UTC m=+0.078408184 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:31:23 compute-1 nova_compute[182713]: 2026-01-22 00:31:23.616 182717 DEBUG nova.network.neutron [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Successfully created port: 84e1946c-1832-49b5-9a84-edfb4022e8bc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:31:24 compute-1 nova_compute[182713]: 2026-01-22 00:31:24.604 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:24 compute-1 podman[238978]: 2026-01-22 00:31:24.614388901 +0000 UTC m=+0.101789428 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, version=9.6, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 00:31:25 compute-1 nova_compute[182713]: 2026-01-22 00:31:25.803 182717 DEBUG nova.network.neutron [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Successfully updated port: 84e1946c-1832-49b5-9a84-edfb4022e8bc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.015 182717 DEBUG nova.compute.manager [req-c3932572-2752-4311-bfc7-24f8b7223198 req-c49a04a3-907a-414a-a7b5-f27bf1a357ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Received event network-changed-84e1946c-1832-49b5-9a84-edfb4022e8bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.016 182717 DEBUG nova.compute.manager [req-c3932572-2752-4311-bfc7-24f8b7223198 req-c49a04a3-907a-414a-a7b5-f27bf1a357ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Refreshing instance network info cache due to event network-changed-84e1946c-1832-49b5-9a84-edfb4022e8bc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.016 182717 DEBUG oslo_concurrency.lockutils [req-c3932572-2752-4311-bfc7-24f8b7223198 req-c49a04a3-907a-414a-a7b5-f27bf1a357ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.016 182717 DEBUG oslo_concurrency.lockutils [req-c3932572-2752-4311-bfc7-24f8b7223198 req-c49a04a3-907a-414a-a7b5-f27bf1a357ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.016 182717 DEBUG nova.network.neutron [req-c3932572-2752-4311-bfc7-24f8b7223198 req-c49a04a3-907a-414a-a7b5-f27bf1a357ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Refreshing network info cache for port 84e1946c-1832-49b5-9a84-edfb4022e8bc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.283 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.691 182717 DEBUG nova.network.neutron [req-c3932572-2752-4311-bfc7-24f8b7223198 req-c49a04a3-907a-414a-a7b5-f27bf1a357ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.927 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.963 182717 DEBUG nova.network.neutron [req-c3932572-2752-4311-bfc7-24f8b7223198 req-c49a04a3-907a-414a-a7b5-f27bf1a357ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.979 182717 DEBUG oslo_concurrency.lockutils [req-c3932572-2752-4311-bfc7-24f8b7223198 req-c49a04a3-907a-414a-a7b5-f27bf1a357ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.980 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:31:26 compute-1 nova_compute[182713]: 2026-01-22 00:31:26.980 182717 DEBUG nova.network.neutron [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:31:27 compute-1 nova_compute[182713]: 2026-01-22 00:31:27.173 182717 DEBUG nova.network.neutron [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.364 182717 DEBUG nova.network.neutron [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Updating instance_info_cache with network_info: [{"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.402 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.402 182717 DEBUG nova.compute.manager [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Instance network_info: |[{"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.405 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Start _get_guest_xml network_info=[{"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.414 182717 WARNING nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.422 182717 DEBUG nova.virt.libvirt.host [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.423 182717 DEBUG nova.virt.libvirt.host [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.430 182717 DEBUG nova.virt.libvirt.host [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.432 182717 DEBUG nova.virt.libvirt.host [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.435 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.435 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.437 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.437 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.438 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.438 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.439 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.440 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.440 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.441 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.441 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.442 182717 DEBUG nova.virt.hardware [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.450 182717 DEBUG nova.virt.libvirt.vif [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:31:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1345718251',display_name='tempest-TestGettingAddress-server-1345718251',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1345718251',id=169,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEN8d2ViSKyfMT2XqudbWbmZhugtSo0AUa3hssPfIHXZXJuMLED9XwzZlkaV7imX6BxsiK4pWMoh9iMrlu0xzxZRk5QI4OmaesLZRq01J/YYbtUdz/2t7KMOohgfE7jvUQ==',key_name='tempest-TestGettingAddress-1273457996',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-ogq5xbwe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:31:21Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=83be5683-cac9-4e5b-b8a1-9c37a5bc2b42,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.451 182717 DEBUG nova.network.os_vif_util [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.453 182717 DEBUG nova.network.os_vif_util [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:14:50,bridge_name='br-int',has_traffic_filtering=True,id=84e1946c-1832-49b5-9a84-edfb4022e8bc,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e1946c-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.455 182717 DEBUG nova.objects.instance [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.481 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:31:28 compute-1 nova_compute[182713]:   <uuid>83be5683-cac9-4e5b-b8a1-9c37a5bc2b42</uuid>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   <name>instance-000000a9</name>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <nova:name>tempest-TestGettingAddress-server-1345718251</nova:name>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:31:28</nova:creationTime>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:31:28 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:31:28 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:31:28 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:31:28 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:31:28 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:31:28 compute-1 nova_compute[182713]:         <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 22 00:31:28 compute-1 nova_compute[182713]:         <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:31:28 compute-1 nova_compute[182713]:         <nova:port uuid="84e1946c-1832-49b5-9a84-edfb4022e8bc">
Jan 22 00:31:28 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe5e:1450" ipVersion="6"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <system>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <entry name="serial">83be5683-cac9-4e5b-b8a1-9c37a5bc2b42</entry>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <entry name="uuid">83be5683-cac9-4e5b-b8a1-9c37a5bc2b42</entry>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     </system>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   <os>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   </os>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   <features>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   </features>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.config"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:5e:14:50"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <target dev="tap84e1946c-18"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/console.log" append="off"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <video>
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     </video>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:31:28 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:31:28 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:31:28 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:31:28 compute-1 nova_compute[182713]: </domain>
Jan 22 00:31:28 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.483 182717 DEBUG nova.compute.manager [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Preparing to wait for external event network-vif-plugged-84e1946c-1832-49b5-9a84-edfb4022e8bc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.484 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.484 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.484 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.485 182717 DEBUG nova.virt.libvirt.vif [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:31:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1345718251',display_name='tempest-TestGettingAddress-server-1345718251',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1345718251',id=169,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEN8d2ViSKyfMT2XqudbWbmZhugtSo0AUa3hssPfIHXZXJuMLED9XwzZlkaV7imX6BxsiK4pWMoh9iMrlu0xzxZRk5QI4OmaesLZRq01J/YYbtUdz/2t7KMOohgfE7jvUQ==',key_name='tempest-TestGettingAddress-1273457996',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-ogq5xbwe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:31:21Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=83be5683-cac9-4e5b-b8a1-9c37a5bc2b42,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.485 182717 DEBUG nova.network.os_vif_util [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.486 182717 DEBUG nova.network.os_vif_util [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:14:50,bridge_name='br-int',has_traffic_filtering=True,id=84e1946c-1832-49b5-9a84-edfb4022e8bc,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e1946c-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.487 182717 DEBUG os_vif [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:14:50,bridge_name='br-int',has_traffic_filtering=True,id=84e1946c-1832-49b5-9a84-edfb4022e8bc,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e1946c-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.488 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.488 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.489 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.505 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.506 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84e1946c-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.507 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap84e1946c-18, col_values=(('external_ids', {'iface-id': '84e1946c-1832-49b5-9a84-edfb4022e8bc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:14:50', 'vm-uuid': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:31:28 compute-1 NetworkManager[54952]: <info>  [1769041888.5117] manager: (tap84e1946c-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.511 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.515 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.521 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.522 182717 INFO os_vif [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:14:50,bridge_name='br-int',has_traffic_filtering=True,id=84e1946c-1832-49b5-9a84-edfb4022e8bc,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e1946c-18')
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.577 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.578 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.578 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:5e:14:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.578 182717 INFO nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Using config drive
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.918 182717 INFO nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Creating config drive at /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.config
Jan 22 00:31:28 compute-1 nova_compute[182713]: 2026-01-22 00:31:28.925 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdn2cqodr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:31:29 compute-1 nova_compute[182713]: 2026-01-22 00:31:29.483 182717 DEBUG oslo_concurrency.processutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdn2cqodr" returned: 0 in 0.558s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:31:29 compute-1 kernel: tap84e1946c-18: entered promiscuous mode
Jan 22 00:31:29 compute-1 ovn_controller[94841]: 2026-01-22T00:31:29Z|00665|binding|INFO|Claiming lport 84e1946c-1832-49b5-9a84-edfb4022e8bc for this chassis.
Jan 22 00:31:29 compute-1 ovn_controller[94841]: 2026-01-22T00:31:29Z|00666|binding|INFO|84e1946c-1832-49b5-9a84-edfb4022e8bc: Claiming fa:16:3e:5e:14:50 10.100.0.10 2001:db8::f816:3eff:fe5e:1450
Jan 22 00:31:29 compute-1 nova_compute[182713]: 2026-01-22 00:31:29.566 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:29 compute-1 NetworkManager[54952]: <info>  [1769041889.5726] manager: (tap84e1946c-18): new Tun device (/org/freedesktop/NetworkManager/Devices/310)
Jan 22 00:31:29 compute-1 nova_compute[182713]: 2026-01-22 00:31:29.572 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:29 compute-1 nova_compute[182713]: 2026-01-22 00:31:29.576 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:29 compute-1 systemd-machined[153970]: New machine qemu-73-instance-000000a9.
Jan 22 00:31:29 compute-1 systemd[1]: Started Virtual Machine qemu-73-instance-000000a9.
Jan 22 00:31:29 compute-1 nova_compute[182713]: 2026-01-22 00:31:29.635 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:29 compute-1 ovn_controller[94841]: 2026-01-22T00:31:29Z|00667|binding|INFO|Setting lport 84e1946c-1832-49b5-9a84-edfb4022e8bc ovn-installed in OVS
Jan 22 00:31:29 compute-1 nova_compute[182713]: 2026-01-22 00:31:29.640 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:29 compute-1 systemd-udevd[239020]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:31:29 compute-1 ovn_controller[94841]: 2026-01-22T00:31:29Z|00668|binding|INFO|Setting lport 84e1946c-1832-49b5-9a84-edfb4022e8bc up in Southbound
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.650 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:14:50 10.100.0.10 2001:db8::f816:3eff:fe5e:1450'], port_security=['fa:16:3e:5e:14:50 10.100.0.10 2001:db8::f816:3eff:fe5e:1450'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28 2001:db8::f816:3eff:fe5e:1450/64', 'neutron:device_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9d931792-0187-42bd-ad30-da2120e7bd41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb52301b-689d-4e28-a6fb-c23352694dd4, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=84e1946c-1832-49b5-9a84-edfb4022e8bc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.652 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 84e1946c-1832-49b5-9a84-edfb4022e8bc in datapath 895033ac-5f91-4350-ad1a-b5c5d0ff13a2 bound to our chassis
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.655 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 895033ac-5f91-4350-ad1a-b5c5d0ff13a2
Jan 22 00:31:29 compute-1 NetworkManager[54952]: <info>  [1769041889.6594] device (tap84e1946c-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:31:29 compute-1 NetworkManager[54952]: <info>  [1769041889.6604] device (tap84e1946c-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.666 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0186d9af-024f-45b3-81cc-8f8f5b7d773a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.667 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap895033ac-51 in ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.671 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap895033ac-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.671 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c014c72c-da86-4dae-94ec-9e2119d7194c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.672 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a169bb3f-c98e-4f56-b644-4fd633febc5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.697 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9ed1df-ee9c-4c61-abbd-e60caa427f8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.713 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[92b9de22-461c-4ff3-9e0b-ccf87d052482]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.739 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[04f80e9e-c1cb-4ff1-847d-211a0a70f778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 NetworkManager[54952]: <info>  [1769041889.7478] manager: (tap895033ac-50): new Veth device (/org/freedesktop/NetworkManager/Devices/311)
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.748 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e56174df-716e-4116-8d0c-1b5897a44907]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.783 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[eb92ea5f-3e24-4d8c-8b73-6af286f5b0b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.787 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0228e3bf-def7-433c-a780-cdf5c3ec26f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 NetworkManager[54952]: <info>  [1769041889.8092] device (tap895033ac-50): carrier: link connected
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.818 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[8531d73f-2bdc-4683-b42c-735e66f66310]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.836 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[94c6c962-dc4f-4482-a3d9-ffd8508dd84c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap895033ac-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:47:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646246, 'reachable_time': 29592, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239053, 'error': None, 'target': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.852 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[342f5bda-4350-4e84-8fad-bbfbb780b3a8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe99:47fc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 646246, 'tstamp': 646246}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239054, 'error': None, 'target': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.874 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e0434650-8d03-401c-ba93-88fd65491be6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap895033ac-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:47:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646246, 'reachable_time': 29592, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239055, 'error': None, 'target': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.913 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[728d7b93-ea4f-4535-9bbc-9ac1a9698054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.991 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7369addf-dbfa-4a23-9364-af9258158f64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.993 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap895033ac-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.993 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.994 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap895033ac-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:31:29 compute-1 NetworkManager[54952]: <info>  [1769041889.9965] manager: (tap895033ac-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Jan 22 00:31:29 compute-1 kernel: tap895033ac-50: entered promiscuous mode
Jan 22 00:31:29 compute-1 nova_compute[182713]: 2026-01-22 00:31:29.995 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:29.999 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap895033ac-50, col_values=(('external_ids', {'iface-id': 'b38c45f8-f983-4d04-9b7c-db4cbbad86b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:31:30 compute-1 ovn_controller[94841]: 2026-01-22T00:31:30Z|00669|binding|INFO|Releasing lport b38c45f8-f983-4d04-9b7c-db4cbbad86b5 from this chassis (sb_readonly=0)
Jan 22 00:31:30 compute-1 nova_compute[182713]: 2026-01-22 00:31:30.000 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:30 compute-1 nova_compute[182713]: 2026-01-22 00:31:30.001 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:30.002 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/895033ac-5f91-4350-ad1a-b5c5d0ff13a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/895033ac-5f91-4350-ad1a-b5c5d0ff13a2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:30.003 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fc75a10b-258f-4f1e-bd3d-d9e67b0a9757]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:30.004 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-895033ac-5f91-4350-ad1a-b5c5d0ff13a2
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/895033ac-5f91-4350-ad1a-b5c5d0ff13a2.pid.haproxy
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 895033ac-5f91-4350-ad1a-b5c5d0ff13a2
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:31:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:31:30.005 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'env', 'PROCESS_TAG=haproxy-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/895033ac-5f91-4350-ad1a-b5c5d0ff13a2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:31:30 compute-1 nova_compute[182713]: 2026-01-22 00:31:30.013 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041890.0129201, 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:31:30 compute-1 nova_compute[182713]: 2026-01-22 00:31:30.014 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] VM Started (Lifecycle Event)
Jan 22 00:31:30 compute-1 nova_compute[182713]: 2026-01-22 00:31:30.025 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:30 compute-1 nova_compute[182713]: 2026-01-22 00:31:30.364 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:31:30 compute-1 nova_compute[182713]: 2026-01-22 00:31:30.368 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041890.0130477, 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:31:30 compute-1 nova_compute[182713]: 2026-01-22 00:31:30.369 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] VM Paused (Lifecycle Event)
Jan 22 00:31:30 compute-1 podman[239094]: 2026-01-22 00:31:30.392040404 +0000 UTC m=+0.060105348 container create b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:31:30 compute-1 nova_compute[182713]: 2026-01-22 00:31:30.415 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:31:30 compute-1 nova_compute[182713]: 2026-01-22 00:31:30.420 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:31:30 compute-1 systemd[1]: Started libpod-conmon-b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d.scope.
Jan 22 00:31:30 compute-1 nova_compute[182713]: 2026-01-22 00:31:30.446 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:31:30 compute-1 podman[239094]: 2026-01-22 00:31:30.356104145 +0000 UTC m=+0.024169180 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:31:30 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:31:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b2f5eb6ebdbf50d7e4f516f522476fe4e69e764b5ac59678474689afe2862d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:31:30 compute-1 podman[239094]: 2026-01-22 00:31:30.486933726 +0000 UTC m=+0.154998690 container init b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:31:30 compute-1 podman[239094]: 2026-01-22 00:31:30.491925632 +0000 UTC m=+0.159990576 container start b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:31:30 compute-1 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239109]: [NOTICE]   (239113) : New worker (239115) forked
Jan 22 00:31:30 compute-1 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239109]: [NOTICE]   (239113) : Loading success.
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.835 182717 DEBUG nova.compute.manager [req-92348f5e-2279-4eb0-b4d6-c9b1abc2cac3 req-681a913e-970d-4b11-967f-93d34e5cfc58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Received event network-vif-plugged-84e1946c-1832-49b5-9a84-edfb4022e8bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.835 182717 DEBUG oslo_concurrency.lockutils [req-92348f5e-2279-4eb0-b4d6-c9b1abc2cac3 req-681a913e-970d-4b11-967f-93d34e5cfc58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.836 182717 DEBUG oslo_concurrency.lockutils [req-92348f5e-2279-4eb0-b4d6-c9b1abc2cac3 req-681a913e-970d-4b11-967f-93d34e5cfc58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.836 182717 DEBUG oslo_concurrency.lockutils [req-92348f5e-2279-4eb0-b4d6-c9b1abc2cac3 req-681a913e-970d-4b11-967f-93d34e5cfc58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.836 182717 DEBUG nova.compute.manager [req-92348f5e-2279-4eb0-b4d6-c9b1abc2cac3 req-681a913e-970d-4b11-967f-93d34e5cfc58 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Processing event network-vif-plugged-84e1946c-1832-49b5-9a84-edfb4022e8bc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.837 182717 DEBUG nova.compute.manager [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.841 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769041892.8414154, 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.842 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] VM Resumed (Lifecycle Event)
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.844 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.848 182717 INFO nova.virt.libvirt.driver [-] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Instance spawned successfully.
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.849 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.873 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.878 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.884 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.884 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.885 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.885 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.885 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.886 182717 DEBUG nova.virt.libvirt.driver [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.924 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.973 182717 INFO nova.compute.manager [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Took 11.84 seconds to spawn the instance on the hypervisor.
Jan 22 00:31:32 compute-1 nova_compute[182713]: 2026-01-22 00:31:32.974 182717 DEBUG nova.compute.manager [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:31:33 compute-1 nova_compute[182713]: 2026-01-22 00:31:33.068 182717 INFO nova.compute.manager [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Took 12.40 seconds to build instance.
Jan 22 00:31:33 compute-1 nova_compute[182713]: 2026-01-22 00:31:33.090 182717 DEBUG oslo_concurrency.lockutils [None req-6c5e4ea0-3064-495b-b0bf-552bd7da6812 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:33 compute-1 nova_compute[182713]: 2026-01-22 00:31:33.511 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:34 compute-1 podman[239125]: 2026-01-22 00:31:34.578843628 +0000 UTC m=+0.063180547 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:31:34 compute-1 podman[239124]: 2026-01-22 00:31:34.609554352 +0000 UTC m=+0.100373113 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:31:34 compute-1 nova_compute[182713]: 2026-01-22 00:31:34.637 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:34 compute-1 nova_compute[182713]: 2026-01-22 00:31:34.911 182717 DEBUG nova.compute.manager [req-de2788ee-6622-45be-9de2-2445fae59480 req-8c5ecef5-49d8-43b7-b2ca-e4d45edc9157 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Received event network-vif-plugged-84e1946c-1832-49b5-9a84-edfb4022e8bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:31:34 compute-1 nova_compute[182713]: 2026-01-22 00:31:34.912 182717 DEBUG oslo_concurrency.lockutils [req-de2788ee-6622-45be-9de2-2445fae59480 req-8c5ecef5-49d8-43b7-b2ca-e4d45edc9157 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:34 compute-1 nova_compute[182713]: 2026-01-22 00:31:34.912 182717 DEBUG oslo_concurrency.lockutils [req-de2788ee-6622-45be-9de2-2445fae59480 req-8c5ecef5-49d8-43b7-b2ca-e4d45edc9157 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:34 compute-1 nova_compute[182713]: 2026-01-22 00:31:34.913 182717 DEBUG oslo_concurrency.lockutils [req-de2788ee-6622-45be-9de2-2445fae59480 req-8c5ecef5-49d8-43b7-b2ca-e4d45edc9157 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:34 compute-1 nova_compute[182713]: 2026-01-22 00:31:34.913 182717 DEBUG nova.compute.manager [req-de2788ee-6622-45be-9de2-2445fae59480 req-8c5ecef5-49d8-43b7-b2ca-e4d45edc9157 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] No waiting events found dispatching network-vif-plugged-84e1946c-1832-49b5-9a84-edfb4022e8bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:31:34 compute-1 nova_compute[182713]: 2026-01-22 00:31:34.913 182717 WARNING nova.compute.manager [req-de2788ee-6622-45be-9de2-2445fae59480 req-8c5ecef5-49d8-43b7-b2ca-e4d45edc9157 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Received unexpected event network-vif-plugged-84e1946c-1832-49b5-9a84-edfb4022e8bc for instance with vm_state active and task_state None.
Jan 22 00:31:38 compute-1 ovn_controller[94841]: 2026-01-22T00:31:38Z|00670|binding|INFO|Releasing lport b38c45f8-f983-4d04-9b7c-db4cbbad86b5 from this chassis (sb_readonly=0)
Jan 22 00:31:38 compute-1 NetworkManager[54952]: <info>  [1769041898.1625] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Jan 22 00:31:38 compute-1 NetworkManager[54952]: <info>  [1769041898.1635] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/314)
Jan 22 00:31:38 compute-1 nova_compute[182713]: 2026-01-22 00:31:38.168 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:38 compute-1 ovn_controller[94841]: 2026-01-22T00:31:38Z|00671|binding|INFO|Releasing lport b38c45f8-f983-4d04-9b7c-db4cbbad86b5 from this chassis (sb_readonly=0)
Jan 22 00:31:38 compute-1 nova_compute[182713]: 2026-01-22 00:31:38.201 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:38 compute-1 nova_compute[182713]: 2026-01-22 00:31:38.445 182717 DEBUG nova.compute.manager [req-69f86919-84ff-4597-92c9-243992fcd182 req-475f1995-c440-4e95-a58f-6b628c39bdbe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Received event network-changed-84e1946c-1832-49b5-9a84-edfb4022e8bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:31:38 compute-1 nova_compute[182713]: 2026-01-22 00:31:38.446 182717 DEBUG nova.compute.manager [req-69f86919-84ff-4597-92c9-243992fcd182 req-475f1995-c440-4e95-a58f-6b628c39bdbe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Refreshing instance network info cache due to event network-changed-84e1946c-1832-49b5-9a84-edfb4022e8bc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:31:38 compute-1 nova_compute[182713]: 2026-01-22 00:31:38.446 182717 DEBUG oslo_concurrency.lockutils [req-69f86919-84ff-4597-92c9-243992fcd182 req-475f1995-c440-4e95-a58f-6b628c39bdbe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:31:38 compute-1 nova_compute[182713]: 2026-01-22 00:31:38.447 182717 DEBUG oslo_concurrency.lockutils [req-69f86919-84ff-4597-92c9-243992fcd182 req-475f1995-c440-4e95-a58f-6b628c39bdbe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:31:38 compute-1 nova_compute[182713]: 2026-01-22 00:31:38.447 182717 DEBUG nova.network.neutron [req-69f86919-84ff-4597-92c9-243992fcd182 req-475f1995-c440-4e95-a58f-6b628c39bdbe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Refreshing network info cache for port 84e1946c-1832-49b5-9a84-edfb4022e8bc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:31:38 compute-1 nova_compute[182713]: 2026-01-22 00:31:38.514 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:39 compute-1 podman[239173]: 2026-01-22 00:31:39.563085709 +0000 UTC m=+0.043897370 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:31:39 compute-1 podman[239172]: 2026-01-22 00:31:39.603114436 +0000 UTC m=+0.089361458 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:31:39 compute-1 nova_compute[182713]: 2026-01-22 00:31:39.640 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:40 compute-1 nova_compute[182713]: 2026-01-22 00:31:40.101 182717 DEBUG nova.network.neutron [req-69f86919-84ff-4597-92c9-243992fcd182 req-475f1995-c440-4e95-a58f-6b628c39bdbe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Updated VIF entry in instance network info cache for port 84e1946c-1832-49b5-9a84-edfb4022e8bc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:31:40 compute-1 nova_compute[182713]: 2026-01-22 00:31:40.101 182717 DEBUG nova.network.neutron [req-69f86919-84ff-4597-92c9-243992fcd182 req-475f1995-c440-4e95-a58f-6b628c39bdbe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Updating instance_info_cache with network_info: [{"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:31:40 compute-1 nova_compute[182713]: 2026-01-22 00:31:40.437 182717 DEBUG oslo_concurrency.lockutils [req-69f86919-84ff-4597-92c9-243992fcd182 req-475f1995-c440-4e95-a58f-6b628c39bdbe 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:31:43 compute-1 nova_compute[182713]: 2026-01-22 00:31:43.519 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:44 compute-1 nova_compute[182713]: 2026-01-22 00:31:44.643 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:45 compute-1 ovn_controller[94841]: 2026-01-22T00:31:45Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5e:14:50 10.100.0.10
Jan 22 00:31:45 compute-1 ovn_controller[94841]: 2026-01-22T00:31:45Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:14:50 10.100.0.10
Jan 22 00:31:48 compute-1 nova_compute[182713]: 2026-01-22 00:31:48.525 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:49 compute-1 nova_compute[182713]: 2026-01-22 00:31:49.645 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:53 compute-1 nova_compute[182713]: 2026-01-22 00:31:53.528 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:53 compute-1 podman[239228]: 2026-01-22 00:31:53.573035829 +0000 UTC m=+0.062733201 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 00:31:54 compute-1 nova_compute[182713]: 2026-01-22 00:31:54.686 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:55 compute-1 podman[239250]: 2026-01-22 00:31:55.595968982 +0000 UTC m=+0.081705707 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7)
Jan 22 00:31:57 compute-1 nova_compute[182713]: 2026-01-22 00:31:57.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:58 compute-1 nova_compute[182713]: 2026-01-22 00:31:58.531 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:58 compute-1 nova_compute[182713]: 2026-01-22 00:31:58.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:58 compute-1 nova_compute[182713]: 2026-01-22 00:31:58.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:58 compute-1 nova_compute[182713]: 2026-01-22 00:31:58.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:31:59 compute-1 nova_compute[182713]: 2026-01-22 00:31:59.688 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:00 compute-1 nova_compute[182713]: 2026-01-22 00:32:00.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:01 compute-1 nova_compute[182713]: 2026-01-22 00:32:01.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:03.042 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:32:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:03.043 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:32:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:03.044 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:32:03 compute-1 nova_compute[182713]: 2026-01-22 00:32:03.535 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:04 compute-1 nova_compute[182713]: 2026-01-22 00:32:04.690 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:04 compute-1 nova_compute[182713]: 2026-01-22 00:32:04.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:04 compute-1 nova_compute[182713]: 2026-01-22 00:32:04.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:04 compute-1 nova_compute[182713]: 2026-01-22 00:32:04.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.003 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.004 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.004 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.004 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.092 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:32:05 compute-1 podman[239274]: 2026-01-22 00:32:05.136983688 +0000 UTC m=+0.073435718 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.189 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.191 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:32:05 compute-1 podman[239273]: 2026-01-22 00:32:05.207017658 +0000 UTC m=+0.153210304 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.261 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.437 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.439 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5546MB free_disk=73.16445922851562GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.439 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.440 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.508 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.509 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.509 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.786 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.803 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.832 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:32:05 compute-1 nova_compute[182713]: 2026-01-22 00:32:05.833 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.393s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:32:07 compute-1 nova_compute[182713]: 2026-01-22 00:32:07.833 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:08 compute-1 nova_compute[182713]: 2026-01-22 00:32:08.576 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:08 compute-1 nova_compute[182713]: 2026-01-22 00:32:08.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:08 compute-1 nova_compute[182713]: 2026-01-22 00:32:08.860 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:32:08 compute-1 nova_compute[182713]: 2026-01-22 00:32:08.860 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:32:09 compute-1 nova_compute[182713]: 2026-01-22 00:32:09.314 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:32:09 compute-1 nova_compute[182713]: 2026-01-22 00:32:09.315 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:32:09 compute-1 nova_compute[182713]: 2026-01-22 00:32:09.315 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:32:09 compute-1 nova_compute[182713]: 2026-01-22 00:32:09.316 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:32:09 compute-1 nova_compute[182713]: 2026-01-22 00:32:09.740 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:10.148 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:32:10 compute-1 nova_compute[182713]: 2026-01-22 00:32:10.149 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:10.150 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:32:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:10.153 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:32:10 compute-1 podman[239327]: 2026-01-22 00:32:10.569943456 +0000 UTC m=+0.055325768 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 22 00:32:10 compute-1 podman[239328]: 2026-01-22 00:32:10.593709332 +0000 UTC m=+0.065089585 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:32:11 compute-1 nova_compute[182713]: 2026-01-22 00:32:11.461 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Updating instance_info_cache with network_info: [{"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:32:11 compute-1 nova_compute[182713]: 2026-01-22 00:32:11.485 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:32:11 compute-1 nova_compute[182713]: 2026-01-22 00:32:11.486 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:32:13 compute-1 nova_compute[182713]: 2026-01-22 00:32:13.580 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:14 compute-1 nova_compute[182713]: 2026-01-22 00:32:14.744 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:18 compute-1 nova_compute[182713]: 2026-01-22 00:32:18.584 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:19 compute-1 nova_compute[182713]: 2026-01-22 00:32:19.747 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.893 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'name': 'tempest-TestGettingAddress-server-1345718251', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a9', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '837db8748d074b3c9179b47d30e7a1d4', 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'hostId': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.895 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.928 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.read.latency volume: 223161236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.930 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.read.latency volume: 20985933 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c57507ef-14fe-4b94-b025-f75603768c00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 223161236, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-vda', 'timestamp': '2026-01-22T00:32:22.896207', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd18402ee-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': 'c7e1e63b7e61bfcbd0914e5f9ad625987417406172ec607dd1c224ecf710bf4e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20985933, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-sda', 'timestamp': '2026-01-22T00:32:22.896207', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd18422c4-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': '8165060c78ac15e4e75b5d29b282d7a8a7fa3136ce059bcd8254144a08ebf8bc'}]}, 'timestamp': '2026-01-22 00:32:22.930623', '_unique_id': '5d60610616894505bf18d35adb3e0b86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.935 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.937 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.950 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.951 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e0703c3-f9ef-447d-a4d8-6f181ebf2fa9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-vda', 'timestamp': '2026-01-22T00:32:22.937697', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd187544e-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.644943902, 'message_signature': '3dbe45054d16cf20a06794a1fe9a01edb42a1690cae74568a6a4dc6a924f8307'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-sda', 'timestamp': '2026-01-22T00:32:22.937697', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd18765ec-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.644943902, 'message_signature': '90c08c295e5512c28827e5633bd00df40b1b5a280d0a866e038c3fbb37e51dc4'}]}, 'timestamp': '2026-01-22 00:32:22.951806', '_unique_id': 'cc144e42cbe0402bb861abaf30eb6790'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.953 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.954 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.957 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42 / tap84e1946c-18 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.957 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/network.outgoing.packets volume: 32 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2397e6ce-c206-402e-8c0d-dd3aece81f0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 32, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000a9-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-tap84e1946c-18', 'timestamp': '2026-01-22T00:32:22.954196', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'tap84e1946c-18', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5e:14:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e1946c-18'}, 'message_id': 'd1885fce-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.661498572, 'message_signature': '272c4daf131ebec0512a9220795a409bf547e9842128bda157b923ea1c5913cc'}]}, 'timestamp': '2026-01-22 00:32:22.958317', '_unique_id': '0421bf8ca19e4ca190dca9cc01f6e98f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.959 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.960 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.960 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1345718251>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1345718251>]
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.960 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.960 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.960 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1345718251>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1345718251>]
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.960 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.961 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74ad2465-3c0d-4c0d-9948-58fee7ae9bc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000a9-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-tap84e1946c-18', 'timestamp': '2026-01-22T00:32:22.961071', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'tap84e1946c-18', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5e:14:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e1946c-18'}, 'message_id': 'd188db48-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.661498572, 'message_signature': '6fb378cbf8d243a59ed817b16a59a66ab3c0edbe6cc1d57762e8a4641dc63398'}]}, 'timestamp': '2026-01-22 00:32:22.961358', '_unique_id': 'b097d0ca6d2b4c8ebbebfc8af9b6d49c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.962 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.963 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.write.bytes volume: 72957952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.963 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7683764-caa5-4cee-bb29-3adc0be84f18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72957952, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-vda', 'timestamp': '2026-01-22T00:32:22.963005', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd18925e4-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': 'f6128ca8b5f00376000e71c97862998a61710aaa37cc3a3ee96e5879aaf201e1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-sda', 'timestamp': '2026-01-22T00:32:22.963005', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd1892fd0-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': '5f95d594bca2b658e4ad35da1eb7a5141f3b67d491af2c6c86f782cdf21e9e3d'}]}, 'timestamp': '2026-01-22 00:32:22.963505', '_unique_id': '9ebce871b41f47678fa6b92a55726ea4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.964 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.966 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.966 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.write.latency volume: 1833467622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.966 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d350dff-b569-4539-890e-775064b8d771', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1833467622, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-vda', 'timestamp': '2026-01-22T00:32:22.966092', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd1899e70-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': '1673f2d056c740388d7c6e8c3d1751ee2bd3c6a96c06f209404d594676ca22fc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-sda', 'timestamp': '2026-01-22T00:32:22.966092', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd189a690-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': '2bf2a844ed4de0ea5b3f559ea341d85768e6893d882069ff0496c70aea696b9d'}]}, 'timestamp': '2026-01-22 00:32:22.966551', '_unique_id': 'a678ca6bfa3e465695d5c262b69365c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.968 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.969 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.969 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.read.requests volume: 1091 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.970 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '469268ec-d8e4-4a10-9047-c5e21a8932ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1091, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-vda', 'timestamp': '2026-01-22T00:32:22.969919', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd18a3a92-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': '7879d6d9bae5d8eb916d3df26b4b977fa4bc3ca3e34022e954cfae6af4514385'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': 
None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-sda', 'timestamp': '2026-01-22T00:32:22.969919', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd18a496a-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': '0bf40e25e74763a35e7b5fe549e6285141cc15d6f8763bee0742485527b5c75b'}]}, 'timestamp': '2026-01-22 00:32:22.970831', '_unique_id': '75bfd33bbc624742a8d1c2032e55e002'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.971 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.972 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.973 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.973 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1345718251>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1345718251>]
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.973 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.990 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/cpu volume: 11690000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '470c87c7-9171-498c-a59c-d55fdae9b57a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11690000000, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'timestamp': '2026-01-22T00:32:22.973561', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd18d6a8c-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.697334057, 'message_signature': '4b2d21a4e03619dfe03e7d7c4490389357a6b4edb55d9510e13b77d89a89016e'}]}, 'timestamp': '2026-01-22 00:32:22.991378', '_unique_id': '8ac5d08e35944fba9a57cd3d8942927b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.992 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.993 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.993 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.993 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '353a4d22-b290-49ae-8fad-1abe5363e3e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-vda', 'timestamp': '2026-01-22T00:32:22.993577', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd18dd012-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.644943902, 'message_signature': '24bd16ea87ed6b4064b661e6d45efe508999f94ef299356c12512adcdaea30cb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-sda', 'timestamp': '2026-01-22T00:32:22.993577', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd18dd9c2-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.644943902, 'message_signature': '91c83f7154bb0d2c231304f427533df1d0814c70564e54c0009e00e8f1bd9ed5'}]}, 'timestamp': '2026-01-22 00:32:22.994066', '_unique_id': '6c9dc46312a440aaa68a475f189a36f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.994 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.995 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90e33e51-e0f0-4ae0-a5f3-f419b6b520a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000a9-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-tap84e1946c-18', 'timestamp': '2026-01-22T00:32:22.995268', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'tap84e1946c-18', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5e:14:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e1946c-18'}, 'message_id': 'd18e1220-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.661498572, 'message_signature': '174cbb05d0dc48f8c039b42be4c0363abeb3c747a680b0d3aebb0477b577fd3e'}]}, 'timestamp': '2026-01-22 00:32:22.995522', '_unique_id': 'bd71e360125f47059fa540b871268464'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.996 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd51955a7-37cb-49cb-9a6f-b5c883886028', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000a9-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-tap84e1946c-18', 'timestamp': '2026-01-22T00:32:22.996634', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'tap84e1946c-18', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5e:14:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e1946c-18'}, 'message_id': 'd18e46aa-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.661498572, 'message_signature': '71c8518bf555d527ced2dc971110305a4d3bb75cd0773e5e80f04cedac5101f4'}]}, 'timestamp': '2026-01-22 00:32:22.996890', '_unique_id': '21c22d6a6e00408888f0d7fef3259483'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.997 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e980e064-6c39-4ff6-a91b-232d1339bd0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-vda', 'timestamp': '2026-01-22T00:32:22.997984', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd18e7b5c-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.644943902, 'message_signature': '0642883ae7e9a1598ab235c8ee55ba3fd755c521d55da18786a10397cad75dc3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-sda', 'timestamp': '2026-01-22T00:32:22.997984', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd18e832c-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.644943902, 'message_signature': 'b54c64810d0852b6586f10b1f455c780ee9815cfa34cd1a889690c7aa0f9d65e'}]}, 'timestamp': '2026-01-22 00:32:22.998397', '_unique_id': '34049d5dacab4ddeb8e0cf8d2dff0c45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.998 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.999 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:22.999 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b9d94f4-f9f6-4a91-926f-0ec16ddc08e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-vda', 'timestamp': '2026-01-22T00:32:22.999511', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd18eb720-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': '132728405c266f7b282a9c47dc54806aab789fbe437ddf066c6b2196b4c074ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-sda', 'timestamp': '2026-01-22T00:32:22.999511', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd18ebefa-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': '26913ee0efa3a1cbaa3c86191ca16b5ec3bcbe7a7b5001b3cd5ab2191e14f80d'}]}, 'timestamp': '2026-01-22 00:32:22.999945', '_unique_id': 'e68194f6d97a4ceaa8c7a70cbd6c9e32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.000 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.read.bytes volume: 30304768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c7a52aa-67e3-45e2-a9ab-56aba0b30398', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30304768, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-vda', 'timestamp': '2026-01-22T00:32:23.001020', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd18ef208-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': '58d080d7c09824a667bfd08e620d9414494ae548b1f0946f8a3cd149fa14eb8f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-sda', 'timestamp': '2026-01-22T00:32:23.001020', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd18ef9ec-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.603716997, 'message_signature': 'aebb50fed377b06f0907c6b6adc62319a16f05bc9ad34c3d72edfa6e99a38287'}]}, 'timestamp': '2026-01-22 00:32:23.001436', '_unique_id': '1c19d900075d42f1a29d748fceade2b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.001 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.002 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/network.incoming.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc8fb445-2082-4eac-8b2b-6320d8ee72bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000a9-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-tap84e1946c-18', 'timestamp': '2026-01-22T00:32:23.002528', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'tap84e1946c-18', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5e:14:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e1946c-18'}, 'message_id': 'd18f2cfa-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.661498572, 'message_signature': 'cb2b18ed52d20c72ad2bb05f1bc1706b0b26dd9f614dd8e43ff90c8595d62565'}]}, 'timestamp': '2026-01-22 00:32:23.002787', '_unique_id': '8849ddb944df4a618239497a08c11c9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.003 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/memory.usage volume: 42.8046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a50ee10c-205f-4dcb-be35-42c5e274a760', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.8046875, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'timestamp': '2026-01-22T00:32:23.003894', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'instance-000000a9', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd18f6274-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.697334057, 'message_signature': 'ec641ae80a8705ad915fabee6c7d0df8123d53815d4388873e7df2e21e894f5f'}]}, 'timestamp': '2026-01-22 00:32:23.004120', '_unique_id': '2894c1135855484cba9990305fc4b5ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.004 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0c9b37e-35ec-4ba2-9615-133b2260436b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000a9-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-tap84e1946c-18', 'timestamp': '2026-01-22T00:32:23.005183', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'tap84e1946c-18', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5e:14:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e1946c-18'}, 'message_id': 'd18f9492-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.661498572, 'message_signature': '2f215e8a6e51757d5190fbd87006debcf7b8ee298ccd4792e303f66a338fac56'}]}, 'timestamp': '2026-01-22 00:32:23.005410', '_unique_id': '1f87158a643b48c9b9c2654b8191a9f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.005 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.006 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.006 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.006 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1345718251>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1345718251>]
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.006 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.006 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/network.outgoing.bytes volume: 3770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ff58e40-9068-4628-a3ba-71c33dac3862', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3770, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000a9-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-tap84e1946c-18', 'timestamp': '2026-01-22T00:32:23.006803', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'tap84e1946c-18', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5e:14:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e1946c-18'}, 'message_id': 'd18fd466-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.661498572, 'message_signature': '9af56e17a49894fa99ff9f5f5ac969848ee360a5199177ba6446ecbce226f2af'}]}, 'timestamp': '2026-01-22 00:32:23.007045', '_unique_id': '014f42d370c94bc390d498d8c6ae8b01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.007 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/network.incoming.bytes volume: 5037 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b56f31c-a41e-43c0-a549-5401d0c3241e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5037, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000a9-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-tap84e1946c-18', 'timestamp': '2026-01-22T00:32:23.008104', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'tap84e1946c-18', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5e:14:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e1946c-18'}, 'message_id': 'd19006a2-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.661498572, 'message_signature': '8faef51d6d4496970dcbd88bce6c0207c0574d9dcaf03ca72be610627908c79e'}]}, 'timestamp': '2026-01-22 00:32:23.008329', '_unique_id': '0ea83a0bf6354c6fbad2fb4a8cd65c9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.008 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.009 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3caa3706-f4c1-456e-90be-b5ba6d9002c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000a9-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-tap84e1946c-18', 'timestamp': '2026-01-22T00:32:23.009377', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'tap84e1946c-18', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5e:14:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e1946c-18'}, 'message_id': 'd19038e8-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.661498572, 'message_signature': '6af08cab3bd7bf95ab3b02983d989fb37386f267e29c4309cde36d6b302f5e79'}]}, 'timestamp': '2026-01-22 00:32:23.009618', '_unique_id': 'addc2e3d437c4efbbdded77cc6aeecc6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.010 12 DEBUG ceilometer.compute.pollsters [-] 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7185c5c6-5514-4e8c-8fe2-369ff5df7813', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000a9-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-tap84e1946c-18', 'timestamp': '2026-01-22T00:32:23.010662', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1345718251', 'name': 'tap84e1946c-18', 'instance_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5e:14:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e1946c-18'}, 'message_id': 'd1906b4c-f729-11f0-a0a4-fa163e934844', 'monotonic_time': 6515.661498572, 'message_signature': 'b7e93ed489398baad9ae9f7b612f5445c2bb93b9e367ef34795b48139e916261'}]}, 'timestamp': '2026-01-22 00:32:23.010937', '_unique_id': 'ed2dfd37cc8f41889d7c3d8e9beb57ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:32:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:32:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:32:23 compute-1 nova_compute[182713]: 2026-01-22 00:32:23.587 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:24 compute-1 podman[239369]: 2026-01-22 00:32:24.580357819 +0000 UTC m=+0.070301338 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:32:24 compute-1 nova_compute[182713]: 2026-01-22 00:32:24.750 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:26 compute-1 podman[239389]: 2026-01-22 00:32:26.574828218 +0000 UTC m=+0.064290820 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9)
Jan 22 00:32:28 compute-1 nova_compute[182713]: 2026-01-22 00:32:28.590 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:29 compute-1 nova_compute[182713]: 2026-01-22 00:32:29.753 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:33 compute-1 nova_compute[182713]: 2026-01-22 00:32:33.594 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:34 compute-1 nova_compute[182713]: 2026-01-22 00:32:34.756 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:35 compute-1 podman[239412]: 2026-01-22 00:32:35.568271066 +0000 UTC m=+0.058283303 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:32:35 compute-1 podman[239411]: 2026-01-22 00:32:35.598410172 +0000 UTC m=+0.090666879 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:32:38 compute-1 nova_compute[182713]: 2026-01-22 00:32:38.598 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:39 compute-1 nova_compute[182713]: 2026-01-22 00:32:39.758 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:40 compute-1 ovn_controller[94841]: 2026-01-22T00:32:40Z|00672|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 22 00:32:41 compute-1 podman[239461]: 2026-01-22 00:32:41.584835624 +0000 UTC m=+0.069293219 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:32:41 compute-1 podman[239460]: 2026-01-22 00:32:41.592798764 +0000 UTC m=+0.075495132 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:32:43 compute-1 nova_compute[182713]: 2026-01-22 00:32:43.601 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:44 compute-1 nova_compute[182713]: 2026-01-22 00:32:44.761 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:47 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:47.233 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:32:47 compute-1 nova_compute[182713]: 2026-01-22 00:32:47.233 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:47 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:47.234 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:32:47 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:47.235 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:32:48 compute-1 nova_compute[182713]: 2026-01-22 00:32:48.606 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.484 182717 DEBUG oslo_concurrency.lockutils [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.485 182717 DEBUG oslo_concurrency.lockutils [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.485 182717 DEBUG oslo_concurrency.lockutils [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.486 182717 DEBUG oslo_concurrency.lockutils [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.486 182717 DEBUG oslo_concurrency.lockutils [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.506 182717 INFO nova.compute.manager [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Terminating instance
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.534 182717 DEBUG nova.compute.manager [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:32:49 compute-1 kernel: tap84e1946c-18 (unregistering): left promiscuous mode
Jan 22 00:32:49 compute-1 NetworkManager[54952]: <info>  [1769041969.5704] device (tap84e1946c-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.579 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:49 compute-1 ovn_controller[94841]: 2026-01-22T00:32:49Z|00673|binding|INFO|Releasing lport 84e1946c-1832-49b5-9a84-edfb4022e8bc from this chassis (sb_readonly=0)
Jan 22 00:32:49 compute-1 ovn_controller[94841]: 2026-01-22T00:32:49Z|00674|binding|INFO|Setting lport 84e1946c-1832-49b5-9a84-edfb4022e8bc down in Southbound
Jan 22 00:32:49 compute-1 ovn_controller[94841]: 2026-01-22T00:32:49Z|00675|binding|INFO|Removing iface tap84e1946c-18 ovn-installed in OVS
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.582 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:49.592 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:14:50 10.100.0.10 2001:db8::f816:3eff:fe5e:1450'], port_security=['fa:16:3e:5e:14:50 10.100.0.10 2001:db8::f816:3eff:fe5e:1450'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28 2001:db8::f816:3eff:fe5e:1450/64', 'neutron:device_id': '83be5683-cac9-4e5b-b8a1-9c37a5bc2b42', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9d931792-0187-42bd-ad30-da2120e7bd41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb52301b-689d-4e28-a6fb-c23352694dd4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=84e1946c-1832-49b5-9a84-edfb4022e8bc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:32:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:49.595 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 84e1946c-1832-49b5-9a84-edfb4022e8bc in datapath 895033ac-5f91-4350-ad1a-b5c5d0ff13a2 unbound from our chassis
Jan 22 00:32:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:49.598 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 895033ac-5f91-4350-ad1a-b5c5d0ff13a2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.601 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:49.601 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3fcff736-0e26-4c52-85d5-b8e3d75c3518]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:32:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:49.603 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2 namespace which is not needed anymore
Jan 22 00:32:49 compute-1 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d000000a9.scope: Deactivated successfully.
Jan 22 00:32:49 compute-1 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d000000a9.scope: Consumed 15.505s CPU time.
Jan 22 00:32:49 compute-1 systemd-machined[153970]: Machine qemu-73-instance-000000a9 terminated.
Jan 22 00:32:49 compute-1 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239109]: [NOTICE]   (239113) : haproxy version is 2.8.14-c23fe91
Jan 22 00:32:49 compute-1 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239109]: [NOTICE]   (239113) : path to executable is /usr/sbin/haproxy
Jan 22 00:32:49 compute-1 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239109]: [WARNING]  (239113) : Exiting Master process...
Jan 22 00:32:49 compute-1 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239109]: [ALERT]    (239113) : Current worker (239115) exited with code 143 (Terminated)
Jan 22 00:32:49 compute-1 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239109]: [WARNING]  (239113) : All workers exited. Exiting... (0)
Jan 22 00:32:49 compute-1 systemd[1]: libpod-b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d.scope: Deactivated successfully.
Jan 22 00:32:49 compute-1 conmon[239109]: conmon b5efe806540be88a35ad <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d.scope/container/memory.events
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.764 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:49 compute-1 podman[239529]: 2026-01-22 00:32:49.766765329 +0000 UTC m=+0.059017695 container died b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:32:49 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d-userdata-shm.mount: Deactivated successfully.
Jan 22 00:32:49 compute-1 systemd[1]: var-lib-containers-storage-overlay-7b2f5eb6ebdbf50d7e4f516f522476fe4e69e764b5ac59678474689afe2862d4-merged.mount: Deactivated successfully.
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.822 182717 INFO nova.virt.libvirt.driver [-] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Instance destroyed successfully.
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.823 182717 DEBUG nova.objects.instance [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:32:49 compute-1 podman[239529]: 2026-01-22 00:32:49.827965731 +0000 UTC m=+0.120218067 container cleanup b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:32:49 compute-1 systemd[1]: libpod-conmon-b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d.scope: Deactivated successfully.
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.845 182717 DEBUG nova.virt.libvirt.vif [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:31:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1345718251',display_name='tempest-TestGettingAddress-server-1345718251',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1345718251',id=169,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEN8d2ViSKyfMT2XqudbWbmZhugtSo0AUa3hssPfIHXZXJuMLED9XwzZlkaV7imX6BxsiK4pWMoh9iMrlu0xzxZRk5QI4OmaesLZRq01J/YYbtUdz/2t7KMOohgfE7jvUQ==',key_name='tempest-TestGettingAddress-1273457996',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:31:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-ogq5xbwe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:31:33Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=83be5683-cac9-4e5b-b8a1-9c37a5bc2b42,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.845 182717 DEBUG nova.network.os_vif_util [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.847 182717 DEBUG nova.network.os_vif_util [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5e:14:50,bridge_name='br-int',has_traffic_filtering=True,id=84e1946c-1832-49b5-9a84-edfb4022e8bc,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e1946c-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.847 182717 DEBUG os_vif [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:14:50,bridge_name='br-int',has_traffic_filtering=True,id=84e1946c-1832-49b5-9a84-edfb4022e8bc,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e1946c-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.850 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.850 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84e1946c-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.853 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.856 182717 DEBUG nova.compute.manager [req-36929e15-beb7-489a-854c-2d47c9293bba req-610bbd50-e720-467e-b7e0-a4d7f706b6a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Received event network-vif-unplugged-84e1946c-1832-49b5-9a84-edfb4022e8bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.857 182717 DEBUG oslo_concurrency.lockutils [req-36929e15-beb7-489a-854c-2d47c9293bba req-610bbd50-e720-467e-b7e0-a4d7f706b6a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.857 182717 DEBUG oslo_concurrency.lockutils [req-36929e15-beb7-489a-854c-2d47c9293bba req-610bbd50-e720-467e-b7e0-a4d7f706b6a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.857 182717 DEBUG oslo_concurrency.lockutils [req-36929e15-beb7-489a-854c-2d47c9293bba req-610bbd50-e720-467e-b7e0-a4d7f706b6a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.857 182717 DEBUG nova.compute.manager [req-36929e15-beb7-489a-854c-2d47c9293bba req-610bbd50-e720-467e-b7e0-a4d7f706b6a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] No waiting events found dispatching network-vif-unplugged-84e1946c-1832-49b5-9a84-edfb4022e8bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.858 182717 DEBUG nova.compute.manager [req-36929e15-beb7-489a-854c-2d47c9293bba req-610bbd50-e720-467e-b7e0-a4d7f706b6a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Received event network-vif-unplugged-84e1946c-1832-49b5-9a84-edfb4022e8bc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.858 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.863 182717 INFO os_vif [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:14:50,bridge_name='br-int',has_traffic_filtering=True,id=84e1946c-1832-49b5-9a84-edfb4022e8bc,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e1946c-18')
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.865 182717 INFO nova.virt.libvirt.driver [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Deleting instance files /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42_del
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.866 182717 INFO nova.virt.libvirt.driver [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Deletion of /var/lib/nova/instances/83be5683-cac9-4e5b-b8a1-9c37a5bc2b42_del complete
Jan 22 00:32:49 compute-1 podman[239576]: 2026-01-22 00:32:49.914291773 +0000 UTC m=+0.060669147 container remove b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:32:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:49.921 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[80e38dd7-d32e-40b3-a1e5-29351b1372ef]: (4, ('Thu Jan 22 12:32:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2 (b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d)\nb5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d\nThu Jan 22 12:32:49 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2 (b5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d)\nb5efe806540be88a35ad3b35409a0ae3c3f99b50058b214dbe362f925c3f9f7d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:32:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:49.924 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7fcd92c1-b086-40e3-b496-9944bf2d4c6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:32:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:49.925 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap895033ac-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.927 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:49 compute-1 kernel: tap895033ac-50: left promiscuous mode
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.950 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:49.956 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1b96a2ef-eb0f-440a-9655-65fa011fd316]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:32:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:49.973 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9fffdffb-cc81-4214-b36d-a853891ee549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:32:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:49.974 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d9298cbc-414c-40d3-9279-2c2214caab3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.977 182717 INFO nova.compute.manager [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Took 0.44 seconds to destroy the instance on the hypervisor.
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.978 182717 DEBUG oslo.service.loopingcall [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.978 182717 DEBUG nova.compute.manager [-] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:32:49 compute-1 nova_compute[182713]: 2026-01-22 00:32:49.979 182717 DEBUG nova.network.neutron [-] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:32:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:50.000 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7e955d-2004-47ca-be3c-82fdb0b7ee68]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646238, 'reachable_time': 36882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239592, 'error': None, 'target': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:32:50 compute-1 systemd[1]: run-netns-ovnmeta\x2d895033ac\x2d5f91\x2d4350\x2dad1a\x2db5c5d0ff13a2.mount: Deactivated successfully.
Jan 22 00:32:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:50.003 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:32:50 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:32:50.003 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[e452edca-64dd-4873-a875-da6ad5bedd2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:32:50 compute-1 nova_compute[182713]: 2026-01-22 00:32:50.238 182717 DEBUG nova.compute.manager [req-18c19045-b27e-484f-9dae-0dd20e7ff942 req-0121e526-d18a-45e8-8527-e724bc08eca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Received event network-changed-84e1946c-1832-49b5-9a84-edfb4022e8bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:32:50 compute-1 nova_compute[182713]: 2026-01-22 00:32:50.239 182717 DEBUG nova.compute.manager [req-18c19045-b27e-484f-9dae-0dd20e7ff942 req-0121e526-d18a-45e8-8527-e724bc08eca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Refreshing instance network info cache due to event network-changed-84e1946c-1832-49b5-9a84-edfb4022e8bc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:32:50 compute-1 nova_compute[182713]: 2026-01-22 00:32:50.239 182717 DEBUG oslo_concurrency.lockutils [req-18c19045-b27e-484f-9dae-0dd20e7ff942 req-0121e526-d18a-45e8-8527-e724bc08eca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:32:50 compute-1 nova_compute[182713]: 2026-01-22 00:32:50.240 182717 DEBUG oslo_concurrency.lockutils [req-18c19045-b27e-484f-9dae-0dd20e7ff942 req-0121e526-d18a-45e8-8527-e724bc08eca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:32:50 compute-1 nova_compute[182713]: 2026-01-22 00:32:50.240 182717 DEBUG nova.network.neutron [req-18c19045-b27e-484f-9dae-0dd20e7ff942 req-0121e526-d18a-45e8-8527-e724bc08eca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Refreshing network info cache for port 84e1946c-1832-49b5-9a84-edfb4022e8bc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:32:50 compute-1 sshd-session[239588]: Invalid user solana from 92.118.39.95 port 48212
Jan 22 00:32:50 compute-1 sshd-session[239588]: Connection closed by invalid user solana 92.118.39.95 port 48212 [preauth]
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.069 182717 DEBUG nova.network.neutron [-] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.091 182717 INFO nova.compute.manager [-] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Took 1.11 seconds to deallocate network for instance.
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.165 182717 DEBUG oslo_concurrency.lockutils [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.166 182717 DEBUG oslo_concurrency.lockutils [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.227 182717 DEBUG nova.compute.provider_tree [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.250 182717 DEBUG nova.scheduler.client.report [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.280 182717 DEBUG oslo_concurrency.lockutils [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.318 182717 INFO nova.scheduler.client.report [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.393 182717 DEBUG oslo_concurrency.lockutils [None req-89e86b78-cb98-45b8-9211-b4467a431a37 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.908s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.966 182717 DEBUG nova.compute.manager [req-2d660bf9-c5e2-43b6-a173-c55e98fe6d2a req-0f091a69-0ce5-4713-ae4d-c28918fb1855 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Received event network-vif-plugged-84e1946c-1832-49b5-9a84-edfb4022e8bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.967 182717 DEBUG oslo_concurrency.lockutils [req-2d660bf9-c5e2-43b6-a173-c55e98fe6d2a req-0f091a69-0ce5-4713-ae4d-c28918fb1855 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.967 182717 DEBUG oslo_concurrency.lockutils [req-2d660bf9-c5e2-43b6-a173-c55e98fe6d2a req-0f091a69-0ce5-4713-ae4d-c28918fb1855 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.968 182717 DEBUG oslo_concurrency.lockutils [req-2d660bf9-c5e2-43b6-a173-c55e98fe6d2a req-0f091a69-0ce5-4713-ae4d-c28918fb1855 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83be5683-cac9-4e5b-b8a1-9c37a5bc2b42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.968 182717 DEBUG nova.compute.manager [req-2d660bf9-c5e2-43b6-a173-c55e98fe6d2a req-0f091a69-0ce5-4713-ae4d-c28918fb1855 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] No waiting events found dispatching network-vif-plugged-84e1946c-1832-49b5-9a84-edfb4022e8bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.969 182717 WARNING nova.compute.manager [req-2d660bf9-c5e2-43b6-a173-c55e98fe6d2a req-0f091a69-0ce5-4713-ae4d-c28918fb1855 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Received unexpected event network-vif-plugged-84e1946c-1832-49b5-9a84-edfb4022e8bc for instance with vm_state deleted and task_state None.
Jan 22 00:32:51 compute-1 nova_compute[182713]: 2026-01-22 00:32:51.969 182717 DEBUG nova.compute.manager [req-2d660bf9-c5e2-43b6-a173-c55e98fe6d2a req-0f091a69-0ce5-4713-ae4d-c28918fb1855 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Received event network-vif-deleted-84e1946c-1832-49b5-9a84-edfb4022e8bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:32:52 compute-1 nova_compute[182713]: 2026-01-22 00:32:52.252 182717 DEBUG nova.network.neutron [req-18c19045-b27e-484f-9dae-0dd20e7ff942 req-0121e526-d18a-45e8-8527-e724bc08eca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Updated VIF entry in instance network info cache for port 84e1946c-1832-49b5-9a84-edfb4022e8bc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:32:52 compute-1 nova_compute[182713]: 2026-01-22 00:32:52.253 182717 DEBUG nova.network.neutron [req-18c19045-b27e-484f-9dae-0dd20e7ff942 req-0121e526-d18a-45e8-8527-e724bc08eca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Updating instance_info_cache with network_info: [{"id": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "address": "fa:16:3e:5e:14:50", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5e:1450", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e1946c-18", "ovs_interfaceid": "84e1946c-1832-49b5-9a84-edfb4022e8bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:32:52 compute-1 nova_compute[182713]: 2026-01-22 00:32:52.275 182717 DEBUG oslo_concurrency.lockutils [req-18c19045-b27e-484f-9dae-0dd20e7ff942 req-0121e526-d18a-45e8-8527-e724bc08eca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-83be5683-cac9-4e5b-b8a1-9c37a5bc2b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:32:53 compute-1 nova_compute[182713]: 2026-01-22 00:32:53.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:53 compute-1 nova_compute[182713]: 2026-01-22 00:32:53.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:32:53 compute-1 nova_compute[182713]: 2026-01-22 00:32:53.907 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:32:54 compute-1 nova_compute[182713]: 2026-01-22 00:32:54.777 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:54 compute-1 nova_compute[182713]: 2026-01-22 00:32:54.853 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:55 compute-1 podman[239593]: 2026-01-22 00:32:55.572910197 +0000 UTC m=+0.060669147 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:32:55 compute-1 nova_compute[182713]: 2026-01-22 00:32:55.682 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:55 compute-1 nova_compute[182713]: 2026-01-22 00:32:55.780 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:57 compute-1 podman[239615]: 2026-01-22 00:32:57.60825569 +0000 UTC m=+0.093560210 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal)
Jan 22 00:32:58 compute-1 nova_compute[182713]: 2026-01-22 00:32:58.907 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:58 compute-1 nova_compute[182713]: 2026-01-22 00:32:58.907 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:59 compute-1 nova_compute[182713]: 2026-01-22 00:32:59.779 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:59 compute-1 nova_compute[182713]: 2026-01-22 00:32:59.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:59 compute-1 nova_compute[182713]: 2026-01-22 00:32:59.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:32:59 compute-1 nova_compute[182713]: 2026-01-22 00:32:59.856 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:00 compute-1 nova_compute[182713]: 2026-01-22 00:33:00.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:02 compute-1 nova_compute[182713]: 2026-01-22 00:33:02.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:33:03.044 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:33:03.045 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:33:03.045 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:04 compute-1 nova_compute[182713]: 2026-01-22 00:33:04.782 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:04 compute-1 nova_compute[182713]: 2026-01-22 00:33:04.821 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041969.8197815, 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:33:04 compute-1 nova_compute[182713]: 2026-01-22 00:33:04.821 182717 INFO nova.compute.manager [-] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] VM Stopped (Lifecycle Event)
Jan 22 00:33:04 compute-1 nova_compute[182713]: 2026-01-22 00:33:04.843 182717 DEBUG nova.compute.manager [None req-4cda5ea0-5c0c-496b-801e-0b87355dcb93 - - - - - -] [instance: 83be5683-cac9-4e5b-b8a1-9c37a5bc2b42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:33:04 compute-1 nova_compute[182713]: 2026-01-22 00:33:04.857 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:04 compute-1 nova_compute[182713]: 2026-01-22 00:33:04.870 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:06 compute-1 podman[239637]: 2026-01-22 00:33:06.574337366 +0000 UTC m=+0.060291855 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:33:06 compute-1 podman[239636]: 2026-01-22 00:33:06.663031942 +0000 UTC m=+0.148586888 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:33:06 compute-1 nova_compute[182713]: 2026-01-22 00:33:06.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:06 compute-1 nova_compute[182713]: 2026-01-22 00:33:06.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:06 compute-1 nova_compute[182713]: 2026-01-22 00:33:06.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:06 compute-1 nova_compute[182713]: 2026-01-22 00:33:06.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:06 compute-1 nova_compute[182713]: 2026-01-22 00:33:06.894 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:06 compute-1 nova_compute[182713]: 2026-01-22 00:33:06.894 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:06 compute-1 nova_compute[182713]: 2026-01-22 00:33:06.894 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.070 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.071 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5707MB free_disk=73.19268417358398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.071 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.071 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.143 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.144 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.165 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.187 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.213 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.214 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:07 compute-1 nova_compute[182713]: 2026-01-22 00:33:07.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:33:09 compute-1 nova_compute[182713]: 2026-01-22 00:33:09.784 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:09 compute-1 nova_compute[182713]: 2026-01-22 00:33:09.860 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:10 compute-1 nova_compute[182713]: 2026-01-22 00:33:10.874 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:10 compute-1 nova_compute[182713]: 2026-01-22 00:33:10.875 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:33:10 compute-1 nova_compute[182713]: 2026-01-22 00:33:10.875 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:33:11 compute-1 nova_compute[182713]: 2026-01-22 00:33:11.030 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:33:12 compute-1 podman[239687]: 2026-01-22 00:33:12.566507778 +0000 UTC m=+0.062777014 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:33:12 compute-1 podman[239688]: 2026-01-22 00:33:12.57263307 +0000 UTC m=+0.053853423 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:33:14 compute-1 nova_compute[182713]: 2026-01-22 00:33:14.815 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:14 compute-1 nova_compute[182713]: 2026-01-22 00:33:14.861 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:19 compute-1 nova_compute[182713]: 2026-01-22 00:33:19.817 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:19 compute-1 nova_compute[182713]: 2026-01-22 00:33:19.863 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:24 compute-1 nova_compute[182713]: 2026-01-22 00:33:24.819 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:24 compute-1 nova_compute[182713]: 2026-01-22 00:33:24.865 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:26 compute-1 podman[239730]: 2026-01-22 00:33:26.571237853 +0000 UTC m=+0.062500444 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:33:28 compute-1 podman[239751]: 2026-01-22 00:33:28.57999154 +0000 UTC m=+0.068333448 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 22 00:33:29 compute-1 nova_compute[182713]: 2026-01-22 00:33:29.859 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:29 compute-1 nova_compute[182713]: 2026-01-22 00:33:29.867 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:34 compute-1 nova_compute[182713]: 2026-01-22 00:33:34.861 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:34 compute-1 nova_compute[182713]: 2026-01-22 00:33:34.869 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:37 compute-1 podman[239774]: 2026-01-22 00:33:37.568352946 +0000 UTC m=+0.058653923 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:33:37 compute-1 podman[239773]: 2026-01-22 00:33:37.645916983 +0000 UTC m=+0.134094333 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 22 00:33:39 compute-1 nova_compute[182713]: 2026-01-22 00:33:39.863 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:39 compute-1 nova_compute[182713]: 2026-01-22 00:33:39.869 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:42 compute-1 ovn_controller[94841]: 2026-01-22T00:33:42Z|00676|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 22 00:33:43 compute-1 podman[239826]: 2026-01-22 00:33:43.571049098 +0000 UTC m=+0.059753837 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:33:43 compute-1 podman[239827]: 2026-01-22 00:33:43.571006667 +0000 UTC m=+0.055091801 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:33:44 compute-1 nova_compute[182713]: 2026-01-22 00:33:44.865 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:44 compute-1 nova_compute[182713]: 2026-01-22 00:33:44.870 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:49 compute-1 nova_compute[182713]: 2026-01-22 00:33:49.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:49 compute-1 nova_compute[182713]: 2026-01-22 00:33:49.871 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:54 compute-1 nova_compute[182713]: 2026-01-22 00:33:54.871 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:57 compute-1 podman[239866]: 2026-01-22 00:33:57.596323221 +0000 UTC m=+0.076782162 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 22 00:33:59 compute-1 podman[239886]: 2026-01-22 00:33:59.604551453 +0000 UTC m=+0.091907138 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:33:59 compute-1 nova_compute[182713]: 2026-01-22 00:33:59.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:59 compute-1 nova_compute[182713]: 2026-01-22 00:33:59.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:59 compute-1 nova_compute[182713]: 2026-01-22 00:33:59.873 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:00 compute-1 nova_compute[182713]: 2026-01-22 00:34:00.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:00 compute-1 nova_compute[182713]: 2026-01-22 00:34:00.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:00 compute-1 nova_compute[182713]: 2026-01-22 00:34:00.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:34:01 compute-1 nova_compute[182713]: 2026-01-22 00:34:01.681 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:01 compute-1 nova_compute[182713]: 2026-01-22 00:34:01.682 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:01 compute-1 nova_compute[182713]: 2026-01-22 00:34:01.701 182717 DEBUG nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:34:01 compute-1 nova_compute[182713]: 2026-01-22 00:34:01.831 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:01 compute-1 nova_compute[182713]: 2026-01-22 00:34:01.832 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:01 compute-1 nova_compute[182713]: 2026-01-22 00:34:01.841 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:34:01 compute-1 nova_compute[182713]: 2026-01-22 00:34:01.842 182717 INFO nova.compute.claims [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.040 182717 DEBUG nova.compute.provider_tree [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.059 182717 DEBUG nova.scheduler.client.report [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.088 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.089 182717 DEBUG nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.152 182717 DEBUG nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.153 182717 DEBUG nova.network.neutron [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.184 182717 INFO nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.225 182717 DEBUG nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.364 182717 DEBUG nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.367 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.367 182717 INFO nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Creating image(s)
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.368 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.369 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.370 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.399 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.468 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.469 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.470 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.481 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.541 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.542 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.580 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.581 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.581 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.662 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.663 182717 DEBUG nova.virt.disk.api [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.664 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.732 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.733 182717 DEBUG nova.virt.disk.api [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.733 182717 DEBUG nova.objects.instance [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 02484dd3-2f3e-41bb-8ceb-d71936912a38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.784 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.784 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Ensure instance console log exists: /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.785 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.785 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.785 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:02 compute-1 nova_compute[182713]: 2026-01-22 00:34:02.965 182717 DEBUG nova.policy [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:34:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:03.045 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:03.046 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:03.046 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:03 compute-1 nova_compute[182713]: 2026-01-22 00:34:03.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:04 compute-1 nova_compute[182713]: 2026-01-22 00:34:04.090 182717 DEBUG nova.network.neutron [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Successfully created port: 0eb5006d-d9d0-403a-bc92-5ad7bd032ade _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:34:04 compute-1 nova_compute[182713]: 2026-01-22 00:34:04.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:04 compute-1 nova_compute[182713]: 2026-01-22 00:34:04.873 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:04 compute-1 nova_compute[182713]: 2026-01-22 00:34:04.875 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:05 compute-1 nova_compute[182713]: 2026-01-22 00:34:05.052 182717 DEBUG nova.network.neutron [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Successfully created port: fd875d41-c569-4c76-ae51-bbf4251e41bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:34:06 compute-1 nova_compute[182713]: 2026-01-22 00:34:06.511 182717 DEBUG nova.network.neutron [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Successfully updated port: 0eb5006d-d9d0-403a-bc92-5ad7bd032ade _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:34:06 compute-1 nova_compute[182713]: 2026-01-22 00:34:06.639 182717 DEBUG nova.compute.manager [req-623c4d2d-f784-4a4b-b4b8-fc020efc423d req-a3e67fd8-5aad-4832-bbdb-fab2a839ee20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-changed-0eb5006d-d9d0-403a-bc92-5ad7bd032ade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:06 compute-1 nova_compute[182713]: 2026-01-22 00:34:06.639 182717 DEBUG nova.compute.manager [req-623c4d2d-f784-4a4b-b4b8-fc020efc423d req-a3e67fd8-5aad-4832-bbdb-fab2a839ee20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Refreshing instance network info cache due to event network-changed-0eb5006d-d9d0-403a-bc92-5ad7bd032ade. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:34:06 compute-1 nova_compute[182713]: 2026-01-22 00:34:06.640 182717 DEBUG oslo_concurrency.lockutils [req-623c4d2d-f784-4a4b-b4b8-fc020efc423d req-a3e67fd8-5aad-4832-bbdb-fab2a839ee20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:34:06 compute-1 nova_compute[182713]: 2026-01-22 00:34:06.640 182717 DEBUG oslo_concurrency.lockutils [req-623c4d2d-f784-4a4b-b4b8-fc020efc423d req-a3e67fd8-5aad-4832-bbdb-fab2a839ee20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:34:06 compute-1 nova_compute[182713]: 2026-01-22 00:34:06.640 182717 DEBUG nova.network.neutron [req-623c4d2d-f784-4a4b-b4b8-fc020efc423d req-a3e67fd8-5aad-4832-bbdb-fab2a839ee20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Refreshing network info cache for port 0eb5006d-d9d0-403a-bc92-5ad7bd032ade _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:34:06 compute-1 nova_compute[182713]: 2026-01-22 00:34:06.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:06 compute-1 nova_compute[182713]: 2026-01-22 00:34:06.943 182717 DEBUG nova.network.neutron [req-623c4d2d-f784-4a4b-b4b8-fc020efc423d req-a3e67fd8-5aad-4832-bbdb-fab2a839ee20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:34:07 compute-1 nova_compute[182713]: 2026-01-22 00:34:07.467 182717 DEBUG nova.network.neutron [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Successfully updated port: fd875d41-c569-4c76-ae51-bbf4251e41bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:34:07 compute-1 nova_compute[182713]: 2026-01-22 00:34:07.484 182717 DEBUG nova.network.neutron [req-623c4d2d-f784-4a4b-b4b8-fc020efc423d req-a3e67fd8-5aad-4832-bbdb-fab2a839ee20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:34:07 compute-1 nova_compute[182713]: 2026-01-22 00:34:07.527 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:34:07 compute-1 nova_compute[182713]: 2026-01-22 00:34:07.528 182717 DEBUG oslo_concurrency.lockutils [req-623c4d2d-f784-4a4b-b4b8-fc020efc423d req-a3e67fd8-5aad-4832-bbdb-fab2a839ee20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:34:07 compute-1 nova_compute[182713]: 2026-01-22 00:34:07.529 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:34:07 compute-1 nova_compute[182713]: 2026-01-22 00:34:07.529 182717 DEBUG nova.network.neutron [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:34:08 compute-1 nova_compute[182713]: 2026-01-22 00:34:08.386 182717 DEBUG nova.network.neutron [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:34:08 compute-1 podman[239924]: 2026-01-22 00:34:08.586813209 +0000 UTC m=+0.065734286 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:34:08 compute-1 podman[239923]: 2026-01-22 00:34:08.604664789 +0000 UTC m=+0.092735264 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:34:08 compute-1 nova_compute[182713]: 2026-01-22 00:34:08.727 182717 DEBUG nova.compute.manager [req-f05ff1ae-5190-46b8-bd65-e185c4ae9a21 req-f95d2cfe-f9ba-4c0b-8592-e1be2fdddec8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-changed-fd875d41-c569-4c76-ae51-bbf4251e41bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:08 compute-1 nova_compute[182713]: 2026-01-22 00:34:08.727 182717 DEBUG nova.compute.manager [req-f05ff1ae-5190-46b8-bd65-e185c4ae9a21 req-f95d2cfe-f9ba-4c0b-8592-e1be2fdddec8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Refreshing instance network info cache due to event network-changed-fd875d41-c569-4c76-ae51-bbf4251e41bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:34:08 compute-1 nova_compute[182713]: 2026-01-22 00:34:08.727 182717 DEBUG oslo_concurrency.lockutils [req-f05ff1ae-5190-46b8-bd65-e185c4ae9a21 req-f95d2cfe-f9ba-4c0b-8592-e1be2fdddec8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:34:08 compute-1 nova_compute[182713]: 2026-01-22 00:34:08.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:08 compute-1 nova_compute[182713]: 2026-01-22 00:34:08.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:08 compute-1 nova_compute[182713]: 2026-01-22 00:34:08.880 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:08 compute-1 nova_compute[182713]: 2026-01-22 00:34:08.881 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:08 compute-1 nova_compute[182713]: 2026-01-22 00:34:08.881 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:08 compute-1 nova_compute[182713]: 2026-01-22 00:34:08.882 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.042 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.043 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5708MB free_disk=73.19244384765625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.043 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.043 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.112 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 02484dd3-2f3e-41bb-8ceb-d71936912a38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.112 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.112 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.261 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.275 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.295 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.295 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.875 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:09 compute-1 nova_compute[182713]: 2026-01-22 00:34:09.877 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.555 182717 DEBUG nova.network.neutron [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Updating instance_info_cache with network_info: [{"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.580 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.580 182717 DEBUG nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Instance network_info: |[{"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.581 182717 DEBUG oslo_concurrency.lockutils [req-f05ff1ae-5190-46b8-bd65-e185c4ae9a21 req-f95d2cfe-f9ba-4c0b-8592-e1be2fdddec8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.582 182717 DEBUG nova.network.neutron [req-f05ff1ae-5190-46b8-bd65-e185c4ae9a21 req-f95d2cfe-f9ba-4c0b-8592-e1be2fdddec8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Refreshing network info cache for port fd875d41-c569-4c76-ae51-bbf4251e41bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.589 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Start _get_guest_xml network_info=[{"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.594 182717 WARNING nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.599 182717 DEBUG nova.virt.libvirt.host [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.600 182717 DEBUG nova.virt.libvirt.host [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.605 182717 DEBUG nova.virt.libvirt.host [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.606 182717 DEBUG nova.virt.libvirt.host [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.607 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.608 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.608 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.608 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.609 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.609 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.609 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.609 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.610 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.610 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.610 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.610 182717 DEBUG nova.virt.hardware [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.614 182717 DEBUG nova.virt.libvirt.vif [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:33:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1621779069',display_name='tempest-TestGettingAddress-server-1621779069',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1621779069',id=172,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-6oeidn9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:34:02Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=02484dd3-2f3e-41bb-8ceb-d71936912a38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.615 182717 DEBUG nova.network.os_vif_util [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.615 182717 DEBUG nova.network.os_vif_util [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:fd:67,bridge_name='br-int',has_traffic_filtering=True,id=0eb5006d-d9d0-403a-bc92-5ad7bd032ade,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb5006d-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.616 182717 DEBUG nova.virt.libvirt.vif [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:33:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1621779069',display_name='tempest-TestGettingAddress-server-1621779069',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1621779069',id=172,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-6oeidn9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:34:02Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=02484dd3-2f3e-41bb-8ceb-d71936912a38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.617 182717 DEBUG nova.network.os_vif_util [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.617 182717 DEBUG nova.network.os_vif_util [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:3b:8b,bridge_name='br-int',has_traffic_filtering=True,id=fd875d41-c569-4c76-ae51-bbf4251e41bd,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd875d41-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.618 182717 DEBUG nova.objects.instance [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 02484dd3-2f3e-41bb-8ceb-d71936912a38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.634 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:34:10 compute-1 nova_compute[182713]:   <uuid>02484dd3-2f3e-41bb-8ceb-d71936912a38</uuid>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   <name>instance-000000ac</name>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <nova:name>tempest-TestGettingAddress-server-1621779069</nova:name>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:34:10</nova:creationTime>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:34:10 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:34:10 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:34:10 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:34:10 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:34:10 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:34:10 compute-1 nova_compute[182713]:         <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 22 00:34:10 compute-1 nova_compute[182713]:         <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:34:10 compute-1 nova_compute[182713]:         <nova:port uuid="0eb5006d-d9d0-403a-bc92-5ad7bd032ade">
Jan 22 00:34:10 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:34:10 compute-1 nova_compute[182713]:         <nova:port uuid="fd875d41-c569-4c76-ae51-bbf4251e41bd">
Jan 22 00:34:10 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fed6:3b8b" ipVersion="6"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <system>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <entry name="serial">02484dd3-2f3e-41bb-8ceb-d71936912a38</entry>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <entry name="uuid">02484dd3-2f3e-41bb-8ceb-d71936912a38</entry>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     </system>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   <os>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   </os>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   <features>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   </features>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.config"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:d3:fd:67"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <target dev="tap0eb5006d-d9"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:d6:3b:8b"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <target dev="tapfd875d41-c5"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/console.log" append="off"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <video>
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     </video>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:34:10 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:34:10 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:34:10 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:34:10 compute-1 nova_compute[182713]: </domain>
Jan 22 00:34:10 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.635 182717 DEBUG nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Preparing to wait for external event network-vif-plugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.635 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.636 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.636 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.636 182717 DEBUG nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Preparing to wait for external event network-vif-plugged-fd875d41-c569-4c76-ae51-bbf4251e41bd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.636 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.637 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.637 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.638 182717 DEBUG nova.virt.libvirt.vif [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:33:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1621779069',display_name='tempest-TestGettingAddress-server-1621779069',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1621779069',id=172,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-6oeidn9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:34:02Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=02484dd3-2f3e-41bb-8ceb-d71936912a38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.638 182717 DEBUG nova.network.os_vif_util [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.638 182717 DEBUG nova.network.os_vif_util [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:fd:67,bridge_name='br-int',has_traffic_filtering=True,id=0eb5006d-d9d0-403a-bc92-5ad7bd032ade,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb5006d-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.639 182717 DEBUG os_vif [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:fd:67,bridge_name='br-int',has_traffic_filtering=True,id=0eb5006d-d9d0-403a-bc92-5ad7bd032ade,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb5006d-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.639 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.640 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.640 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.653 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.653 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0eb5006d-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.654 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0eb5006d-d9, col_values=(('external_ids', {'iface-id': '0eb5006d-d9d0-403a-bc92-5ad7bd032ade', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:fd:67', 'vm-uuid': '02484dd3-2f3e-41bb-8ceb-d71936912a38'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.657 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:10 compute-1 NetworkManager[54952]: <info>  [1769042050.6587] manager: (tap0eb5006d-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.661 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.667 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.668 182717 INFO os_vif [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:fd:67,bridge_name='br-int',has_traffic_filtering=True,id=0eb5006d-d9d0-403a-bc92-5ad7bd032ade,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb5006d-d9')
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.669 182717 DEBUG nova.virt.libvirt.vif [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:33:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1621779069',display_name='tempest-TestGettingAddress-server-1621779069',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1621779069',id=172,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-6oeidn9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:34:02Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=02484dd3-2f3e-41bb-8ceb-d71936912a38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.669 182717 DEBUG nova.network.os_vif_util [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.670 182717 DEBUG nova.network.os_vif_util [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:3b:8b,bridge_name='br-int',has_traffic_filtering=True,id=fd875d41-c569-4c76-ae51-bbf4251e41bd,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd875d41-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.671 182717 DEBUG os_vif [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:3b:8b,bridge_name='br-int',has_traffic_filtering=True,id=fd875d41-c569-4c76-ae51-bbf4251e41bd,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd875d41-c5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.671 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.672 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.672 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.675 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.675 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd875d41-c5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.676 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfd875d41-c5, col_values=(('external_ids', {'iface-id': 'fd875d41-c569-4c76-ae51-bbf4251e41bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:3b:8b', 'vm-uuid': '02484dd3-2f3e-41bb-8ceb-d71936912a38'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.677 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:10 compute-1 NetworkManager[54952]: <info>  [1769042050.6786] manager: (tapfd875d41-c5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.679 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.688 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.688 182717 INFO os_vif [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:3b:8b,bridge_name='br-int',has_traffic_filtering=True,id=fd875d41-c569-4c76-ae51-bbf4251e41bd,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd875d41-c5')
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.781 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.782 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.782 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:d3:fd:67, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.783 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:d6:3b:8b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:34:10 compute-1 nova_compute[182713]: 2026-01-22 00:34:10.783 182717 INFO nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Using config drive
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.142 182717 INFO nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Creating config drive at /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.config
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.151 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp62wj3uhi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.284 182717 DEBUG oslo_concurrency.processutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp62wj3uhi" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.296 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.296 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.297 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.316 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.317 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:34:11 compute-1 NetworkManager[54952]: <info>  [1769042051.3635] manager: (tap0eb5006d-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/317)
Jan 22 00:34:11 compute-1 kernel: tap0eb5006d-d9: entered promiscuous mode
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.369 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-1 ovn_controller[94841]: 2026-01-22T00:34:11Z|00677|binding|INFO|Claiming lport 0eb5006d-d9d0-403a-bc92-5ad7bd032ade for this chassis.
Jan 22 00:34:11 compute-1 ovn_controller[94841]: 2026-01-22T00:34:11Z|00678|binding|INFO|0eb5006d-d9d0-403a-bc92-5ad7bd032ade: Claiming fa:16:3e:d3:fd:67 10.100.0.14
Jan 22 00:34:11 compute-1 NetworkManager[54952]: <info>  [1769042051.3841] manager: (tapfd875d41-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Jan 22 00:34:11 compute-1 kernel: tapfd875d41-c5: entered promiscuous mode
Jan 22 00:34:11 compute-1 NetworkManager[54952]: <info>  [1769042051.3900] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Jan 22 00:34:11 compute-1 NetworkManager[54952]: <info>  [1769042051.3907] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.388 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-1 systemd-udevd[239996]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:34:11 compute-1 systemd-udevd[239997]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.405 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:fd:67 10.100.0.14'], port_security=['fa:16:3e:d3:fd:67 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c32b591-bafa-4089-9793-ef7884c86bda', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e757454-ac82-49d6-9905-0e379d7d274f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e2ff2e46-2af8-49fc-9f01-6a639111eeb4, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=0eb5006d-d9d0-403a-bc92-5ad7bd032ade) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.407 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 0eb5006d-d9d0-403a-bc92-5ad7bd032ade in datapath 2c32b591-bafa-4089-9793-ef7884c86bda bound to our chassis
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.408 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2c32b591-bafa-4089-9793-ef7884c86bda
Jan 22 00:34:11 compute-1 NetworkManager[54952]: <info>  [1769042051.4211] device (tap0eb5006d-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:34:11 compute-1 NetworkManager[54952]: <info>  [1769042051.4222] device (tapfd875d41-c5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:34:11 compute-1 NetworkManager[54952]: <info>  [1769042051.4233] device (tap0eb5006d-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:34:11 compute-1 NetworkManager[54952]: <info>  [1769042051.4240] device (tapfd875d41-c5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.425 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e9e67ae3-17dc-420c-affc-504b12eeb0bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.426 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2c32b591-b1 in ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.434 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2c32b591-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.435 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[10dc81fd-2248-477d-9765-939fcf84dd6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.436 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4ed7dc3c-6ccf-4b88-86e0-e49d3b4e1abd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 systemd-machined[153970]: New machine qemu-74-instance-000000ac.
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.449 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[788d71f2-43e7-4bda-b818-c1c53dc00603]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.479 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e60d4b-cddd-4ad4-b500-85a74902fad5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 systemd[1]: Started Virtual Machine qemu-74-instance-000000ac.
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.485 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.489 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.492 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-1 ovn_controller[94841]: 2026-01-22T00:34:11Z|00679|binding|INFO|Claiming lport fd875d41-c569-4c76-ae51-bbf4251e41bd for this chassis.
Jan 22 00:34:11 compute-1 ovn_controller[94841]: 2026-01-22T00:34:11Z|00680|binding|INFO|fd875d41-c569-4c76-ae51-bbf4251e41bd: Claiming fa:16:3e:d6:3b:8b 2001:db8::f816:3eff:fed6:3b8b
Jan 22 00:34:11 compute-1 ovn_controller[94841]: 2026-01-22T00:34:11Z|00681|binding|INFO|Setting lport 0eb5006d-d9d0-403a-bc92-5ad7bd032ade ovn-installed in OVS
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.505 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-1 ovn_controller[94841]: 2026-01-22T00:34:11Z|00682|binding|INFO|Setting lport 0eb5006d-d9d0-403a-bc92-5ad7bd032ade up in Southbound
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.511 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:3b:8b 2001:db8::f816:3eff:fed6:3b8b'], port_security=['fa:16:3e:d6:3b:8b 2001:db8::f816:3eff:fed6:3b8b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed6:3b8b/64', 'neutron:device_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e757454-ac82-49d6-9905-0e379d7d274f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27a761c8-dbca-47a8-b596-d7db8b087bd0, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=fd875d41-c569-4c76-ae51-bbf4251e41bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.514 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[943f920d-e7c9-4dfb-bd10-42143dd436bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_controller[94841]: 2026-01-22T00:34:11Z|00683|binding|INFO|Setting lport fd875d41-c569-4c76-ae51-bbf4251e41bd ovn-installed in OVS
Jan 22 00:34:11 compute-1 ovn_controller[94841]: 2026-01-22T00:34:11Z|00684|binding|INFO|Setting lport fd875d41-c569-4c76-ae51-bbf4251e41bd up in Southbound
Jan 22 00:34:11 compute-1 NetworkManager[54952]: <info>  [1769042051.5245] manager: (tap2c32b591-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/321)
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.524 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.523 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a5db832a-fd55-4692-b439-38555994f4d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.558 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[55ed63c7-bfcc-4d4c-aeff-bf0e365a9526]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.561 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[4cbe8bc1-f5b1-417a-9190-636a1bf4412d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 NetworkManager[54952]: <info>  [1769042051.5834] device (tap2c32b591-b0): carrier: link connected
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.588 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[098b56f3-0aae-4618-a97c-7a9afe3925d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.607 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a5dae9ab-2e19-40b9-89a8-75dbd1d5f72e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c32b591-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:9c:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662423, 'reachable_time': 28307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240033, 'error': None, 'target': 'ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.628 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[848bd8a4-c1ad-4824-9be1-bc662a35241a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe93:9c46'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662423, 'tstamp': 662423}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240034, 'error': None, 'target': 'ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.648 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[648c1bd6-ac0d-46dc-9e20-ecbd2565f8b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c32b591-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:9c:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662423, 'reachable_time': 28307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240035, 'error': None, 'target': 'ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.683 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d28e67-ef2c-4033-8a2e-85a3642314dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.744 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1db40d24-6d0d-471c-a25b-12d10e51146c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.745 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c32b591-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.745 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.746 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c32b591-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:11 compute-1 kernel: tap2c32b591-b0: entered promiscuous mode
Jan 22 00:34:11 compute-1 NetworkManager[54952]: <info>  [1769042051.7480] manager: (tap2c32b591-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.750 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2c32b591-b0, col_values=(('external_ids', {'iface-id': 'e38b1907-f53c-4547-8cf1-c0fe946413c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.747 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.752 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-1 ovn_controller[94841]: 2026-01-22T00:34:11Z|00685|binding|INFO|Releasing lport e38b1907-f53c-4547-8cf1-c0fe946413c0 from this chassis (sb_readonly=0)
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.768 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.769 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c32b591-bafa-4089-9793-ef7884c86bda.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c32b591-bafa-4089-9793-ef7884c86bda.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.769 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[38e70d8d-5cce-4bc6-a476-cdf1c0c33e58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.770 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-2c32b591-bafa-4089-9793-ef7884c86bda
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/2c32b591-bafa-4089-9793-ef7884c86bda.pid.haproxy
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 2c32b591-bafa-4089-9793-ef7884c86bda
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.771 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda', 'env', 'PROCESS_TAG=haproxy-2c32b591-bafa-4089-9793-ef7884c86bda', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2c32b591-bafa-4089-9793-ef7884c86bda.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.816 182717 DEBUG nova.compute.manager [req-24f06dd7-5e48-41b4-930f-f418d631cef8 req-af9b7c7a-69f4-4cdb-8b8f-b38ad46cbb21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-plugged-fd875d41-c569-4c76-ae51-bbf4251e41bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.817 182717 DEBUG oslo_concurrency.lockutils [req-24f06dd7-5e48-41b4-930f-f418d631cef8 req-af9b7c7a-69f4-4cdb-8b8f-b38ad46cbb21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.818 182717 DEBUG oslo_concurrency.lockutils [req-24f06dd7-5e48-41b4-930f-f418d631cef8 req-af9b7c7a-69f4-4cdb-8b8f-b38ad46cbb21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.818 182717 DEBUG oslo_concurrency.lockutils [req-24f06dd7-5e48-41b4-930f-f418d631cef8 req-af9b7c7a-69f4-4cdb-8b8f-b38ad46cbb21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.819 182717 DEBUG nova.compute.manager [req-24f06dd7-5e48-41b4-930f-f418d631cef8 req-af9b7c7a-69f4-4cdb-8b8f-b38ad46cbb21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Processing event network-vif-plugged-fd875d41-c569-4c76-ae51-bbf4251e41bd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.863 182717 DEBUG nova.compute.manager [req-98e7cfbd-4db1-41dc-a1ea-96096c321a10 req-2bfb5305-f784-47e5-96b6-41740c110a3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-plugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.864 182717 DEBUG oslo_concurrency.lockutils [req-98e7cfbd-4db1-41dc-a1ea-96096c321a10 req-2bfb5305-f784-47e5-96b6-41740c110a3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.864 182717 DEBUG oslo_concurrency.lockutils [req-98e7cfbd-4db1-41dc-a1ea-96096c321a10 req-2bfb5305-f784-47e5-96b6-41740c110a3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.864 182717 DEBUG oslo_concurrency.lockutils [req-98e7cfbd-4db1-41dc-a1ea-96096c321a10 req-2bfb5305-f784-47e5-96b6-41740c110a3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.864 182717 DEBUG nova.compute.manager [req-98e7cfbd-4db1-41dc-a1ea-96096c321a10 req-2bfb5305-f784-47e5-96b6-41740c110a3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Processing event network-vif-plugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.920 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042051.920238, 02484dd3-2f3e-41bb-8ceb-d71936912a38 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.921 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] VM Started (Lifecycle Event)
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.923 182717 DEBUG nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.927 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.931 182717 INFO nova.virt.libvirt.driver [-] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Instance spawned successfully.
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.931 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.958 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:34:11 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:11.963 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.965 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.966 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.967 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.968 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.968 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.969 182717 DEBUG nova.virt.libvirt.driver [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.973 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-1 nova_compute[182713]: 2026-01-22 00:34:11.978 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.009 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.010 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042051.9212859, 02484dd3-2f3e-41bb-8ceb-d71936912a38 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.011 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] VM Paused (Lifecycle Event)
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.036 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.040 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042051.9275582, 02484dd3-2f3e-41bb-8ceb-d71936912a38 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.040 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] VM Resumed (Lifecycle Event)
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.051 182717 DEBUG nova.network.neutron [req-f05ff1ae-5190-46b8-bd65-e185c4ae9a21 req-f95d2cfe-f9ba-4c0b-8592-e1be2fdddec8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Updated VIF entry in instance network info cache for port fd875d41-c569-4c76-ae51-bbf4251e41bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.052 182717 DEBUG nova.network.neutron [req-f05ff1ae-5190-46b8-bd65-e185c4ae9a21 req-f95d2cfe-f9ba-4c0b-8592-e1be2fdddec8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Updating instance_info_cache with network_info: [{"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.063 182717 INFO nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Took 9.70 seconds to spawn the instance on the hypervisor.
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.063 182717 DEBUG nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.073 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.076 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.080 182717 DEBUG oslo_concurrency.lockutils [req-f05ff1ae-5190-46b8-bd65-e185c4ae9a21 req-f95d2cfe-f9ba-4c0b-8592-e1be2fdddec8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.118 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.173 182717 INFO nova.compute.manager [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Took 10.40 seconds to build instance.
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.192 182717 DEBUG oslo_concurrency.lockutils [None req-bf60fd21-8b8d-44e2-a2bf-a8170dd7749a a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:12 compute-1 podman[240073]: 2026-01-22 00:34:12.217533475 +0000 UTC m=+0.050565829 container create 43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:34:12 compute-1 systemd[1]: Started libpod-conmon-43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881.scope.
Jan 22 00:34:12 compute-1 podman[240073]: 2026-01-22 00:34:12.192388815 +0000 UTC m=+0.025421199 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:34:12 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:34:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278b74386ef2db2584441ba6166460d4c1e06f73516138a2251d428eb20590db/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:34:12 compute-1 podman[240073]: 2026-01-22 00:34:12.321514481 +0000 UTC m=+0.154546895 container init 43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 00:34:12 compute-1 podman[240073]: 2026-01-22 00:34:12.32881346 +0000 UTC m=+0.161845864 container start 43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 00:34:12 compute-1 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[240089]: [NOTICE]   (240093) : New worker (240095) forked
Jan 22 00:34:12 compute-1 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[240089]: [NOTICE]   (240093) : Loading success.
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.410 104184 INFO neutron.agent.ovn.metadata.agent [-] Port fd875d41-c569-4c76-ae51-bbf4251e41bd in datapath ac047d42-8ff5-4760-85b5-73b5e4be7fc9 unbound from our chassis
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.412 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac047d42-8ff5-4760-85b5-73b5e4be7fc9
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.425 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8a561e55-5c95-4856-a106-dfbd6a24c43a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.426 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapac047d42-81 in ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.430 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapac047d42-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.430 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[dc71b089-8349-47f7-8f66-d41b41303d3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.432 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2769d95a-9341-4a6f-8c6d-8a605a4e82fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.448 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[222f9e9e-4817-48d7-9043-1e1edbf528f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.466 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7ef2f4ff-905a-4479-bfd5-30a10780d663]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.500 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e5f92625-17f7-4ecd-bfba-de6b7a71a774]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.508 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1bc6b709-57b2-420b-95a4-050bcb165743]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 NetworkManager[54952]: <info>  [1769042052.5096] manager: (tapac047d42-80): new Veth device (/org/freedesktop/NetworkManager/Devices/323)
Jan 22 00:34:12 compute-1 systemd-udevd[240026]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.547 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[47956a77-90b4-4295-ab2f-25eaa6fb9fe5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.553 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[598bc464-f66e-4a0d-a952-4e72beab5495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 NetworkManager[54952]: <info>  [1769042052.5774] device (tapac047d42-80): carrier: link connected
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.583 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c61016ba-447c-4501-bac1-241046d3f54c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.603 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3436cd-6361-44e4-afcf-35ae9f857312]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac047d42-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:b1:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 197], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662523, 'reachable_time': 17478, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240114, 'error': None, 'target': 'ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.618 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e1e6d2-5705-4ad5-93a6-4b3798f8f71f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5f:b174'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662523, 'tstamp': 662523}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240115, 'error': None, 'target': 'ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.640 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[710ce6b0-072d-4cff-b4dc-17d40652e7c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac047d42-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:b1:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 197], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662523, 'reachable_time': 17478, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240116, 'error': None, 'target': 'ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.667 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a19ccdd5-91ae-4bc3-93ce-05a0e1a6b669]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.695 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d7be7196-e987-4f8c-b780-f511dea3a908]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.697 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac047d42-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.697 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.697 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac047d42-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.699 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:12 compute-1 NetworkManager[54952]: <info>  [1769042052.7006] manager: (tapac047d42-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/324)
Jan 22 00:34:12 compute-1 kernel: tapac047d42-80: entered promiscuous mode
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.703 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac047d42-80, col_values=(('external_ids', {'iface-id': 'eacecdac-1525-4e22-9343-339f328bc180'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.704 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:12 compute-1 ovn_controller[94841]: 2026-01-22T00:34:12Z|00686|binding|INFO|Releasing lport eacecdac-1525-4e22-9343-339f328bc180 from this chassis (sb_readonly=0)
Jan 22 00:34:12 compute-1 nova_compute[182713]: 2026-01-22 00:34:12.716 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.717 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ac047d42-8ff5-4760-85b5-73b5e4be7fc9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ac047d42-8ff5-4760-85b5-73b5e4be7fc9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.718 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c99a9b35-58e5-48ad-b30f-cc65e038d032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.718 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-ac047d42-8ff5-4760-85b5-73b5e4be7fc9
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/ac047d42-8ff5-4760-85b5-73b5e4be7fc9.pid.haproxy
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID ac047d42-8ff5-4760-85b5-73b5e4be7fc9
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:34:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:12.719 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'env', 'PROCESS_TAG=haproxy-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ac047d42-8ff5-4760-85b5-73b5e4be7fc9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:34:13 compute-1 podman[240146]: 2026-01-22 00:34:13.106964973 +0000 UTC m=+0.063523716 container create c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:34:13 compute-1 systemd[1]: Started libpod-conmon-c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195.scope.
Jan 22 00:34:13 compute-1 podman[240146]: 2026-01-22 00:34:13.077106335 +0000 UTC m=+0.033665158 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:34:13 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:34:13 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb4b5572b610bb761579f5a85e8777e126e1fbd08a67a43ccaf7711916fbf3f7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:34:13 compute-1 podman[240146]: 2026-01-22 00:34:13.19251437 +0000 UTC m=+0.149073113 container init c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:34:13 compute-1 podman[240146]: 2026-01-22 00:34:13.198967873 +0000 UTC m=+0.155526606 container start c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 00:34:13 compute-1 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[240160]: [NOTICE]   (240164) : New worker (240166) forked
Jan 22 00:34:13 compute-1 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[240160]: [NOTICE]   (240164) : Loading success.
Jan 22 00:34:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:13.279 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:34:13 compute-1 nova_compute[182713]: 2026-01-22 00:34:13.983 182717 DEBUG nova.compute.manager [req-c024ef88-c963-4363-8ae9-0bbdf5d53359 req-ff1f74dc-a40c-499e-a2e1-f81567ff3dfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-plugged-fd875d41-c569-4c76-ae51-bbf4251e41bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:13 compute-1 nova_compute[182713]: 2026-01-22 00:34:13.985 182717 DEBUG oslo_concurrency.lockutils [req-c024ef88-c963-4363-8ae9-0bbdf5d53359 req-ff1f74dc-a40c-499e-a2e1-f81567ff3dfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:13 compute-1 nova_compute[182713]: 2026-01-22 00:34:13.985 182717 DEBUG oslo_concurrency.lockutils [req-c024ef88-c963-4363-8ae9-0bbdf5d53359 req-ff1f74dc-a40c-499e-a2e1-f81567ff3dfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:13 compute-1 nova_compute[182713]: 2026-01-22 00:34:13.986 182717 DEBUG oslo_concurrency.lockutils [req-c024ef88-c963-4363-8ae9-0bbdf5d53359 req-ff1f74dc-a40c-499e-a2e1-f81567ff3dfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:13 compute-1 nova_compute[182713]: 2026-01-22 00:34:13.986 182717 DEBUG nova.compute.manager [req-c024ef88-c963-4363-8ae9-0bbdf5d53359 req-ff1f74dc-a40c-499e-a2e1-f81567ff3dfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] No waiting events found dispatching network-vif-plugged-fd875d41-c569-4c76-ae51-bbf4251e41bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:34:13 compute-1 nova_compute[182713]: 2026-01-22 00:34:13.987 182717 WARNING nova.compute.manager [req-c024ef88-c963-4363-8ae9-0bbdf5d53359 req-ff1f74dc-a40c-499e-a2e1-f81567ff3dfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received unexpected event network-vif-plugged-fd875d41-c569-4c76-ae51-bbf4251e41bd for instance with vm_state active and task_state None.
Jan 22 00:34:14 compute-1 nova_compute[182713]: 2026-01-22 00:34:14.011 182717 DEBUG nova.compute.manager [req-e7577983-e621-4590-9914-0377fba1ca8b req-dfc07601-4d34-495a-a145-3bb2bef4163d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-plugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:14 compute-1 nova_compute[182713]: 2026-01-22 00:34:14.012 182717 DEBUG oslo_concurrency.lockutils [req-e7577983-e621-4590-9914-0377fba1ca8b req-dfc07601-4d34-495a-a145-3bb2bef4163d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:14 compute-1 nova_compute[182713]: 2026-01-22 00:34:14.012 182717 DEBUG oslo_concurrency.lockutils [req-e7577983-e621-4590-9914-0377fba1ca8b req-dfc07601-4d34-495a-a145-3bb2bef4163d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:14 compute-1 nova_compute[182713]: 2026-01-22 00:34:14.013 182717 DEBUG oslo_concurrency.lockutils [req-e7577983-e621-4590-9914-0377fba1ca8b req-dfc07601-4d34-495a-a145-3bb2bef4163d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:14 compute-1 nova_compute[182713]: 2026-01-22 00:34:14.013 182717 DEBUG nova.compute.manager [req-e7577983-e621-4590-9914-0377fba1ca8b req-dfc07601-4d34-495a-a145-3bb2bef4163d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] No waiting events found dispatching network-vif-plugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:34:14 compute-1 nova_compute[182713]: 2026-01-22 00:34:14.014 182717 WARNING nova.compute.manager [req-e7577983-e621-4590-9914-0377fba1ca8b req-dfc07601-4d34-495a-a145-3bb2bef4163d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received unexpected event network-vif-plugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade for instance with vm_state active and task_state None.
Jan 22 00:34:14 compute-1 podman[240176]: 2026-01-22 00:34:14.629915201 +0000 UTC m=+0.105471864 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:34:14 compute-1 podman[240175]: 2026-01-22 00:34:14.638131459 +0000 UTC m=+0.119796164 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:34:14 compute-1 nova_compute[182713]: 2026-01-22 00:34:14.903 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:15 compute-1 nova_compute[182713]: 2026-01-22 00:34:15.678 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:17 compute-1 nova_compute[182713]: 2026-01-22 00:34:17.399 182717 DEBUG nova.compute.manager [req-fe743c01-a168-44f8-968d-c36f9cb8a922 req-1861012f-f94d-495d-b620-f4c9e4b37ed7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-changed-0eb5006d-d9d0-403a-bc92-5ad7bd032ade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:17 compute-1 nova_compute[182713]: 2026-01-22 00:34:17.401 182717 DEBUG nova.compute.manager [req-fe743c01-a168-44f8-968d-c36f9cb8a922 req-1861012f-f94d-495d-b620-f4c9e4b37ed7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Refreshing instance network info cache due to event network-changed-0eb5006d-d9d0-403a-bc92-5ad7bd032ade. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:34:17 compute-1 nova_compute[182713]: 2026-01-22 00:34:17.402 182717 DEBUG oslo_concurrency.lockutils [req-fe743c01-a168-44f8-968d-c36f9cb8a922 req-1861012f-f94d-495d-b620-f4c9e4b37ed7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:34:17 compute-1 nova_compute[182713]: 2026-01-22 00:34:17.402 182717 DEBUG oslo_concurrency.lockutils [req-fe743c01-a168-44f8-968d-c36f9cb8a922 req-1861012f-f94d-495d-b620-f4c9e4b37ed7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:34:17 compute-1 nova_compute[182713]: 2026-01-22 00:34:17.403 182717 DEBUG nova.network.neutron [req-fe743c01-a168-44f8-968d-c36f9cb8a922 req-1861012f-f94d-495d-b620-f4c9e4b37ed7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Refreshing network info cache for port 0eb5006d-d9d0-403a-bc92-5ad7bd032ade _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:34:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:18.283 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:18 compute-1 nova_compute[182713]: 2026-01-22 00:34:18.560 182717 DEBUG nova.network.neutron [req-fe743c01-a168-44f8-968d-c36f9cb8a922 req-1861012f-f94d-495d-b620-f4c9e4b37ed7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Updated VIF entry in instance network info cache for port 0eb5006d-d9d0-403a-bc92-5ad7bd032ade. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:34:18 compute-1 nova_compute[182713]: 2026-01-22 00:34:18.562 182717 DEBUG nova.network.neutron [req-fe743c01-a168-44f8-968d-c36f9cb8a922 req-1861012f-f94d-495d-b620-f4c9e4b37ed7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Updating instance_info_cache with network_info: [{"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:34:18 compute-1 nova_compute[182713]: 2026-01-22 00:34:18.592 182717 DEBUG oslo_concurrency.lockutils [req-fe743c01-a168-44f8-968d-c36f9cb8a922 req-1861012f-f94d-495d-b620-f4c9e4b37ed7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:34:19 compute-1 nova_compute[182713]: 2026-01-22 00:34:19.905 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:20 compute-1 nova_compute[182713]: 2026-01-22 00:34:20.715 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.893 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'name': 'tempest-TestGettingAddress-server-1621779069', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000ac', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '837db8748d074b3c9179b47d30e7a1d4', 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'hostId': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.896 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.922 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/memory.usage volume: 40.4140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57ce979e-91fe-4580-bf4b-d89a4c050ddd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4140625, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'timestamp': '2026-01-22T00:34:22.896847', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '1909ae34-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.629782943, 'message_signature': '0a06a0fbece6fa4cb679ba71c11f0f06f45a3407f7dcfb02b8e427869bd26553'}]}, 'timestamp': '2026-01-22 00:34:22.924522', '_unique_id': '1faeb10101a6480ba0975d15a70b3ea6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.928 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.930 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.971 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.972 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.read.requests volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b644219f-eda3-4003-98b5-da88253a8266', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-vda', 'timestamp': '2026-01-22T00:34:22.931198', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '19111c96-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': '0e4b584cfbcd737b3ecf47dc5e800c7535ea56f20912d2f6c90ddcb351251c3a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 6, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-sda', 'timestamp': '2026-01-22T00:34:22.931198', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '19113032-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': '15b80e001b9f1dbed7a612bae9c97cdda1d32b7dc757db801661b50db6953688'}]}, 'timestamp': '2026-01-22 00:34:22.973177', '_unique_id': '6328217623894c5fab120fa2354ada8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.974 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.976 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.979 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 02484dd3-2f3e-41bb-8ceb-d71936912a38 / tap0eb5006d-d9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.980 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 02484dd3-2f3e-41bb-8ceb-d71936912a38 / tapfd875d41-c5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.980 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.981 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32cfa4b7-41e0-4f0d-9750-0a87f4730ced', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tap0eb5006d-d9', 'timestamp': '2026-01-22T00:34:22.976190', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tap0eb5006d-d9', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:fd:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb5006d-d9'}, 'message_id': '1912732a-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': 'ea5acf185abaf18f885d5444b46484913284fa6e914d195f4ae977dc724be82c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tapfd875d41-c5', 'timestamp': '2026-01-22T00:34:22.976190', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tapfd875d41-c5', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:3b:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd875d41-c5'}, 'message_id': '1912859a-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': '144d27abcd607d565ccc59052fbfad1f0fd84b972fa079b3baf6230599ed9a31'}]}, 'timestamp': '2026-01-22 00:34:22.981960', '_unique_id': 'c58ff470669c4389b848bff994553b35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.983 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.984 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.984 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.985 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6056cece-3c68-4b4a-ace4-53a2bfd7d49a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tap0eb5006d-d9', 'timestamp': '2026-01-22T00:34:22.984501', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tap0eb5006d-d9', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:fd:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb5006d-d9'}, 'message_id': '1912fdae-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': 'ea4db9e26ca37fe5656019bb1080a7bf83c136bcf0b6fdc8ab585ef3f37d0d7c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tapfd875d41-c5', 'timestamp': '2026-01-22T00:34:22.984501', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tapfd875d41-c5', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:3b:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd875d41-c5'}, 'message_id': '19131078-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': 'f6c3ca35e5371dcdd028df488081cec6ce5fdb9ec52cd7439942a69c7cf24d54'}]}, 'timestamp': '2026-01-22 00:34:22.985450', '_unique_id': 'b9ecb9f077454da4bf4a09e23491bf57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.986 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.987 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.988 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1621779069>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1621779069>]
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.988 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.989 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a008851-da95-4e41-aaa9-7b655de8f937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-vda', 'timestamp': '2026-01-22T00:34:22.988693', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1913a218-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': '0c63371cf3d7cdb7e491615cce9f8933ab0a9b1edd39e3a29984bb5746dc45cf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'02484dd3-2f3e-41bb-8ceb-d71936912a38-sda', 'timestamp': '2026-01-22T00:34:22.988693', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1913b2b2-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': 'e87142aa28a7274dedccdbf686601fbf4d500294c44ebc6b4638957f76b02f55'}]}, 'timestamp': '2026-01-22 00:34:22.989588', '_unique_id': '3b418700a32341ef9479d74bd02be3cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.990 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:22.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.009 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.009 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '431a3c8d-1b7b-4345-8af8-1024e3c638a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-vda', 'timestamp': '2026-01-22T00:34:22.992253', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1916c8ee-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.699511043, 'message_signature': '079f47a4867086020c70f07511f396aa47cd947ff587f5e49294f2f678434f4a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'02484dd3-2f3e-41bb-8ceb-d71936912a38-sda', 'timestamp': '2026-01-22T00:34:22.992253', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1916db18-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.699511043, 'message_signature': 'a449a244e06647eafc5e142176647ba2727e23f58a41b992ac611fa550cc196d'}]}, 'timestamp': '2026-01-22 00:34:23.010244', '_unique_id': '20237acb2d824424882c8cb8aabcdcdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.012 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.012 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.012 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd3fc8b4-4af1-4128-87ac-93c462c67826', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-vda', 'timestamp': '2026-01-22T00:34:23.012437', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '19173e1e-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': '1cb415cbb7ba42d2ed0e5daae02f393b57351c9df9ec143743295adb8e73c6e9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-sda', 'timestamp': '2026-01-22T00:34:23.012437', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '191749fe-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': '7e6ec62e6dccac0a47cc78594bcb66ed9922f6691f8db97a4891813a84e40955'}]}, 'timestamp': '2026-01-22 00:34:23.013051', '_unique_id': 'd205e610b3b6405ab930ad4dacd9f2c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.013 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.014 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.014 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.015 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a68be4b3-a3c2-4612-9ea8-78273072ab73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tap0eb5006d-d9', 'timestamp': '2026-01-22T00:34:23.014765', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tap0eb5006d-d9', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:fd:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb5006d-d9'}, 'message_id': '19179be8-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': 'e73cc84d81afd2fc30d297e93a8182aaf543f696539796a5fa65a7478bfe4ece'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tapfd875d41-c5', 'timestamp': '2026-01-22T00:34:23.014765', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tapfd875d41-c5', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:3b:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd875d41-c5'}, 'message_id': '1917ac32-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': '07e687b885cd1d1bd7e7dee12520526f001d94674e59e941372d9b4e0f94e7ad'}]}, 'timestamp': '2026-01-22 00:34:23.015639', '_unique_id': 'd023f70a798a46fe9d06d93f67246f16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.016 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.017 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.018 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.018 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64c64dc9-2da9-44f3-94b3-c5471690d626', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-vda', 'timestamp': '2026-01-22T00:34:23.018011', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '19181a00-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': '1911133eece2989267d93dae9ed4e0fdada8fbfe2830ed83bc9ff4bc308abec3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-sda', 'timestamp': '2026-01-22T00:34:23.018011', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '19182950-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': 'bd1a7fcb9c56f53f896518e3331f68c5221e5b5ce45fb5c5c3ef69599247f7f1'}]}, 'timestamp': '2026-01-22 00:34:23.018831', '_unique_id': '2305ff41d7da4126b334d63d1b534f65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.019 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.021 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.021 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.021 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1621779069>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1621779069>]
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.021 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.021 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.022 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2ad8912-d5ef-4f80-a2cf-d89082e2d0bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tap0eb5006d-d9', 'timestamp': '2026-01-22T00:34:23.021801', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tap0eb5006d-d9', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:fd:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb5006d-d9'}, 'message_id': '1918b00a-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': 'b9ef29314a5066b6aee43ebc47a6236335b3e6996d955a5df58e6c27f57be859'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tapfd875d41-c5', 'timestamp': '2026-01-22T00:34:23.021801', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tapfd875d41-c5', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:3b:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd875d41-c5'}, 'message_id': '1918bffa-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': 'e7f41061380f992ad015f651100ded6d5aa48e07b241fc79b37d5566cfc7cad2'}]}, 'timestamp': '2026-01-22 00:34:23.022705', '_unique_id': 'ed868d4e1a164192a0bec10e411a3d61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.023 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.024 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.024 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.025 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bf95a6a-3de2-4bcb-afe2-a17dd42a0896', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tap0eb5006d-d9', 'timestamp': '2026-01-22T00:34:23.024783', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tap0eb5006d-d9', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:fd:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb5006d-d9'}, 'message_id': '19192364-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': '359839b885e71f07887713afcf21f6ff7fd18983dbfb7f90790948847fadf444'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tapfd875d41-c5', 'timestamp': '2026-01-22T00:34:23.024783', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tapfd875d41-c5', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:3b:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd875d41-c5'}, 'message_id': '191933d6-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': 'a181f711f185f8d4a66adec4397bc672db9fc618d74b069e973e22bb8626818f'}]}, 'timestamp': '2026-01-22 00:34:23.025664', '_unique_id': 'd234ca027c664f718099b4e4e84f357b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.026 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.027 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.028 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd89e7056-4093-4684-9a72-27a3ba19cb5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-vda', 'timestamp': '2026-01-22T00:34:23.027738', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '191996aa-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.699511043, 'message_signature': '6e930ee8eea5ac642d0c0e8cd20ca86b4e8bad3d0543b14d0ee68a7726561bd1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'02484dd3-2f3e-41bb-8ceb-d71936912a38-sda', 'timestamp': '2026-01-22T00:34:23.027738', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1919a5be-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.699511043, 'message_signature': '1986e90e214c337e9d8dde0ee9cc1ca0efbfc0de937dc9f9f5154ca9ebc1de66'}]}, 'timestamp': '2026-01-22 00:34:23.028559', '_unique_id': '90898a767c4c495da4da79146ee83f29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.029 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.030 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.030 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.031 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.read.bytes volume: 2208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0f1b17f-287f-4384-b05d-33ec16a537f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-vda', 'timestamp': '2026-01-22T00:34:23.030656', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '191a0860-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': 'e7e53f3f05ed606247af4cf45bb69fa74d5ee6c6256b1613e0aebf1459dba1d4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2208, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-sda', 'timestamp': '2026-01-22T00:34:23.030656', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '191a186e-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': '95b44f42f18f38d6cef2112690924fd1433ca3c8d1e3df788950dd393107b9d5'}]}, 'timestamp': '2026-01-22 00:34:23.031503', '_unique_id': 'af4dd50df1bc460985af7fd05f90c7f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.032 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.033 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.033 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.034 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1621779069>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1621779069>]
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.034 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.034 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.read.latency volume: 128200422 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.034 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.read.latency volume: 692821 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ff5421f-960a-4061-bb24-f06c64028f02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 128200422, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-vda', 'timestamp': '2026-01-22T00:34:23.034414', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '191a9a28-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': '2fd035301b85c3f1d37d52a33de57e1b479c799465e38a029629522059613800'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 692821, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-sda', 'timestamp': '2026-01-22T00:34:23.034414', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '191aaacc-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.638528588, 'message_signature': 'cf9e2f063b89299dd01f42976b9e9339e9b79a78e97511bf0b747b8c60e68950'}]}, 'timestamp': '2026-01-22 00:34:23.035249', '_unique_id': '7fcc64b05a744624a17157d73ee2df6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.036 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.037 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.037 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.037 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a42dacc0-52b3-454e-9729-e1cd0bc9bb3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tap0eb5006d-d9', 'timestamp': '2026-01-22T00:34:23.037433', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tap0eb5006d-d9', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:fd:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb5006d-d9'}, 'message_id': '191b0ff8-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': '74cdbd29107e0954291694ccbbc4cc4c02d6bd9f33d2ea4db04cad89bf15488f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tapfd875d41-c5', 'timestamp': '2026-01-22T00:34:23.037433', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tapfd875d41-c5', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:3b:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd875d41-c5'}, 'message_id': '191b2132-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': '989c92ea5d4fa0f7651ee50ea678ec6d458265cd30866344d6303d50f758f127'}]}, 'timestamp': '2026-01-22 00:34:23.038285', '_unique_id': '2049ff9aea0944819f0eb8cd04a8e4eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.039 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.040 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.040 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef256a3a-9be0-4fc8-95a6-2969a06a4822', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tap0eb5006d-d9', 'timestamp': '2026-01-22T00:34:23.040321', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tap0eb5006d-d9', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:fd:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb5006d-d9'}, 'message_id': '191b80be-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': '4b35c119c66c5425e4a0aa16e157039ea04bb734f76e7a8e4ba88b303c95c03e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tapfd875d41-c5', 'timestamp': '2026-01-22T00:34:23.040321', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tapfd875d41-c5', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:3b:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd875d41-c5'}, 'message_id': '191b911c-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': 'd50512c43bea0ebea69ab5f05f0043c7c7f21a645263b136106a0939eab8945a'}]}, 'timestamp': '2026-01-22 00:34:23.041153', '_unique_id': '3acef653808246209b3d8562073b341d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.042 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.043 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.043 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d6ed2bf-19b6-4ea4-b7d7-2a0cd994c755', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tap0eb5006d-d9', 'timestamp': '2026-01-22T00:34:23.043216', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tap0eb5006d-d9', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:fd:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb5006d-d9'}, 'message_id': '191bf274-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': 'f1235657a73e4dbeb0305567be76d20a313119b9c25093092e5c4cf3c8beecbc'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tapfd875d41-c5', 'timestamp': '2026-01-22T00:34:23.043216', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tapfd875d41-c5', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:3b:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd875d41-c5'}, 'message_id': '191c0624-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': 'aed274ff705b31ce9fda000ef3ece9f3968b4e3c60aec754a627f68f1c426883'}]}, 'timestamp': '2026-01-22 00:34:23.044179', '_unique_id': '0b2ed95e4b1744f9ad4c173322520e29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.045 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.046 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.046 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/cpu volume: 10260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad84f4db-65fa-44b8-9ade-24f08d2e39b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10260000000, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'timestamp': '2026-01-22T00:34:23.046533', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '191c74f6-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.629782943, 'message_signature': 'd46f09aa6119993e6d5c04accaffa8978b5268bcb3c4fb4bcc426f2ebb7fdcbd'}]}, 'timestamp': '2026-01-22 00:34:23.046999', '_unique_id': 'cf7f15a60b5d4758bbcab45a2dcef97f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.047 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.048 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.049 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '350fe2c3-6fc1-4fde-ba69-ee02088ab5ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tap0eb5006d-d9', 'timestamp': '2026-01-22T00:34:23.048899', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tap0eb5006d-d9', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:fd:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb5006d-d9'}, 'message_id': '191ccfdc-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': 'a2caac4685d4cad38b2a1b500a7b50e19617f573d79fff9cae078ad686d56d90'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tapfd875d41-c5', 'timestamp': '2026-01-22T00:34:23.048899', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tapfd875d41-c5', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:3b:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd875d41-c5'}, 'message_id': '191ce562-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': '0d4340b511cd00b7588f5f0a04e867d3f417b3254d9d2fe4605496aa2c6da54c'}]}, 'timestamp': '2026-01-22 00:34:23.049919', '_unique_id': '6bd0d27fecef4c71aefb651ce791b0f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.050 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.051 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.052 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1621779069>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1621779069>]
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.052 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.052 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.052 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ee5ee8d-dea5-4944-a57d-a207f4aabf9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38-vda', 'timestamp': '2026-01-22T00:34:23.052427', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '191d59b6-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.699511043, 'message_signature': '203e764fb0fe0f0d4d115171f9c0c4c3ad2e3af75a364160be48835f8c88cdec'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'02484dd3-2f3e-41bb-8ceb-d71936912a38-sda', 'timestamp': '2026-01-22T00:34:23.052427', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'instance-000000ac', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '191d69c4-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.699511043, 'message_signature': '40b9ef4aa6345eb6186f6113ea528f30e2dd0451328f8fec650e55cf79ee4127'}]}, 'timestamp': '2026-01-22 00:34:23.053234', '_unique_id': 'e066417d2bba4b13a72ec673cbb4d88d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.054 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.055 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.055 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.055 12 DEBUG ceilometer.compute.pollsters [-] 02484dd3-2f3e-41bb-8ceb-d71936912a38/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0f40694-e42a-4950-9602-740c0299e5ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tap0eb5006d-d9', 'timestamp': '2026-01-22T00:34:23.055208', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tap0eb5006d-d9', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:fd:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb5006d-d9'}, 'message_id': '191dc630-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': '58088493f210ce6ebe19cb6fe6c3e5ea0b2bcb60072234c3e686c450e16f4c6b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ac-02484dd3-2f3e-41bb-8ceb-d71936912a38-tapfd875d41-c5', 'timestamp': '2026-01-22T00:34:23.055208', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1621779069', 'name': 'tapfd875d41-c5', 'instance_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:3b:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd875d41-c5'}, 'message_id': '191dd602-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6635.68347876, 'message_signature': '5eea35dc7b98bf49dec67169447c2ebd08549422609499c3919b8729d53471fc'}]}, 'timestamp': '2026-01-22 00:34:23.056050', '_unique_id': '48fd59e15bdc4480b89b76060cb33e6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:34:23.056 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-1 ovn_controller[94841]: 2026-01-22T00:34:23Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:fd:67 10.100.0.14
Jan 22 00:34:23 compute-1 ovn_controller[94841]: 2026-01-22T00:34:23Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:fd:67 10.100.0.14
Jan 22 00:34:24 compute-1 nova_compute[182713]: 2026-01-22 00:34:24.908 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:25 compute-1 nova_compute[182713]: 2026-01-22 00:34:25.719 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:28 compute-1 podman[240234]: 2026-01-22 00:34:28.59401878 +0000 UTC m=+0.080709476 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 00:34:29 compute-1 nova_compute[182713]: 2026-01-22 00:34:29.910 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:30 compute-1 podman[240254]: 2026-01-22 00:34:30.595004374 +0000 UTC m=+0.082432940 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=openstack_network_exporter, io.openshift.expose-services=)
Jan 22 00:34:30 compute-1 nova_compute[182713]: 2026-01-22 00:34:30.755 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:34 compute-1 nova_compute[182713]: 2026-01-22 00:34:34.913 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:35 compute-1 nova_compute[182713]: 2026-01-22 00:34:35.758 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:35 compute-1 nova_compute[182713]: 2026-01-22 00:34:35.880 182717 DEBUG nova.compute.manager [req-89f7433d-a458-4c59-9a3b-3153e57464b4 req-733c3f44-b5c5-4bcd-8064-09e95b038bcb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-changed-0eb5006d-d9d0-403a-bc92-5ad7bd032ade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:35 compute-1 nova_compute[182713]: 2026-01-22 00:34:35.881 182717 DEBUG nova.compute.manager [req-89f7433d-a458-4c59-9a3b-3153e57464b4 req-733c3f44-b5c5-4bcd-8064-09e95b038bcb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Refreshing instance network info cache due to event network-changed-0eb5006d-d9d0-403a-bc92-5ad7bd032ade. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:34:35 compute-1 nova_compute[182713]: 2026-01-22 00:34:35.881 182717 DEBUG oslo_concurrency.lockutils [req-89f7433d-a458-4c59-9a3b-3153e57464b4 req-733c3f44-b5c5-4bcd-8064-09e95b038bcb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:34:35 compute-1 nova_compute[182713]: 2026-01-22 00:34:35.882 182717 DEBUG oslo_concurrency.lockutils [req-89f7433d-a458-4c59-9a3b-3153e57464b4 req-733c3f44-b5c5-4bcd-8064-09e95b038bcb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:34:35 compute-1 nova_compute[182713]: 2026-01-22 00:34:35.882 182717 DEBUG nova.network.neutron [req-89f7433d-a458-4c59-9a3b-3153e57464b4 req-733c3f44-b5c5-4bcd-8064-09e95b038bcb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Refreshing network info cache for port 0eb5006d-d9d0-403a-bc92-5ad7bd032ade _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:34:35 compute-1 nova_compute[182713]: 2026-01-22 00:34:35.987 182717 DEBUG oslo_concurrency.lockutils [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:35 compute-1 nova_compute[182713]: 2026-01-22 00:34:35.988 182717 DEBUG oslo_concurrency.lockutils [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:35 compute-1 nova_compute[182713]: 2026-01-22 00:34:35.989 182717 DEBUG oslo_concurrency.lockutils [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:35 compute-1 nova_compute[182713]: 2026-01-22 00:34:35.989 182717 DEBUG oslo_concurrency.lockutils [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:35 compute-1 nova_compute[182713]: 2026-01-22 00:34:35.990 182717 DEBUG oslo_concurrency.lockutils [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.008 182717 INFO nova.compute.manager [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Terminating instance
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.026 182717 DEBUG nova.compute.manager [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:34:36 compute-1 kernel: tap0eb5006d-d9 (unregistering): left promiscuous mode
Jan 22 00:34:36 compute-1 NetworkManager[54952]: <info>  [1769042076.0489] device (tap0eb5006d-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.054 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 ovn_controller[94841]: 2026-01-22T00:34:36Z|00687|binding|INFO|Releasing lport 0eb5006d-d9d0-403a-bc92-5ad7bd032ade from this chassis (sb_readonly=0)
Jan 22 00:34:36 compute-1 ovn_controller[94841]: 2026-01-22T00:34:36Z|00688|binding|INFO|Setting lport 0eb5006d-d9d0-403a-bc92-5ad7bd032ade down in Southbound
Jan 22 00:34:36 compute-1 ovn_controller[94841]: 2026-01-22T00:34:36Z|00689|binding|INFO|Removing iface tap0eb5006d-d9 ovn-installed in OVS
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.058 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.063 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:fd:67 10.100.0.14'], port_security=['fa:16:3e:d3:fd:67 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c32b591-bafa-4089-9793-ef7884c86bda', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e757454-ac82-49d6-9905-0e379d7d274f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e2ff2e46-2af8-49fc-9f01-6a639111eeb4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=0eb5006d-d9d0-403a-bc92-5ad7bd032ade) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.064 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 0eb5006d-d9d0-403a-bc92-5ad7bd032ade in datapath 2c32b591-bafa-4089-9793-ef7884c86bda unbound from our chassis
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.065 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2c32b591-bafa-4089-9793-ef7884c86bda, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.066 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e87267fd-3b04-44d0-bb7f-5d45b93c1ef8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.067 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda namespace which is not needed anymore
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.072 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 kernel: tapfd875d41-c5 (unregistering): left promiscuous mode
Jan 22 00:34:36 compute-1 NetworkManager[54952]: <info>  [1769042076.0816] device (tapfd875d41-c5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.084 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.093 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 ovn_controller[94841]: 2026-01-22T00:34:36Z|00690|binding|INFO|Releasing lport fd875d41-c569-4c76-ae51-bbf4251e41bd from this chassis (sb_readonly=0)
Jan 22 00:34:36 compute-1 ovn_controller[94841]: 2026-01-22T00:34:36Z|00691|binding|INFO|Setting lport fd875d41-c569-4c76-ae51-bbf4251e41bd down in Southbound
Jan 22 00:34:36 compute-1 ovn_controller[94841]: 2026-01-22T00:34:36Z|00692|binding|INFO|Removing iface tapfd875d41-c5 ovn-installed in OVS
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.095 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.123 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000ac.scope: Deactivated successfully.
Jan 22 00:34:36 compute-1 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000ac.scope: Consumed 13.633s CPU time.
Jan 22 00:34:36 compute-1 systemd-machined[153970]: Machine qemu-74-instance-000000ac terminated.
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[240089]: [NOTICE]   (240093) : haproxy version is 2.8.14-c23fe91
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[240089]: [NOTICE]   (240093) : path to executable is /usr/sbin/haproxy
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[240089]: [WARNING]  (240093) : Exiting Master process...
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[240089]: [WARNING]  (240093) : Exiting Master process...
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[240089]: [ALERT]    (240093) : Current worker (240095) exited with code 143 (Terminated)
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[240089]: [WARNING]  (240093) : All workers exited. Exiting... (0)
Jan 22 00:34:36 compute-1 systemd[1]: libpod-43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881.scope: Deactivated successfully.
Jan 22 00:34:36 compute-1 podman[240305]: 2026-01-22 00:34:36.226268708 +0000 UTC m=+0.049179147 container died 43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 00:34:36 compute-1 NetworkManager[54952]: <info>  [1769042076.2609] manager: (tap0eb5006d-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/325)
Jan 22 00:34:36 compute-1 NetworkManager[54952]: <info>  [1769042076.2699] manager: (tapfd875d41-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/326)
Jan 22 00:34:36 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881-userdata-shm.mount: Deactivated successfully.
Jan 22 00:34:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-278b74386ef2db2584441ba6166460d4c1e06f73516138a2251d428eb20590db-merged.mount: Deactivated successfully.
Jan 22 00:34:36 compute-1 podman[240305]: 2026-01-22 00:34:36.291207578 +0000 UTC m=+0.114118067 container cleanup 43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 00:34:36 compute-1 systemd[1]: libpod-conmon-43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881.scope: Deactivated successfully.
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.323 182717 INFO nova.virt.libvirt.driver [-] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Instance destroyed successfully.
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.325 182717 DEBUG nova.objects.instance [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid 02484dd3-2f3e-41bb-8ceb-d71936912a38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:34:36 compute-1 podman[240358]: 2026-01-22 00:34:36.37214973 +0000 UTC m=+0.046597925 container remove 43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.381 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9d7a2774-dda2-4ad6-bb1a-834a2a8a5279]: (4, ('Thu Jan 22 12:34:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda (43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881)\n43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881\nThu Jan 22 12:34:36 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda (43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881)\n43e5e025721d7d967ec8fd883f5789eb184a5a964f6f9343b7aedc76f0528881\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.383 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[122f5350-54be-4312-ae2e-c86f73c06072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.384 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c32b591-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.387 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 kernel: tap2c32b591-b0: left promiscuous mode
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.419 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.424 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[10923f0f-0d6b-4245-a7f2-4756106ef6af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.437 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f452ef14-275b-42b3-a927-544458b84df0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.438 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5446a0a2-3e90-425f-a91b-dc70762cb33e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.456 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7af646-d555-4c3c-9745-e5131aaf790d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662416, 'reachable_time': 30029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240382, 'error': None, 'target': 'ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.459 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.459 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[913b487f-00ff-443a-aa65-cc339b059eed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 systemd[1]: run-netns-ovnmeta\x2d2c32b591\x2dbafa\x2d4089\x2d9793\x2def7884c86bda.mount: Deactivated successfully.
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.561 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:3b:8b 2001:db8::f816:3eff:fed6:3b8b'], port_security=['fa:16:3e:d6:3b:8b 2001:db8::f816:3eff:fed6:3b8b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed6:3b8b/64', 'neutron:device_id': '02484dd3-2f3e-41bb-8ceb-d71936912a38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e757454-ac82-49d6-9905-0e379d7d274f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27a761c8-dbca-47a8-b596-d7db8b087bd0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=fd875d41-c569-4c76-ae51-bbf4251e41bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.564 104184 INFO neutron.agent.ovn.metadata.agent [-] Port fd875d41-c569-4c76-ae51-bbf4251e41bd in datapath ac047d42-8ff5-4760-85b5-73b5e4be7fc9 unbound from our chassis
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.565 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac047d42-8ff5-4760-85b5-73b5e4be7fc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.566 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9054d0b7-68b3-4a62-b25b-c772da884d77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.568 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9 namespace which is not needed anymore
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[240160]: [NOTICE]   (240164) : haproxy version is 2.8.14-c23fe91
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[240160]: [NOTICE]   (240164) : path to executable is /usr/sbin/haproxy
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[240160]: [WARNING]  (240164) : Exiting Master process...
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[240160]: [WARNING]  (240164) : Exiting Master process...
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[240160]: [ALERT]    (240164) : Current worker (240166) exited with code 143 (Terminated)
Jan 22 00:34:36 compute-1 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[240160]: [WARNING]  (240164) : All workers exited. Exiting... (0)
Jan 22 00:34:36 compute-1 systemd[1]: libpod-c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195.scope: Deactivated successfully.
Jan 22 00:34:36 compute-1 podman[240400]: 2026-01-22 00:34:36.743263447 +0000 UTC m=+0.053500182 container died c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.773 182717 DEBUG nova.virt.libvirt.vif [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:33:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1621779069',display_name='tempest-TestGettingAddress-server-1621779069',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1621779069',id=172,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:34:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-6oeidn9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:34:12Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=02484dd3-2f3e-41bb-8ceb-d71936912a38,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.773 182717 DEBUG nova.network.os_vif_util [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.774 182717 DEBUG nova.network.os_vif_util [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d3:fd:67,bridge_name='br-int',has_traffic_filtering=True,id=0eb5006d-d9d0-403a-bc92-5ad7bd032ade,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb5006d-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.775 182717 DEBUG os_vif [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:fd:67,bridge_name='br-int',has_traffic_filtering=True,id=0eb5006d-d9d0-403a-bc92-5ad7bd032ade,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb5006d-d9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:34:36 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195-userdata-shm.mount: Deactivated successfully.
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.777 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.777 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0eb5006d-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.779 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-bb4b5572b610bb761579f5a85e8777e126e1fbd08a67a43ccaf7711916fbf3f7-merged.mount: Deactivated successfully.
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.780 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.784 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 podman[240400]: 2026-01-22 00:34:36.784372849 +0000 UTC m=+0.094609494 container cleanup c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.788 182717 INFO os_vif [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:fd:67,bridge_name='br-int',has_traffic_filtering=True,id=0eb5006d-d9d0-403a-bc92-5ad7bd032ade,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb5006d-d9')
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.789 182717 DEBUG nova.virt.libvirt.vif [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:33:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1621779069',display_name='tempest-TestGettingAddress-server-1621779069',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1621779069',id=172,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:34:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-6oeidn9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:34:12Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=02484dd3-2f3e-41bb-8ceb-d71936912a38,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.789 182717 DEBUG nova.network.os_vif_util [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.790 182717 DEBUG nova.network.os_vif_util [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:3b:8b,bridge_name='br-int',has_traffic_filtering=True,id=fd875d41-c569-4c76-ae51-bbf4251e41bd,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd875d41-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.790 182717 DEBUG os_vif [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:3b:8b,bridge_name='br-int',has_traffic_filtering=True,id=fd875d41-c569-4c76-ae51-bbf4251e41bd,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd875d41-c5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.791 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.792 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd875d41-c5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.793 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.794 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.796 182717 INFO os_vif [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:3b:8b,bridge_name='br-int',has_traffic_filtering=True,id=fd875d41-c569-4c76-ae51-bbf4251e41bd,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd875d41-c5')
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.797 182717 INFO nova.virt.libvirt.driver [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Deleting instance files /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38_del
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.798 182717 INFO nova.virt.libvirt.driver [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Deletion of /var/lib/nova/instances/02484dd3-2f3e-41bb-8ceb-d71936912a38_del complete
Jan 22 00:34:36 compute-1 systemd[1]: libpod-conmon-c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195.scope: Deactivated successfully.
Jan 22 00:34:36 compute-1 podman[240431]: 2026-01-22 00:34:36.856702921 +0000 UTC m=+0.045799870 container remove c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.862 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5124eff8-71fe-4072-a271-f2441851dd68]: (4, ('Thu Jan 22 12:34:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9 (c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195)\nc4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195\nThu Jan 22 12:34:36 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9 (c4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195)\nc4954f1b9b5958dba720e54560b01ebccff889c4e3e04a316f3d6a7321d23195\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.864 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[64a06dcd-7545-49e7-8e8e-855bf15f1c8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.865 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac047d42-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.867 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 kernel: tapac047d42-80: left promiscuous mode
Jan 22 00:34:36 compute-1 nova_compute[182713]: 2026-01-22 00:34:36.883 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.886 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0490ee6e-a417-45eb-af72-367b72ea6030]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.906 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[aab1f745-0272-4f6b-ad65-a28932d6de98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.907 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4897226b-51dc-4d66-8efd-346498ea2bf8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.920 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2540725b-a629-415f-9778-e1114f08a197]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662514, 'reachable_time': 25432, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240446, 'error': None, 'target': 'ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.921 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:34:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:34:36.921 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[eb00f34b-5d9b-431a-a4bc-e465ecc64e74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:37 compute-1 systemd[1]: run-netns-ovnmeta\x2dac047d42\x2d8ff5\x2d4760\x2d85b5\x2d73b5e4be7fc9.mount: Deactivated successfully.
Jan 22 00:34:37 compute-1 nova_compute[182713]: 2026-01-22 00:34:37.836 182717 INFO nova.compute.manager [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Took 1.81 seconds to destroy the instance on the hypervisor.
Jan 22 00:34:37 compute-1 nova_compute[182713]: 2026-01-22 00:34:37.837 182717 DEBUG oslo.service.loopingcall [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:34:37 compute-1 nova_compute[182713]: 2026-01-22 00:34:37.838 182717 DEBUG nova.compute.manager [-] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:34:37 compute-1 nova_compute[182713]: 2026-01-22 00:34:37.838 182717 DEBUG nova.network.neutron [-] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.151 182717 DEBUG nova.compute.manager [req-32642515-124e-46be-8e37-ec7e93840933 req-398b3be9-5634-40cf-9494-d80110361554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-unplugged-fd875d41-c569-4c76-ae51-bbf4251e41bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.152 182717 DEBUG oslo_concurrency.lockutils [req-32642515-124e-46be-8e37-ec7e93840933 req-398b3be9-5634-40cf-9494-d80110361554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.152 182717 DEBUG oslo_concurrency.lockutils [req-32642515-124e-46be-8e37-ec7e93840933 req-398b3be9-5634-40cf-9494-d80110361554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.153 182717 DEBUG oslo_concurrency.lockutils [req-32642515-124e-46be-8e37-ec7e93840933 req-398b3be9-5634-40cf-9494-d80110361554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.153 182717 DEBUG nova.compute.manager [req-32642515-124e-46be-8e37-ec7e93840933 req-398b3be9-5634-40cf-9494-d80110361554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] No waiting events found dispatching network-vif-unplugged-fd875d41-c569-4c76-ae51-bbf4251e41bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.154 182717 DEBUG nova.compute.manager [req-32642515-124e-46be-8e37-ec7e93840933 req-398b3be9-5634-40cf-9494-d80110361554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-unplugged-fd875d41-c569-4c76-ae51-bbf4251e41bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.161 182717 DEBUG nova.compute.manager [req-4f25e4c9-2ca4-4083-97c4-da207924871a req-e0e9bf8a-f217-426e-824c-03abc594a084 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-unplugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.162 182717 DEBUG oslo_concurrency.lockutils [req-4f25e4c9-2ca4-4083-97c4-da207924871a req-e0e9bf8a-f217-426e-824c-03abc594a084 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.162 182717 DEBUG oslo_concurrency.lockutils [req-4f25e4c9-2ca4-4083-97c4-da207924871a req-e0e9bf8a-f217-426e-824c-03abc594a084 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.163 182717 DEBUG oslo_concurrency.lockutils [req-4f25e4c9-2ca4-4083-97c4-da207924871a req-e0e9bf8a-f217-426e-824c-03abc594a084 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.163 182717 DEBUG nova.compute.manager [req-4f25e4c9-2ca4-4083-97c4-da207924871a req-e0e9bf8a-f217-426e-824c-03abc594a084 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] No waiting events found dispatching network-vif-unplugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:34:38 compute-1 nova_compute[182713]: 2026-01-22 00:34:38.163 182717 DEBUG nova.compute.manager [req-4f25e4c9-2ca4-4083-97c4-da207924871a req-e0e9bf8a-f217-426e-824c-03abc594a084 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-unplugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:34:39 compute-1 podman[240448]: 2026-01-22 00:34:39.601153167 +0000 UTC m=+0.087003244 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:34:39 compute-1 podman[240447]: 2026-01-22 00:34:39.630085956 +0000 UTC m=+0.124307366 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 00:34:39 compute-1 nova_compute[182713]: 2026-01-22 00:34:39.915 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.289 182717 DEBUG nova.compute.manager [req-c07ca192-7e6c-4aaa-b24e-81cb265746eb req-b7078db3-b3f4-45c8-b91a-75bb5fa696ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-plugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.290 182717 DEBUG oslo_concurrency.lockutils [req-c07ca192-7e6c-4aaa-b24e-81cb265746eb req-b7078db3-b3f4-45c8-b91a-75bb5fa696ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.290 182717 DEBUG oslo_concurrency.lockutils [req-c07ca192-7e6c-4aaa-b24e-81cb265746eb req-b7078db3-b3f4-45c8-b91a-75bb5fa696ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.290 182717 DEBUG oslo_concurrency.lockutils [req-c07ca192-7e6c-4aaa-b24e-81cb265746eb req-b7078db3-b3f4-45c8-b91a-75bb5fa696ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.290 182717 DEBUG nova.compute.manager [req-c07ca192-7e6c-4aaa-b24e-81cb265746eb req-b7078db3-b3f4-45c8-b91a-75bb5fa696ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] No waiting events found dispatching network-vif-plugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.291 182717 WARNING nova.compute.manager [req-c07ca192-7e6c-4aaa-b24e-81cb265746eb req-b7078db3-b3f4-45c8-b91a-75bb5fa696ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received unexpected event network-vif-plugged-0eb5006d-d9d0-403a-bc92-5ad7bd032ade for instance with vm_state active and task_state deleting.
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.292 182717 DEBUG nova.compute.manager [req-f7a9fe2b-5387-4a00-9757-c1a1a136d32d req-e9063dd0-3446-4877-bfec-0982599057cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-plugged-fd875d41-c569-4c76-ae51-bbf4251e41bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.292 182717 DEBUG oslo_concurrency.lockutils [req-f7a9fe2b-5387-4a00-9757-c1a1a136d32d req-e9063dd0-3446-4877-bfec-0982599057cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.292 182717 DEBUG oslo_concurrency.lockutils [req-f7a9fe2b-5387-4a00-9757-c1a1a136d32d req-e9063dd0-3446-4877-bfec-0982599057cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.292 182717 DEBUG oslo_concurrency.lockutils [req-f7a9fe2b-5387-4a00-9757-c1a1a136d32d req-e9063dd0-3446-4877-bfec-0982599057cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.292 182717 DEBUG nova.compute.manager [req-f7a9fe2b-5387-4a00-9757-c1a1a136d32d req-e9063dd0-3446-4877-bfec-0982599057cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] No waiting events found dispatching network-vif-plugged-fd875d41-c569-4c76-ae51-bbf4251e41bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:34:40 compute-1 nova_compute[182713]: 2026-01-22 00:34:40.293 182717 WARNING nova.compute.manager [req-f7a9fe2b-5387-4a00-9757-c1a1a136d32d req-e9063dd0-3446-4877-bfec-0982599057cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received unexpected event network-vif-plugged-fd875d41-c569-4c76-ae51-bbf4251e41bd for instance with vm_state active and task_state deleting.
Jan 22 00:34:41 compute-1 nova_compute[182713]: 2026-01-22 00:34:41.254 182717 DEBUG nova.network.neutron [req-89f7433d-a458-4c59-9a3b-3153e57464b4 req-733c3f44-b5c5-4bcd-8064-09e95b038bcb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Updated VIF entry in instance network info cache for port 0eb5006d-d9d0-403a-bc92-5ad7bd032ade. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:34:41 compute-1 nova_compute[182713]: 2026-01-22 00:34:41.255 182717 DEBUG nova.network.neutron [req-89f7433d-a458-4c59-9a3b-3153e57464b4 req-733c3f44-b5c5-4bcd-8064-09e95b038bcb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Updating instance_info_cache with network_info: [{"id": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "address": "fa:16:3e:d3:fd:67", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb5006d-d9", "ovs_interfaceid": "0eb5006d-d9d0-403a-bc92-5ad7bd032ade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:34:41 compute-1 nova_compute[182713]: 2026-01-22 00:34:41.281 182717 DEBUG oslo_concurrency.lockutils [req-89f7433d-a458-4c59-9a3b-3153e57464b4 req-733c3f44-b5c5-4bcd-8064-09e95b038bcb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-02484dd3-2f3e-41bb-8ceb-d71936912a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:34:41 compute-1 nova_compute[182713]: 2026-01-22 00:34:41.310 182717 DEBUG nova.compute.manager [req-474b655d-3989-475d-8ba2-eafdc322a78e req-e5ae43aa-33ca-475e-b556-9de75f221314 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-deleted-0eb5006d-d9d0-403a-bc92-5ad7bd032ade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:41 compute-1 nova_compute[182713]: 2026-01-22 00:34:41.311 182717 INFO nova.compute.manager [req-474b655d-3989-475d-8ba2-eafdc322a78e req-e5ae43aa-33ca-475e-b556-9de75f221314 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Neutron deleted interface 0eb5006d-d9d0-403a-bc92-5ad7bd032ade; detaching it from the instance and deleting it from the info cache
Jan 22 00:34:41 compute-1 nova_compute[182713]: 2026-01-22 00:34:41.311 182717 DEBUG nova.network.neutron [req-474b655d-3989-475d-8ba2-eafdc322a78e req-e5ae43aa-33ca-475e-b556-9de75f221314 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Updating instance_info_cache with network_info: [{"id": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "address": "fa:16:3e:d6:3b:8b", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed6:3b8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd875d41-c5", "ovs_interfaceid": "fd875d41-c569-4c76-ae51-bbf4251e41bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:34:41 compute-1 nova_compute[182713]: 2026-01-22 00:34:41.336 182717 DEBUG nova.compute.manager [req-474b655d-3989-475d-8ba2-eafdc322a78e req-e5ae43aa-33ca-475e-b556-9de75f221314 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Detach interface failed, port_id=0eb5006d-d9d0-403a-bc92-5ad7bd032ade, reason: Instance 02484dd3-2f3e-41bb-8ceb-d71936912a38 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:34:41 compute-1 nova_compute[182713]: 2026-01-22 00:34:41.838 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:42 compute-1 nova_compute[182713]: 2026-01-22 00:34:42.239 182717 DEBUG nova.network.neutron [-] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:34:42 compute-1 nova_compute[182713]: 2026-01-22 00:34:42.259 182717 INFO nova.compute.manager [-] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Took 4.42 seconds to deallocate network for instance.
Jan 22 00:34:42 compute-1 nova_compute[182713]: 2026-01-22 00:34:42.328 182717 DEBUG oslo_concurrency.lockutils [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:42 compute-1 nova_compute[182713]: 2026-01-22 00:34:42.329 182717 DEBUG oslo_concurrency.lockutils [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:42 compute-1 nova_compute[182713]: 2026-01-22 00:34:42.393 182717 DEBUG nova.compute.provider_tree [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:34:42 compute-1 nova_compute[182713]: 2026-01-22 00:34:42.414 182717 DEBUG nova.scheduler.client.report [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:34:42 compute-1 nova_compute[182713]: 2026-01-22 00:34:42.447 182717 DEBUG oslo_concurrency.lockutils [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:42 compute-1 nova_compute[182713]: 2026-01-22 00:34:42.496 182717 INFO nova.scheduler.client.report [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance 02484dd3-2f3e-41bb-8ceb-d71936912a38
Jan 22 00:34:44 compute-1 nova_compute[182713]: 2026-01-22 00:34:44.240 182717 DEBUG nova.compute.manager [req-65c183b4-a3bf-49a7-b65e-93c67ba6fcba req-9a3c78dc-9e4f-4d92-b417-a7c4a9359cac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Received event network-vif-deleted-fd875d41-c569-4c76-ae51-bbf4251e41bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:44 compute-1 nova_compute[182713]: 2026-01-22 00:34:44.317 182717 DEBUG oslo_concurrency.lockutils [None req-9da88252-6dce-47ac-93ae-a61f175a791f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "02484dd3-2f3e-41bb-8ceb-d71936912a38" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:44 compute-1 nova_compute[182713]: 2026-01-22 00:34:44.917 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:45 compute-1 podman[240498]: 2026-01-22 00:34:45.592997779 +0000 UTC m=+0.071342731 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:34:45 compute-1 podman[240499]: 2026-01-22 00:34:45.611959374 +0000 UTC m=+0.084011619 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:34:46 compute-1 nova_compute[182713]: 2026-01-22 00:34:46.843 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:49 compute-1 nova_compute[182713]: 2026-01-22 00:34:49.920 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:51 compute-1 nova_compute[182713]: 2026-01-22 00:34:51.322 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042076.3210366, 02484dd3-2f3e-41bb-8ceb-d71936912a38 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:34:51 compute-1 nova_compute[182713]: 2026-01-22 00:34:51.323 182717 INFO nova.compute.manager [-] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] VM Stopped (Lifecycle Event)
Jan 22 00:34:51 compute-1 nova_compute[182713]: 2026-01-22 00:34:51.474 182717 DEBUG nova.compute.manager [None req-d0728d07-cccf-41cf-8a1e-731a72ec00cb - - - - - -] [instance: 02484dd3-2f3e-41bb-8ceb-d71936912a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:34:51 compute-1 nova_compute[182713]: 2026-01-22 00:34:51.848 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:54 compute-1 nova_compute[182713]: 2026-01-22 00:34:54.921 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:56 compute-1 nova_compute[182713]: 2026-01-22 00:34:56.852 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:58 compute-1 sshd-session[240539]: Invalid user ubuntu from 92.118.39.95 port 47188
Jan 22 00:34:58 compute-1 sshd-session[240539]: Connection closed by invalid user ubuntu 92.118.39.95 port 47188 [preauth]
Jan 22 00:34:58 compute-1 podman[240541]: 2026-01-22 00:34:58.789519589 +0000 UTC m=+0.100959152 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 00:34:59 compute-1 nova_compute[182713]: 2026-01-22 00:34:59.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:59 compute-1 nova_compute[182713]: 2026-01-22 00:34:59.922 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:00 compute-1 nova_compute[182713]: 2026-01-22 00:35:00.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:00 compute-1 nova_compute[182713]: 2026-01-22 00:35:00.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:00 compute-1 nova_compute[182713]: 2026-01-22 00:35:00.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:35:01 compute-1 podman[240562]: 2026-01-22 00:35:01.588888341 +0000 UTC m=+0.074047566 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 00:35:01 compute-1 nova_compute[182713]: 2026-01-22 00:35:01.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:01 compute-1 nova_compute[182713]: 2026-01-22 00:35:01.857 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:02 compute-1 nova_compute[182713]: 2026-01-22 00:35:02.391 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:02 compute-1 nova_compute[182713]: 2026-01-22 00:35:02.506 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:03.047 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:03.047 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:03.048 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:04 compute-1 nova_compute[182713]: 2026-01-22 00:35:04.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:04 compute-1 nova_compute[182713]: 2026-01-22 00:35:04.924 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:06 compute-1 nova_compute[182713]: 2026-01-22 00:35:06.888 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:07 compute-1 nova_compute[182713]: 2026-01-22 00:35:07.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:09 compute-1 nova_compute[182713]: 2026-01-22 00:35:09.926 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:10 compute-1 podman[240587]: 2026-01-22 00:35:10.56426974 +0000 UTC m=+0.055393190 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:35:10 compute-1 podman[240586]: 2026-01-22 00:35:10.591713943 +0000 UTC m=+0.089925105 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:35:10 compute-1 nova_compute[182713]: 2026-01-22 00:35:10.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:10 compute-1 nova_compute[182713]: 2026-01-22 00:35:10.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:10 compute-1 nova_compute[182713]: 2026-01-22 00:35:10.898 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:10 compute-1 nova_compute[182713]: 2026-01-22 00:35:10.899 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:10 compute-1 nova_compute[182713]: 2026-01-22 00:35:10.900 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:10 compute-1 nova_compute[182713]: 2026-01-22 00:35:10.900 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:35:11 compute-1 nova_compute[182713]: 2026-01-22 00:35:11.088 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:35:11 compute-1 nova_compute[182713]: 2026-01-22 00:35:11.090 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5704MB free_disk=73.19267654418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:35:11 compute-1 nova_compute[182713]: 2026-01-22 00:35:11.090 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:11 compute-1 nova_compute[182713]: 2026-01-22 00:35:11.090 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:11 compute-1 nova_compute[182713]: 2026-01-22 00:35:11.891 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:11 compute-1 nova_compute[182713]: 2026-01-22 00:35:11.911 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:35:11 compute-1 nova_compute[182713]: 2026-01-22 00:35:11.911 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:35:12 compute-1 nova_compute[182713]: 2026-01-22 00:35:12.268 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:35:12 compute-1 nova_compute[182713]: 2026-01-22 00:35:12.631 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:35:12 compute-1 nova_compute[182713]: 2026-01-22 00:35:12.632 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:35:12 compute-1 nova_compute[182713]: 2026-01-22 00:35:12.647 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:35:12 compute-1 nova_compute[182713]: 2026-01-22 00:35:12.684 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:35:12 compute-1 nova_compute[182713]: 2026-01-22 00:35:12.708 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:35:12 compute-1 nova_compute[182713]: 2026-01-22 00:35:12.723 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:35:12 compute-1 nova_compute[182713]: 2026-01-22 00:35:12.755 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:35:12 compute-1 nova_compute[182713]: 2026-01-22 00:35:12.756 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:12.968 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:35:12 compute-1 nova_compute[182713]: 2026-01-22 00:35:12.969 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:12.969 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:35:14 compute-1 nova_compute[182713]: 2026-01-22 00:35:14.757 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:14 compute-1 nova_compute[182713]: 2026-01-22 00:35:14.758 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:35:14 compute-1 nova_compute[182713]: 2026-01-22 00:35:14.758 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:35:14 compute-1 nova_compute[182713]: 2026-01-22 00:35:14.799 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:35:14 compute-1 nova_compute[182713]: 2026-01-22 00:35:14.929 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:14.971 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:16 compute-1 podman[240636]: 2026-01-22 00:35:16.567642194 +0000 UTC m=+0.059458019 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:35:16 compute-1 podman[240637]: 2026-01-22 00:35:16.589297524 +0000 UTC m=+0.072853299 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:35:16 compute-1 nova_compute[182713]: 2026-01-22 00:35:16.936 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:19 compute-1 nova_compute[182713]: 2026-01-22 00:35:19.931 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:21 compute-1 nova_compute[182713]: 2026-01-22 00:35:21.939 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:24 compute-1 nova_compute[182713]: 2026-01-22 00:35:24.933 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:26 compute-1 nova_compute[182713]: 2026-01-22 00:35:26.992 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:29 compute-1 podman[240677]: 2026-01-22 00:35:29.795653894 +0000 UTC m=+0.101368756 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 00:35:29 compute-1 nova_compute[182713]: 2026-01-22 00:35:29.936 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:30.183 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:32:e3 2001:db8:0:1:f816:3eff:fe17:32e3 2001:db8::f816:3eff:fe17:32e3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe17:32e3/64 2001:db8::f816:3eff:fe17:32e3/64', 'neutron:device_id': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d6cac94-5c44-44de-a872-7bf42948d910, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=44d0292d-d743-4a92-8996-3ae3a26c0afc) old=Port_Binding(mac=['fa:16:3e:17:32:e3 2001:db8::f816:3eff:fe17:32e3'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe17:32e3/64', 'neutron:device_id': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:35:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:30.184 104184 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 44d0292d-d743-4a92-8996-3ae3a26c0afc in datapath 041654ff-0c5d-4cd2-89f6-0863dbbf44a8 updated
Jan 22 00:35:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:30.185 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 041654ff-0c5d-4cd2-89f6-0863dbbf44a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:35:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:30.189 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d0b54822-ecdc-4e30-922a-48d3961ec441]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:31 compute-1 nova_compute[182713]: 2026-01-22 00:35:31.996 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:32 compute-1 podman[240697]: 2026-01-22 00:35:32.570808005 +0000 UTC m=+0.063654420 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64)
Jan 22 00:35:34 compute-1 nova_compute[182713]: 2026-01-22 00:35:34.939 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.291 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.292 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.311 182717 DEBUG nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.435 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.435 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.443 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.443 182717 INFO nova.compute.claims [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.586 182717 DEBUG nova.compute.provider_tree [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.605 182717 DEBUG nova.scheduler.client.report [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.641 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.643 182717 DEBUG nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.709 182717 DEBUG nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.710 182717 DEBUG nova.network.neutron [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.746 182717 INFO nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.765 182717 DEBUG nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.882 182717 DEBUG nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.883 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.884 182717 INFO nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Creating image(s)
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.884 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.884 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.885 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.899 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.970 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.972 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.972 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:36 compute-1 nova_compute[182713]: 2026-01-22 00:35:36.988 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.004 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.010 182717 DEBUG nova.policy [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.042 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.043 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.080 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.081 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.081 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.133 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.134 182717 DEBUG nova.virt.disk.api [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.135 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.189 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.190 182717 DEBUG nova.virt.disk.api [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.190 182717 DEBUG nova.objects.instance [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 79afbcaf-6ef5-4db5-a05c-78bccd9f772a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.207 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.208 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Ensure instance console log exists: /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.208 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.209 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:37 compute-1 nova_compute[182713]: 2026-01-22 00:35:37.209 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:39 compute-1 nova_compute[182713]: 2026-01-22 00:35:39.941 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:41 compute-1 nova_compute[182713]: 2026-01-22 00:35:41.146 182717 DEBUG nova.network.neutron [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Successfully created port: 4a9813e7-d0e1-490f-a8ca-3138225442a8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:35:41 compute-1 podman[240735]: 2026-01-22 00:35:41.591437716 +0000 UTC m=+0.077913579 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:35:41 compute-1 podman[240734]: 2026-01-22 00:35:41.642050616 +0000 UTC m=+0.129903341 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 22 00:35:42 compute-1 nova_compute[182713]: 2026-01-22 00:35:42.006 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:43 compute-1 nova_compute[182713]: 2026-01-22 00:35:43.025 182717 DEBUG nova.network.neutron [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Successfully created port: 90c66d2f-b763-45aa-9b34-a83255b9d97b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.155 182717 DEBUG nova.network.neutron [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Successfully updated port: 4a9813e7-d0e1-490f-a8ca-3138225442a8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.261 182717 DEBUG nova.compute.manager [req-2514fdb1-4956-46d3-b4c0-9eda94ac4faa req-841c68a6-ee46-4f13-8d83-6397ed8648b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-changed-4a9813e7-d0e1-490f-a8ca-3138225442a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.262 182717 DEBUG nova.compute.manager [req-2514fdb1-4956-46d3-b4c0-9eda94ac4faa req-841c68a6-ee46-4f13-8d83-6397ed8648b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Refreshing instance network info cache due to event network-changed-4a9813e7-d0e1-490f-a8ca-3138225442a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.263 182717 DEBUG oslo_concurrency.lockutils [req-2514fdb1-4956-46d3-b4c0-9eda94ac4faa req-841c68a6-ee46-4f13-8d83-6397ed8648b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.263 182717 DEBUG oslo_concurrency.lockutils [req-2514fdb1-4956-46d3-b4c0-9eda94ac4faa req-841c68a6-ee46-4f13-8d83-6397ed8648b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.264 182717 DEBUG nova.network.neutron [req-2514fdb1-4956-46d3-b4c0-9eda94ac4faa req-841c68a6-ee46-4f13-8d83-6397ed8648b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Refreshing network info cache for port 4a9813e7-d0e1-490f-a8ca-3138225442a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.593 182717 DEBUG nova.network.neutron [req-2514fdb1-4956-46d3-b4c0-9eda94ac4faa req-841c68a6-ee46-4f13-8d83-6397ed8648b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.826 182717 DEBUG nova.network.neutron [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Successfully updated port: 90c66d2f-b763-45aa-9b34-a83255b9d97b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.842 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.876 182717 DEBUG nova.network.neutron [req-2514fdb1-4956-46d3-b4c0-9eda94ac4faa req-841c68a6-ee46-4f13-8d83-6397ed8648b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.898 182717 DEBUG oslo_concurrency.lockutils [req-2514fdb1-4956-46d3-b4c0-9eda94ac4faa req-841c68a6-ee46-4f13-8d83-6397ed8648b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.898 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.899 182717 DEBUG nova.network.neutron [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:35:44 compute-1 nova_compute[182713]: 2026-01-22 00:35:44.945 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:45 compute-1 nova_compute[182713]: 2026-01-22 00:35:45.079 182717 DEBUG nova.network.neutron [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:35:46 compute-1 nova_compute[182713]: 2026-01-22 00:35:46.369 182717 DEBUG nova.compute.manager [req-7171e206-6c75-4ace-9a55-81d2e03b861e req-e0942562-0dee-4c2a-9e98-82489a87a625 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-changed-90c66d2f-b763-45aa-9b34-a83255b9d97b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:35:46 compute-1 nova_compute[182713]: 2026-01-22 00:35:46.369 182717 DEBUG nova.compute.manager [req-7171e206-6c75-4ace-9a55-81d2e03b861e req-e0942562-0dee-4c2a-9e98-82489a87a625 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Refreshing instance network info cache due to event network-changed-90c66d2f-b763-45aa-9b34-a83255b9d97b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:35:46 compute-1 nova_compute[182713]: 2026-01-22 00:35:46.370 182717 DEBUG oslo_concurrency.lockutils [req-7171e206-6c75-4ace-9a55-81d2e03b861e req-e0942562-0dee-4c2a-9e98-82489a87a625 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.009 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.369 182717 DEBUG nova.network.neutron [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updating instance_info_cache with network_info: [{"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.396 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.396 182717 DEBUG nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Instance network_info: |[{"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.397 182717 DEBUG oslo_concurrency.lockutils [req-7171e206-6c75-4ace-9a55-81d2e03b861e req-e0942562-0dee-4c2a-9e98-82489a87a625 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.397 182717 DEBUG nova.network.neutron [req-7171e206-6c75-4ace-9a55-81d2e03b861e req-e0942562-0dee-4c2a-9e98-82489a87a625 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Refreshing network info cache for port 90c66d2f-b763-45aa-9b34-a83255b9d97b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.401 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Start _get_guest_xml network_info=[{"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.406 182717 WARNING nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.411 182717 DEBUG nova.virt.libvirt.host [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.412 182717 DEBUG nova.virt.libvirt.host [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.417 182717 DEBUG nova.virt.libvirt.host [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.418 182717 DEBUG nova.virt.libvirt.host [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.419 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.419 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.420 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.420 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.420 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.420 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.421 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.421 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.421 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.421 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.422 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.422 182717 DEBUG nova.virt.hardware [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.426 182717 DEBUG nova.virt.libvirt.vif [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:35:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1327765747',display_name='tempest-TestGettingAddress-server-1327765747',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1327765747',id=173,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-dt2fdt7m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:35:36Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=79afbcaf-6ef5-4db5-a05c-78bccd9f772a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.427 182717 DEBUG nova.network.os_vif_util [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.428 182717 DEBUG nova.network.os_vif_util [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:b8:cc,bridge_name='br-int',has_traffic_filtering=True,id=4a9813e7-d0e1-490f-a8ca-3138225442a8,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a9813e7-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.429 182717 DEBUG nova.virt.libvirt.vif [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:35:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1327765747',display_name='tempest-TestGettingAddress-server-1327765747',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1327765747',id=173,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-dt2fdt7m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:35:36Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=79afbcaf-6ef5-4db5-a05c-78bccd9f772a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.429 182717 DEBUG nova.network.os_vif_util [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.430 182717 DEBUG nova.network.os_vif_util [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:06:a1,bridge_name='br-int',has_traffic_filtering=True,id=90c66d2f-b763-45aa-9b34-a83255b9d97b,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c66d2f-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.431 182717 DEBUG nova.objects.instance [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 79afbcaf-6ef5-4db5-a05c-78bccd9f772a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.447 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:35:47 compute-1 nova_compute[182713]:   <uuid>79afbcaf-6ef5-4db5-a05c-78bccd9f772a</uuid>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   <name>instance-000000ad</name>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <nova:name>tempest-TestGettingAddress-server-1327765747</nova:name>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:35:47</nova:creationTime>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:35:47 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:35:47 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:35:47 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:35:47 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:35:47 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:35:47 compute-1 nova_compute[182713]:         <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 22 00:35:47 compute-1 nova_compute[182713]:         <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:35:47 compute-1 nova_compute[182713]:         <nova:port uuid="4a9813e7-d0e1-490f-a8ca-3138225442a8">
Jan 22 00:35:47 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:35:47 compute-1 nova_compute[182713]:         <nova:port uuid="90c66d2f-b763-45aa-9b34-a83255b9d97b">
Jan 22 00:35:47 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fee5:6a1" ipVersion="6"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fee5:6a1" ipVersion="6"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <system>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <entry name="serial">79afbcaf-6ef5-4db5-a05c-78bccd9f772a</entry>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <entry name="uuid">79afbcaf-6ef5-4db5-a05c-78bccd9f772a</entry>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     </system>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   <os>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   </os>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   <features>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   </features>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.config"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:7f:b8:cc"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <target dev="tap4a9813e7-d0"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:e5:06:a1"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <target dev="tap90c66d2f-b7"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/console.log" append="off"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <video>
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     </video>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:35:47 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:35:47 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:35:47 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:35:47 compute-1 nova_compute[182713]: </domain>
Jan 22 00:35:47 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.448 182717 DEBUG nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Preparing to wait for external event network-vif-plugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.448 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.449 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.449 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.449 182717 DEBUG nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Preparing to wait for external event network-vif-plugged-90c66d2f-b763-45aa-9b34-a83255b9d97b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.450 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.450 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.450 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.451 182717 DEBUG nova.virt.libvirt.vif [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:35:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1327765747',display_name='tempest-TestGettingAddress-server-1327765747',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1327765747',id=173,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-dt2fdt7m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:35:36Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=79afbcaf-6ef5-4db5-a05c-78bccd9f772a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.451 182717 DEBUG nova.network.os_vif_util [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.452 182717 DEBUG nova.network.os_vif_util [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:b8:cc,bridge_name='br-int',has_traffic_filtering=True,id=4a9813e7-d0e1-490f-a8ca-3138225442a8,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a9813e7-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.452 182717 DEBUG os_vif [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:b8:cc,bridge_name='br-int',has_traffic_filtering=True,id=4a9813e7-d0e1-490f-a8ca-3138225442a8,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a9813e7-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.453 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.453 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.454 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.459 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.459 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a9813e7-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.459 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4a9813e7-d0, col_values=(('external_ids', {'iface-id': '4a9813e7-d0e1-490f-a8ca-3138225442a8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:b8:cc', 'vm-uuid': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.461 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:47 compute-1 NetworkManager[54952]: <info>  [1769042147.4630] manager: (tap4a9813e7-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/327)
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.464 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.467 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.468 182717 INFO os_vif [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:b8:cc,bridge_name='br-int',has_traffic_filtering=True,id=4a9813e7-d0e1-490f-a8ca-3138225442a8,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a9813e7-d0')
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.469 182717 DEBUG nova.virt.libvirt.vif [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:35:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1327765747',display_name='tempest-TestGettingAddress-server-1327765747',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1327765747',id=173,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-dt2fdt7m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:35:36Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=79afbcaf-6ef5-4db5-a05c-78bccd9f772a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.469 182717 DEBUG nova.network.os_vif_util [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.469 182717 DEBUG nova.network.os_vif_util [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:06:a1,bridge_name='br-int',has_traffic_filtering=True,id=90c66d2f-b763-45aa-9b34-a83255b9d97b,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c66d2f-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.470 182717 DEBUG os_vif [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:06:a1,bridge_name='br-int',has_traffic_filtering=True,id=90c66d2f-b763-45aa-9b34-a83255b9d97b,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c66d2f-b7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.470 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.470 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.470 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.473 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.473 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90c66d2f-b7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.473 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap90c66d2f-b7, col_values=(('external_ids', {'iface-id': '90c66d2f-b763-45aa-9b34-a83255b9d97b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:06:a1', 'vm-uuid': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.474 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:47 compute-1 NetworkManager[54952]: <info>  [1769042147.4758] manager: (tap90c66d2f-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.476 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.486 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.487 182717 INFO os_vif [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:06:a1,bridge_name='br-int',has_traffic_filtering=True,id=90c66d2f-b763-45aa-9b34-a83255b9d97b,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c66d2f-b7')
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.550 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.551 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.551 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:7f:b8:cc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.551 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:e5:06:a1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.552 182717 INFO nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Using config drive
Jan 22 00:35:47 compute-1 podman[240786]: 2026-01-22 00:35:47.576574668 +0000 UTC m=+0.061847444 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:35:47 compute-1 podman[240787]: 2026-01-22 00:35:47.597634919 +0000 UTC m=+0.070500245 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.885 182717 INFO nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Creating config drive at /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.config
Jan 22 00:35:47 compute-1 nova_compute[182713]: 2026-01-22 00:35:47.895 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf04ia3em execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.039 182717 DEBUG oslo_concurrency.processutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf04ia3em" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:35:48 compute-1 NetworkManager[54952]: <info>  [1769042148.1282] manager: (tap4a9813e7-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/329)
Jan 22 00:35:48 compute-1 kernel: tap4a9813e7-d0: entered promiscuous mode
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.134 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:48 compute-1 ovn_controller[94841]: 2026-01-22T00:35:48Z|00693|binding|INFO|Claiming lport 4a9813e7-d0e1-490f-a8ca-3138225442a8 for this chassis.
Jan 22 00:35:48 compute-1 ovn_controller[94841]: 2026-01-22T00:35:48Z|00694|binding|INFO|4a9813e7-d0e1-490f-a8ca-3138225442a8: Claiming fa:16:3e:7f:b8:cc 10.100.0.4
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.152 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:48 compute-1 NetworkManager[54952]: <info>  [1769042148.1557] manager: (tap90c66d2f-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/330)
Jan 22 00:35:48 compute-1 kernel: tap90c66d2f-b7: entered promiscuous mode
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.167 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:48 compute-1 ovn_controller[94841]: 2026-01-22T00:35:48Z|00695|if_status|INFO|Not updating pb chassis for 90c66d2f-b763-45aa-9b34-a83255b9d97b now as sb is readonly
Jan 22 00:35:48 compute-1 ovn_controller[94841]: 2026-01-22T00:35:48Z|00696|binding|INFO|Claiming lport 90c66d2f-b763-45aa-9b34-a83255b9d97b for this chassis.
Jan 22 00:35:48 compute-1 ovn_controller[94841]: 2026-01-22T00:35:48Z|00697|binding|INFO|90c66d2f-b763-45aa-9b34-a83255b9d97b: Claiming fa:16:3e:e5:06:a1 2001:db8:0:1:f816:3eff:fee5:6a1 2001:db8::f816:3eff:fee5:6a1
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.175 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:b8:cc 10.100.0.4'], port_security=['fa:16:3e:7f:b8:cc 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a02fb4eb-eda5-4559-8b41-ffe0af33e841', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9c9c72d4-43bc-43b5-af16-0875792fba89, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=4a9813e7-d0e1-490f-a8ca-3138225442a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.177 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 4a9813e7-d0e1-490f-a8ca-3138225442a8 in datapath 9d3a0d92-0a01-43e0-bbe5-a677082b8f1b bound to our chassis
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.180 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9d3a0d92-0a01-43e0-bbe5-a677082b8f1b
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.193 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:06:a1 2001:db8:0:1:f816:3eff:fee5:6a1 2001:db8::f816:3eff:fee5:6a1'], port_security=['fa:16:3e:e5:06:a1 2001:db8:0:1:f816:3eff:fee5:6a1 2001:db8::f816:3eff:fee5:6a1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fee5:6a1/64 2001:db8::f816:3eff:fee5:6a1/64', 'neutron:device_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a02fb4eb-eda5-4559-8b41-ffe0af33e841', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d6cac94-5c44-44de-a872-7bf42948d910, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=90c66d2f-b763-45aa-9b34-a83255b9d97b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:35:48 compute-1 systemd-udevd[240849]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:35:48 compute-1 systemd-udevd[240850]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.207 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e13198ef-1946-4423-9901-20f443ce0274]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.208 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9d3a0d92-01 in ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:35:48 compute-1 systemd-machined[153970]: New machine qemu-75-instance-000000ad.
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.211 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9d3a0d92-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:35:48 compute-1 NetworkManager[54952]: <info>  [1769042148.2122] device (tap90c66d2f-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.211 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[1a166ca7-41f7-4163-9913-64a36e39d6fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 NetworkManager[54952]: <info>  [1769042148.2130] device (tap90c66d2f-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:35:48 compute-1 NetworkManager[54952]: <info>  [1769042148.2136] device (tap4a9813e7-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.213 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[962f5f1f-046a-4776-a1a5-77e5e7f19298]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 NetworkManager[54952]: <info>  [1769042148.2143] device (tap4a9813e7-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.232 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca593d8-dd53-4665-835f-e804bbeb86d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 systemd[1]: Started Virtual Machine qemu-75-instance-000000ad.
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.262 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.263 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c16706ad-00cf-4287-881b-86a8ae0ecf35]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 ovn_controller[94841]: 2026-01-22T00:35:48Z|00698|binding|INFO|Setting lport 4a9813e7-d0e1-490f-a8ca-3138225442a8 ovn-installed in OVS
Jan 22 00:35:48 compute-1 ovn_controller[94841]: 2026-01-22T00:35:48Z|00699|binding|INFO|Setting lport 4a9813e7-d0e1-490f-a8ca-3138225442a8 up in Southbound
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.269 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:48 compute-1 ovn_controller[94841]: 2026-01-22T00:35:48Z|00700|binding|INFO|Setting lport 90c66d2f-b763-45aa-9b34-a83255b9d97b ovn-installed in OVS
Jan 22 00:35:48 compute-1 ovn_controller[94841]: 2026-01-22T00:35:48Z|00701|binding|INFO|Setting lport 90c66d2f-b763-45aa-9b34-a83255b9d97b up in Southbound
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.279 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.298 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[9cbe16dd-0748-4818-9315-75f4b4187941]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.302 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b47418f5-c948-4184-82de-62ddc27082e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 NetworkManager[54952]: <info>  [1769042148.3041] manager: (tap9d3a0d92-00): new Veth device (/org/freedesktop/NetworkManager/Devices/331)
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.336 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[82a14429-3597-4c39-bd71-13edab9239f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.339 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7b66e2c0-30d2-4203-a70a-9629a4372307]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 NetworkManager[54952]: <info>  [1769042148.3648] device (tap9d3a0d92-00): carrier: link connected
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.371 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[f68b6db9-eb5d-487b-b9be-ecf13d5f6401]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.390 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e2598749-ab4b-4487-84ae-65f00964a8bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9d3a0d92-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:b0:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672101, 'reachable_time': 40638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240885, 'error': None, 'target': 'ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.407 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef0824f-8ede-4d66-8709-a85140fdd53e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:b0af'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 672101, 'tstamp': 672101}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240886, 'error': None, 'target': 'ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.430 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5417a7ba-882a-4935-92da-78ec3a7728a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9d3a0d92-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:b0:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672101, 'reachable_time': 40638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240887, 'error': None, 'target': 'ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.467 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3d45437f-fe28-4f5f-902c-b09d1198684d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.538 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b80cdb4d-021e-4b27-9296-87ffa31b5fab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.539 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d3a0d92-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.540 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.540 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9d3a0d92-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:48 compute-1 NetworkManager[54952]: <info>  [1769042148.5425] manager: (tap9d3a0d92-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/332)
Jan 22 00:35:48 compute-1 kernel: tap9d3a0d92-00: entered promiscuous mode
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.542 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.545 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.546 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9d3a0d92-00, col_values=(('external_ids', {'iface-id': '285533c3-11fb-4871-bfe4-af8cc3d787e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:48 compute-1 ovn_controller[94841]: 2026-01-22T00:35:48Z|00702|binding|INFO|Releasing lport 285533c3-11fb-4871-bfe4-af8cc3d787e8 from this chassis (sb_readonly=0)
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.569 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.571 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9d3a0d92-0a01-43e0-bbe5-a677082b8f1b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9d3a0d92-0a01-43e0-bbe5-a677082b8f1b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.573 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a9870f73-031e-4b3a-988b-488770031f89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.574 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/9d3a0d92-0a01-43e0-bbe5-a677082b8f1b.pid.haproxy
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 9d3a0d92-0a01-43e0-bbe5-a677082b8f1b
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:35:48 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:48.575 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'env', 'PROCESS_TAG=haproxy-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9d3a0d92-0a01-43e0-bbe5-a677082b8f1b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.658 182717 DEBUG nova.compute.manager [req-f54f11bd-b650-4857-807c-173ce66edb28 req-c847effc-ffe7-4931-93fd-a610a6e2deed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-vif-plugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.668 182717 DEBUG oslo_concurrency.lockutils [req-f54f11bd-b650-4857-807c-173ce66edb28 req-c847effc-ffe7-4931-93fd-a610a6e2deed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.669 182717 DEBUG oslo_concurrency.lockutils [req-f54f11bd-b650-4857-807c-173ce66edb28 req-c847effc-ffe7-4931-93fd-a610a6e2deed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.669 182717 DEBUG oslo_concurrency.lockutils [req-f54f11bd-b650-4857-807c-173ce66edb28 req-c847effc-ffe7-4931-93fd-a610a6e2deed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.670 182717 DEBUG nova.compute.manager [req-f54f11bd-b650-4857-807c-173ce66edb28 req-c847effc-ffe7-4931-93fd-a610a6e2deed 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Processing event network-vif-plugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.811 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042148.8108926, 79afbcaf-6ef5-4db5-a05c-78bccd9f772a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.812 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] VM Started (Lifecycle Event)
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.861 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.866 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042148.8110397, 79afbcaf-6ef5-4db5-a05c-78bccd9f772a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.867 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] VM Paused (Lifecycle Event)
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.891 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.895 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:35:48 compute-1 nova_compute[182713]: 2026-01-22 00:35:48.926 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:35:49 compute-1 podman[240927]: 2026-01-22 00:35:49.025360846 +0000 UTC m=+0.068461631 container create ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:35:49 compute-1 podman[240927]: 2026-01-22 00:35:48.986165185 +0000 UTC m=+0.029266000 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:35:49 compute-1 systemd[1]: Started libpod-conmon-ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4.scope.
Jan 22 00:35:49 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:35:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/108f8f9a78426956c8ce866ece23d16367d5b61a498e5a95c7a9be8ff948d08d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:35:49 compute-1 podman[240927]: 2026-01-22 00:35:49.129830318 +0000 UTC m=+0.172931103 container init ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:35:49 compute-1 podman[240927]: 2026-01-22 00:35:49.136470627 +0000 UTC m=+0.179571392 container start ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 00:35:49 compute-1 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[240942]: [NOTICE]   (240946) : New worker (240948) forked
Jan 22 00:35:49 compute-1 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[240942]: [NOTICE]   (240946) : Loading success.
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.203 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 90c66d2f-b763-45aa-9b34-a83255b9d97b in datapath 041654ff-0c5d-4cd2-89f6-0863dbbf44a8 unbound from our chassis
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.207 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 041654ff-0c5d-4cd2-89f6-0863dbbf44a8
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.221 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b9939952-8707-4827-85ea-6985b1c0b615]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.222 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap041654ff-01 in ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.225 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap041654ff-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.225 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f8255bd3-1bd3-41c5-b2cc-9f75068d2591]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.226 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[423014e3-62a1-42ba-aa2d-fc52bec15893]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.238 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[dd247a28-acb2-43bd-8962-d2a373b27dfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.255 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9f37d912-0d56-470e-bb50-a9c60cdd036e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.298 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[dbe156b4-2d7a-41e2-9abb-f73b35299555]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 systemd-udevd[240877]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.306 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4bff68e0-d013-47ad-8b5f-94c74cd94045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 NetworkManager[54952]: <info>  [1769042149.3081] manager: (tap041654ff-00): new Veth device (/org/freedesktop/NetworkManager/Devices/333)
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.349 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c24a904d-aecd-4c2f-b3f9-6d3399e19e2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.354 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[09a30b9b-0f16-4632-9dab-0a736e651732]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 NetworkManager[54952]: <info>  [1769042149.3865] device (tap041654ff-00): carrier: link connected
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.395 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[52acf38a-771c-47f7-a7d8-5f5fc9eb3576]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.424 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6b276edf-1348-4dd5-8ca7-2616a65016d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap041654ff-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:32:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672203, 'reachable_time': 33771, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240967, 'error': None, 'target': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.449 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2aca38c2-707a-4041-937e-44acd1a4642e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe17:32e3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 672203, 'tstamp': 672203}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240968, 'error': None, 'target': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.477 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[55712937-6d19-49d4-8c45-d5a3df983619]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap041654ff-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:32:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672203, 'reachable_time': 33771, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240969, 'error': None, 'target': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.524 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b35535d8-2526-4ce5-be3a-9924466c1189]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.574 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[08002a63-dd3e-455a-9821-07ad427651db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.576 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap041654ff-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.577 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.577 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap041654ff-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:49 compute-1 nova_compute[182713]: 2026-01-22 00:35:49.580 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:49 compute-1 NetworkManager[54952]: <info>  [1769042149.5812] manager: (tap041654ff-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Jan 22 00:35:49 compute-1 kernel: tap041654ff-00: entered promiscuous mode
Jan 22 00:35:49 compute-1 nova_compute[182713]: 2026-01-22 00:35:49.584 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.585 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap041654ff-00, col_values=(('external_ids', {'iface-id': '44d0292d-d743-4a92-8996-3ae3a26c0afc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:49 compute-1 nova_compute[182713]: 2026-01-22 00:35:49.587 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:49 compute-1 ovn_controller[94841]: 2026-01-22T00:35:49Z|00703|binding|INFO|Releasing lport 44d0292d-d743-4a92-8996-3ae3a26c0afc from this chassis (sb_readonly=0)
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.616 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/041654ff-0c5d-4cd2-89f6-0863dbbf44a8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/041654ff-0c5d-4cd2-89f6-0863dbbf44a8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:35:49 compute-1 nova_compute[182713]: 2026-01-22 00:35:49.614 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.617 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[db991a0f-6142-4532-b91d-898b69ef4653]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.618 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-041654ff-0c5d-4cd2-89f6-0863dbbf44a8
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/041654ff-0c5d-4cd2-89f6-0863dbbf44a8.pid.haproxy
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 041654ff-0c5d-4cd2-89f6-0863dbbf44a8
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:35:49 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:35:49.619 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'env', 'PROCESS_TAG=haproxy-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/041654ff-0c5d-4cd2-89f6-0863dbbf44a8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:35:49 compute-1 nova_compute[182713]: 2026-01-22 00:35:49.951 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:50 compute-1 podman[240999]: 2026-01-22 00:35:50.081301445 +0000 UTC m=+0.073585273 container create 99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 00:35:50 compute-1 systemd[1]: Started libpod-conmon-99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111.scope.
Jan 22 00:35:50 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:35:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5952f42460ce485b188c5c4ad16b60f53bc8fe7442b45d62e9240739811d26/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:35:50 compute-1 podman[240999]: 2026-01-22 00:35:50.045052316 +0000 UTC m=+0.037336234 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:35:50 compute-1 podman[240999]: 2026-01-22 00:35:50.148899918 +0000 UTC m=+0.141183766 container init 99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:35:50 compute-1 podman[240999]: 2026-01-22 00:35:50.153949636 +0000 UTC m=+0.146233454 container start 99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.158 182717 DEBUG nova.network.neutron [req-7171e206-6c75-4ace-9a55-81d2e03b861e req-e0942562-0dee-4c2a-9e98-82489a87a625 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updated VIF entry in instance network info cache for port 90c66d2f-b763-45aa-9b34-a83255b9d97b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.158 182717 DEBUG nova.network.neutron [req-7171e206-6c75-4ace-9a55-81d2e03b861e req-e0942562-0dee-4c2a-9e98-82489a87a625 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updating instance_info_cache with network_info: [{"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:35:50 compute-1 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[241015]: [NOTICE]   (241019) : New worker (241021) forked
Jan 22 00:35:50 compute-1 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[241015]: [NOTICE]   (241019) : Loading success.
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.294 182717 DEBUG oslo_concurrency.lockutils [req-7171e206-6c75-4ace-9a55-81d2e03b861e req-e0942562-0dee-4c2a-9e98-82489a87a625 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.822 182717 DEBUG nova.compute.manager [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-vif-plugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.822 182717 DEBUG oslo_concurrency.lockutils [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.823 182717 DEBUG oslo_concurrency.lockutils [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.824 182717 DEBUG oslo_concurrency.lockutils [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.824 182717 DEBUG nova.compute.manager [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] No event matching network-vif-plugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 in dict_keys([('network-vif-plugged', '90c66d2f-b763-45aa-9b34-a83255b9d97b')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.825 182717 WARNING nova.compute.manager [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received unexpected event network-vif-plugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 for instance with vm_state building and task_state spawning.
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.826 182717 DEBUG nova.compute.manager [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-vif-plugged-90c66d2f-b763-45aa-9b34-a83255b9d97b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.826 182717 DEBUG oslo_concurrency.lockutils [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.827 182717 DEBUG oslo_concurrency.lockutils [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.827 182717 DEBUG oslo_concurrency.lockutils [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.828 182717 DEBUG nova.compute.manager [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Processing event network-vif-plugged-90c66d2f-b763-45aa-9b34-a83255b9d97b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.828 182717 DEBUG nova.compute.manager [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-vif-plugged-90c66d2f-b763-45aa-9b34-a83255b9d97b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.829 182717 DEBUG oslo_concurrency.lockutils [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.829 182717 DEBUG oslo_concurrency.lockutils [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.830 182717 DEBUG oslo_concurrency.lockutils [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.830 182717 DEBUG nova.compute.manager [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] No waiting events found dispatching network-vif-plugged-90c66d2f-b763-45aa-9b34-a83255b9d97b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.831 182717 WARNING nova.compute.manager [req-d19dfa11-2797-407a-ae14-d63153140d27 req-6f531a1f-d2ae-4307-9eb3-51d1bea5f955 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received unexpected event network-vif-plugged-90c66d2f-b763-45aa-9b34-a83255b9d97b for instance with vm_state building and task_state spawning.
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.832 182717 DEBUG nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.838 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042150.8380342, 79afbcaf-6ef5-4db5-a05c-78bccd9f772a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.839 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] VM Resumed (Lifecycle Event)
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.842 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.848 182717 INFO nova.virt.libvirt.driver [-] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Instance spawned successfully.
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.849 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.864 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.868 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.879 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.880 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.881 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.881 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.882 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.883 182717 DEBUG nova.virt.libvirt.driver [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:35:50 compute-1 nova_compute[182713]: 2026-01-22 00:35:50.967 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:35:51 compute-1 nova_compute[182713]: 2026-01-22 00:35:51.031 182717 INFO nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Took 14.15 seconds to spawn the instance on the hypervisor.
Jan 22 00:35:51 compute-1 nova_compute[182713]: 2026-01-22 00:35:51.032 182717 DEBUG nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:35:52 compute-1 nova_compute[182713]: 2026-01-22 00:35:52.316 182717 INFO nova.compute.manager [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Took 15.94 seconds to build instance.
Jan 22 00:35:52 compute-1 nova_compute[182713]: 2026-01-22 00:35:52.366 182717 DEBUG oslo_concurrency.lockutils [None req-9311732a-61f8-423e-850c-57479ca0bc2e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:52 compute-1 nova_compute[182713]: 2026-01-22 00:35:52.475 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:54 compute-1 nova_compute[182713]: 2026-01-22 00:35:54.951 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:57 compute-1 nova_compute[182713]: 2026-01-22 00:35:57.477 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:59 compute-1 nova_compute[182713]: 2026-01-22 00:35:59.954 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:00 compute-1 podman[241030]: 2026-01-22 00:36:00.579768045 +0000 UTC m=+0.063661180 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:36:00 compute-1 nova_compute[182713]: 2026-01-22 00:36:00.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:01 compute-1 nova_compute[182713]: 2026-01-22 00:36:01.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:01 compute-1 nova_compute[182713]: 2026-01-22 00:36:01.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:01 compute-1 nova_compute[182713]: 2026-01-22 00:36:01.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:01 compute-1 nova_compute[182713]: 2026-01-22 00:36:01.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:36:02 compute-1 nova_compute[182713]: 2026-01-22 00:36:02.480 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:02 compute-1 ovn_controller[94841]: 2026-01-22T00:36:02Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7f:b8:cc 10.100.0.4
Jan 22 00:36:02 compute-1 ovn_controller[94841]: 2026-01-22T00:36:02Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:b8:cc 10.100.0.4
Jan 22 00:36:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:36:03.048 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:36:03.049 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:36:03.050 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:03 compute-1 podman[241061]: 2026-01-22 00:36:03.576223508 +0000 UTC m=+0.068751861 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9-minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 00:36:03 compute-1 nova_compute[182713]: 2026-01-22 00:36:03.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:04 compute-1 nova_compute[182713]: 2026-01-22 00:36:04.956 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:06 compute-1 nova_compute[182713]: 2026-01-22 00:36:06.772 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:06 compute-1 ovn_controller[94841]: 2026-01-22T00:36:06Z|00704|binding|INFO|Releasing lport 44d0292d-d743-4a92-8996-3ae3a26c0afc from this chassis (sb_readonly=0)
Jan 22 00:36:06 compute-1 ovn_controller[94841]: 2026-01-22T00:36:06Z|00705|binding|INFO|Releasing lport 285533c3-11fb-4871-bfe4-af8cc3d787e8 from this chassis (sb_readonly=0)
Jan 22 00:36:06 compute-1 nova_compute[182713]: 2026-01-22 00:36:06.777 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:06 compute-1 NetworkManager[54952]: <info>  [1769042166.7791] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Jan 22 00:36:06 compute-1 NetworkManager[54952]: <info>  [1769042166.7802] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/336)
Jan 22 00:36:06 compute-1 ovn_controller[94841]: 2026-01-22T00:36:06Z|00706|binding|INFO|Releasing lport 44d0292d-d743-4a92-8996-3ae3a26c0afc from this chassis (sb_readonly=0)
Jan 22 00:36:06 compute-1 ovn_controller[94841]: 2026-01-22T00:36:06Z|00707|binding|INFO|Releasing lport 285533c3-11fb-4871-bfe4-af8cc3d787e8 from this chassis (sb_readonly=0)
Jan 22 00:36:06 compute-1 nova_compute[182713]: 2026-01-22 00:36:06.808 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:06 compute-1 nova_compute[182713]: 2026-01-22 00:36:06.815 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:07 compute-1 nova_compute[182713]: 2026-01-22 00:36:07.248 182717 DEBUG nova.compute.manager [req-4d4e17e6-7ac2-4ea2-9887-f28fd2b4f04b req-2034374d-8437-4a1f-bc6e-20d1b66d121b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-changed-4a9813e7-d0e1-490f-a8ca-3138225442a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:36:07 compute-1 nova_compute[182713]: 2026-01-22 00:36:07.249 182717 DEBUG nova.compute.manager [req-4d4e17e6-7ac2-4ea2-9887-f28fd2b4f04b req-2034374d-8437-4a1f-bc6e-20d1b66d121b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Refreshing instance network info cache due to event network-changed-4a9813e7-d0e1-490f-a8ca-3138225442a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:36:07 compute-1 nova_compute[182713]: 2026-01-22 00:36:07.249 182717 DEBUG oslo_concurrency.lockutils [req-4d4e17e6-7ac2-4ea2-9887-f28fd2b4f04b req-2034374d-8437-4a1f-bc6e-20d1b66d121b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:36:07 compute-1 nova_compute[182713]: 2026-01-22 00:36:07.250 182717 DEBUG oslo_concurrency.lockutils [req-4d4e17e6-7ac2-4ea2-9887-f28fd2b4f04b req-2034374d-8437-4a1f-bc6e-20d1b66d121b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:36:07 compute-1 nova_compute[182713]: 2026-01-22 00:36:07.250 182717 DEBUG nova.network.neutron [req-4d4e17e6-7ac2-4ea2-9887-f28fd2b4f04b req-2034374d-8437-4a1f-bc6e-20d1b66d121b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Refreshing network info cache for port 4a9813e7-d0e1-490f-a8ca-3138225442a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:36:07 compute-1 nova_compute[182713]: 2026-01-22 00:36:07.481 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:09 compute-1 nova_compute[182713]: 2026-01-22 00:36:09.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:09 compute-1 nova_compute[182713]: 2026-01-22 00:36:09.993 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:10 compute-1 nova_compute[182713]: 2026-01-22 00:36:10.245 182717 DEBUG nova.network.neutron [req-4d4e17e6-7ac2-4ea2-9887-f28fd2b4f04b req-2034374d-8437-4a1f-bc6e-20d1b66d121b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updated VIF entry in instance network info cache for port 4a9813e7-d0e1-490f-a8ca-3138225442a8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:36:10 compute-1 nova_compute[182713]: 2026-01-22 00:36:10.245 182717 DEBUG nova.network.neutron [req-4d4e17e6-7ac2-4ea2-9887-f28fd2b4f04b req-2034374d-8437-4a1f-bc6e-20d1b66d121b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updating instance_info_cache with network_info: [{"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:36:10 compute-1 nova_compute[182713]: 2026-01-22 00:36:10.301 182717 DEBUG oslo_concurrency.lockutils [req-4d4e17e6-7ac2-4ea2-9887-f28fd2b4f04b req-2034374d-8437-4a1f-bc6e-20d1b66d121b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:36:10 compute-1 nova_compute[182713]: 2026-01-22 00:36:10.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:12 compute-1 nova_compute[182713]: 2026-01-22 00:36:12.483 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:12 compute-1 podman[241086]: 2026-01-22 00:36:12.566063519 +0000 UTC m=+0.055441163 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:36:12 compute-1 podman[241085]: 2026-01-22 00:36:12.575413042 +0000 UTC m=+0.069151163 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.108 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.109 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.110 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.110 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.197 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.282 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.284 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.378 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.585 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.587 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5525MB free_disk=73.16400527954102GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.587 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.588 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.706 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 79afbcaf-6ef5-4db5-a05c-78bccd9f772a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.707 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.707 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.757 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.780 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.809 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:36:13 compute-1 nova_compute[182713]: 2026-01-22 00:36:13.810 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:14 compute-1 nova_compute[182713]: 2026-01-22 00:36:14.811 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:14 compute-1 nova_compute[182713]: 2026-01-22 00:36:14.811 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:36:14 compute-1 nova_compute[182713]: 2026-01-22 00:36:14.812 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:36:14 compute-1 nova_compute[182713]: 2026-01-22 00:36:14.996 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:15 compute-1 nova_compute[182713]: 2026-01-22 00:36:15.063 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:36:15 compute-1 nova_compute[182713]: 2026-01-22 00:36:15.063 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:36:15 compute-1 nova_compute[182713]: 2026-01-22 00:36:15.064 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:36:15 compute-1 nova_compute[182713]: 2026-01-22 00:36:15.064 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 79afbcaf-6ef5-4db5-a05c-78bccd9f772a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:36:17 compute-1 nova_compute[182713]: 2026-01-22 00:36:17.485 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:18 compute-1 podman[241141]: 2026-01-22 00:36:18.602952563 +0000 UTC m=+0.080514129 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:36:18 compute-1 podman[241142]: 2026-01-22 00:36:18.608047863 +0000 UTC m=+0.086931411 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:36:19 compute-1 nova_compute[182713]: 2026-01-22 00:36:19.600 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updating instance_info_cache with network_info: [{"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:36:19 compute-1 nova_compute[182713]: 2026-01-22 00:36:19.640 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:36:19 compute-1 nova_compute[182713]: 2026-01-22 00:36:19.641 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:36:19 compute-1 nova_compute[182713]: 2026-01-22 00:36:19.641 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:19 compute-1 nova_compute[182713]: 2026-01-22 00:36:19.998 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:36:20.386 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:36:20 compute-1 nova_compute[182713]: 2026-01-22 00:36:20.387 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:20 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:36:20.388 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:36:22 compute-1 nova_compute[182713]: 2026-01-22 00:36:22.488 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.898 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'name': 'tempest-TestGettingAddress-server-1327765747', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000ad', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '837db8748d074b3c9179b47d30e7a1d4', 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'hostId': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.901 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.918 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.allocation volume: 30023680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.920 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23b98e84-71a5-4b3e-b02e-b1327247e0b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30023680, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-vda', 'timestamp': '2026-01-22T00:36:22.902110', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '608f971e-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.609841881, 'message_signature': '2ea72343492113d12354dad7e2d65c7912408cf2760d756c2681c4bdfd789c3f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'79afbcaf-6ef5-4db5-a05c-78bccd9f772a-sda', 'timestamp': '2026-01-22T00:36:22.902110', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '608fc86a-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.609841881, 'message_signature': '878b1b45e4d7f3b2b01ec364d0610ff3dc169bc1ea78b733fa56e8151930b95e'}]}, 'timestamp': '2026-01-22 00:36:22.921415', '_unique_id': 'c9f230c7e0bb4264aabd82ff4c6e6d69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.929 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.964 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.read.latency volume: 204815853 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.965 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.read.latency volume: 31200541 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55078525-dfec-48d4-9129-c810369477b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 204815853, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-vda', 'timestamp': '2026-01-22T00:36:22.932311', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6096843e-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': '1d6deae577adba627393fd180d9ffa30fb98e5db1d26d76b4457cf512c4917f9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31200541, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-sda', 'timestamp': '2026-01-22T00:36:22.932311', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '60969e06-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': '8fdf05c96617737eb5767bffc485df443d3d6b7ea7bb15d8fd8a752c61d74fe9'}]}, 'timestamp': '2026-01-22 00:36:22.965773', '_unique_id': '610186d613bf4ae2bcc723b55fc7b01a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.967 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.968 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.971 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 79afbcaf-6ef5-4db5-a05c-78bccd9f772a / tap4a9813e7-d0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.972 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 79afbcaf-6ef5-4db5-a05c-78bccd9f772a / tap90c66d2f-b7 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.972 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.973 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.outgoing.bytes volume: 2620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bf7474f-21fc-4b9e-af77-6cf247fd9700', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap4a9813e7-d0', 'timestamp': '2026-01-22T00:36:22.968356', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap4a9813e7-d0', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:b8:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a9813e7-d0'}, 'message_id': '6097bad4-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '390ac8ae41532d0ac4cf10a3826a3b44ca9a7e6503f937968c4e4f200de30d96'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2620, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap90c66d2f-b7', 'timestamp': '2026-01-22T00:36:22.968356', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap90c66d2f-b7', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:06:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90c66d2f-b7'}, 'message_id': '6097cae2-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': 'bc7dff2cbfc707c00c1cf23295aba5ac56fe9242294c4f362696704d8c26096c'}]}, 'timestamp': '2026-01-22 00:36:22.973473', '_unique_id': '4e49b604842243289253523b9429c98d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.974 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.975 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:36:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.997 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/cpu volume: 11860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1aa1564e-f735-420a-8fe7-9500adb2f144', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11860000000, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'timestamp': '2026-01-22T00:36:22.975688', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '609b95dc-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.704747072, 'message_signature': '43e8954beed27b4d3ae49937fe11a7ad9790e6406881b2507720a604880c16d5'}]}, 'timestamp': '2026-01-22 00:36:22.998429', '_unique_id': 'ace214f65b4a44409b12cb238affd6bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:22.999 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.000 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.001 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.write.bytes volume: 72929280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.001 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4da1b0a-60af-4d43-90f3-c0d3d5229ac2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72929280, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-vda', 'timestamp': '2026-01-22T00:36:23.001004', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '609c0c42-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': 'b088ed28205fc745d3321d5f1b2991b840cbf8d76269e1938480cab44ec388da'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-sda', 'timestamp': '2026-01-22T00:36:23.001004', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '609c184a-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': '8130fd50f016f8ac6a9731a714e705c8851e589f447acd6692925605850f4710'}]}, 'timestamp': '2026-01-22 00:36:23.001647', '_unique_id': 'dbfa4077ed4f47b38c34630ec8a016d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.002 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.003 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.003 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.read.bytes volume: 29645312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.003 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b190b7ba-10a6-417a-8aab-b50462adc87c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29645312, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-vda', 'timestamp': '2026-01-22T00:36:23.003434', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '609c6a66-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': '088db03a815836004f2d6b1da57b460351d0014e6798aae9aac9f8c8d2971c2d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-sda', 'timestamp': '2026-01-22T00:36:23.003434', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '609c78f8-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': '4621bb0d4b4f4e912df65f5c9a2c833588caab34becd5248bfeb05f2aaa3476e'}]}, 'timestamp': '2026-01-22 00:36:23.004174', '_unique_id': '86b0ededc22f4ec2b0523c758c609d4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.005 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.006 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/memory.usage volume: 47.796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7cffa9b-11d6-45a9-8320-58e23f9da740', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 47.796875, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'timestamp': '2026-01-22T00:36:23.005974', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '609ccdda-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.704747072, 'message_signature': '5eaca56abf9934de6a5e17aa1dfe8ec38f3c769d91721126beab5166254be7a2'}]}, 'timestamp': '2026-01-22 00:36:23.006305', '_unique_id': 'bf2880e28da54a3d82fc72a6bcc9a822'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.007 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.008 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.008 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22528c76-390b-43d7-8f27-ebfb02ec88fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap4a9813e7-d0', 'timestamp': '2026-01-22T00:36:23.008241', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap4a9813e7-d0', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:b8:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a9813e7-d0'}, 'message_id': '609d28c0-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '603451c60d00a199534e422a622443f9ed7de3cb68c561c5e5ea25062503de3a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap90c66d2f-b7', 'timestamp': '2026-01-22T00:36:23.008241', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap90c66d2f-b7', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:06:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90c66d2f-b7'}, 'message_id': '609d3586-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '77931dbf5dc9a14adee427480727e8a45ddba04523a0998eb2d8f25bafe4e96e'}]}, 'timestamp': '2026-01-22 00:36:23.008985', '_unique_id': 'd474c896ecac454c910f42ccff79add7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.009 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.010 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.010 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.010 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.outgoing.packets volume: 24 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e156298d-1b0b-4d2f-862d-80ddd6713dc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap4a9813e7-d0', 'timestamp': '2026-01-22T00:36:23.010615', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap4a9813e7-d0', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:b8:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a9813e7-d0'}, 'message_id': '609d8310-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '64d3549067e41ee6cdf54962e52911d428d491b4614f4e247e921b26fe4d6fc2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 24, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap90c66d2f-b7', 'timestamp': '2026-01-22T00:36:23.010615', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap90c66d2f-b7', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:06:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90c66d2f-b7'}, 'message_id': '609d90da-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '4baa3666c8ce25d0b38c9f7d0617a633aa87e6e744adce325a7c3aa16c1608d7'}]}, 'timestamp': '2026-01-22 00:36:23.011298', '_unique_id': 'abba3bcf60764f94a94a4aa86dff6b41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.011 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.012 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.013 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a7f7c1d-eefa-4235-a20d-97e04d4ad5db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap4a9813e7-d0', 'timestamp': '2026-01-22T00:36:23.012926', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap4a9813e7-d0', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:b8:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a9813e7-d0'}, 'message_id': '609ddf40-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': 'f5627e1d8edf2a9e77a1fbacf7aceda1b1c7582ad7c6a744fee34da19e115168'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap90c66d2f-b7', 'timestamp': '2026-01-22T00:36:23.012926', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap90c66d2f-b7', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:06:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90c66d2f-b7'}, 'message_id': '609deeae-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': 'f53766aa004e12a668f151b304e50ece17465675bdc11f81d8d163979a20d9c8'}]}, 'timestamp': '2026-01-22 00:36:23.013700', '_unique_id': 'd9966c9118a743a0b27f431fb3a40f3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.014 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.015 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.015 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1327765747>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1327765747>]
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.016 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.016 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1327765747>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1327765747>]
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.016 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.incoming.packets volume: 23 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.017 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4d8b8e0-7851-4a03-938b-302fa549da72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 23, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap4a9813e7-d0', 'timestamp': '2026-01-22T00:36:23.016648', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap4a9813e7-d0', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:b8:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a9813e7-d0'}, 'message_id': '609e70f4-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': 'e12b3a4cca1d8c5f381c10e334f378de3684bd953580a1eac9858a5b598ed593'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap90c66d2f-b7', 'timestamp': '2026-01-22T00:36:23.016648', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap90c66d2f-b7', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:06:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90c66d2f-b7'}, 'message_id': '609e7f7c-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': 'c42fcb793a2d058db9ea70655236b4a5c5188999ee824dab9966eadc16157952'}]}, 'timestamp': '2026-01-22 00:36:23.017419', '_unique_id': '1ecb2830de874e4c8c5185d1c47a70c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.018 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.019 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.incoming.bytes volume: 3973 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.019 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.incoming.bytes volume: 772 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b892f8ac-6fae-44b0-9a5b-8512e9943ca5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3973, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap4a9813e7-d0', 'timestamp': '2026-01-22T00:36:23.019233', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap4a9813e7-d0', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:b8:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a9813e7-d0'}, 'message_id': '609ed382-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': 'f39c7f2d26d984b57e0e3199ab4d3552c9a4d8afa66e2c2fc74a5757069d5d87'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 772, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap90c66d2f-b7', 'timestamp': '2026-01-22T00:36:23.019233', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap90c66d2f-b7', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:06:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90c66d2f-b7'}, 'message_id': '609edfbc-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '6f8ff0bc662ddd096a9729b8d08199f911a63e890cf5ed2f731f7044eefd326c'}]}, 'timestamp': '2026-01-22 00:36:23.019894', '_unique_id': '601633b65a1f48e1b8c748dba87e483a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.020 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.021 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.021 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.022 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1327765747>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1327765747>]
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.022 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.022 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.022 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15f56261-ba92-4be9-b024-b93693265c78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-vda', 'timestamp': '2026-01-22T00:36:23.022311', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '609f4baa-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.609841881, 'message_signature': '7422ef5ddfa6839696a11fe11e2a388d4d6e37a2706dc4eaad5bc2bfe677baf1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'79afbcaf-6ef5-4db5-a05c-78bccd9f772a-sda', 'timestamp': '2026-01-22T00:36:23.022311', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '609f5898-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.609841881, 'message_signature': 'db8892746447756c1651232117f7d0258d7f397c8a1c50c0e8352d6459f57f79'}]}, 'timestamp': '2026-01-22 00:36:23.022976', '_unique_id': 'e0ff9fe9fdf1446f97349c1ccf94b0ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.023 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.024 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.024 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.write.latency volume: 2139782544 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.024 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f0a7d95-645b-4901-84c9-95e2e6793373', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2139782544, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-vda', 'timestamp': '2026-01-22T00:36:23.024508', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '609fa140-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': 'bec350b93625d220f67187764bfdfb662d52483e01a416b28799acbfa178cae0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-sda', 'timestamp': '2026-01-22T00:36:23.024508', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '609fadac-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': '556412d5184994135bbd0ea799612187c01868881048e7300f5e54a87309a90a'}]}, 'timestamp': '2026-01-22 00:36:23.025128', '_unique_id': 'eaee0deb31514ae589dabf3094eeec28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.025 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.026 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.026 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b6a8a2c-f874-4b43-9693-8e2efe98df7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-vda', 'timestamp': '2026-01-22T00:36:23.026684', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '609ff62c-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.609841881, 'message_signature': 'e09e24e342cac0bf6056d7b35a458689e4ce49ded82fc28d5cdad4a576567e4f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'79afbcaf-6ef5-4db5-a05c-78bccd9f772a-sda', 'timestamp': '2026-01-22T00:36:23.026684', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '60a0028e-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.609841881, 'message_signature': '024d3306178b1bd73d60409ae59ad3622840f303ebc9efc3d4138a4dae0b2e53'}]}, 'timestamp': '2026-01-22 00:36:23.027301', '_unique_id': '2782616d3602407285e5ce3c02435b66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.028 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.029 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '153583c9-7551-4b3f-8dfc-923162545b14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap4a9813e7-d0', 'timestamp': '2026-01-22T00:36:23.028915', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap4a9813e7-d0', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:b8:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a9813e7-d0'}, 'message_id': '60a04dac-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '2aca59d5ac7389fcf5ac7f5855f2b66404ff7a43416e54c52f8f4745186dd180'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap90c66d2f-b7', 'timestamp': '2026-01-22T00:36:23.028915', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap90c66d2f-b7', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:06:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90c66d2f-b7'}, 'message_id': '60a059c8-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '0028a036856099c24b96221df15ee379533faae7cc8991472ed3c1448d5929a3'}]}, 'timestamp': '2026-01-22 00:36:23.029544', '_unique_id': '7f5023a6ad934ad38ce40aa67d4c1367'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.030 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.write.requests volume: 309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91aa5e58-5a05-42b6-976e-98eac254cfa5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 309, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-vda', 'timestamp': '2026-01-22T00:36:23.031122', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '60a0a234-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': '32eed2bb4b7e33335dc33e4d8a236923a1b70896321e86cacc89c96ed2176c61'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-sda', 'timestamp': '2026-01-22T00:36:23.031122', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '60a0a9e6-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': 'a19bc000ba680e4dd9c882b66e2426a60a6e4ceaebd74846375363fc9f1656f0'}]}, 'timestamp': '2026-01-22 00:36:23.031539', '_unique_id': 'fa6fc157f55347d192edf8b4495ace50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.032 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.032 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.read.requests volume: 1071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.032 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e628c126-3c35-4e92-b799-21c69d1c9747', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1071, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-vda', 'timestamp': '2026-01-22T00:36:23.032588', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '60a0dace-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': 'b852558b86235ef5bd3d0bf7805440fffd34eb3b2e5a0473bccff16dee1618de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': 
None, 'resource_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a-sda', 'timestamp': '2026-01-22T00:36:23.032588', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'instance-000000ad', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '60a0e474-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.639616127, 'message_signature': 'c48ea39bdacdc27f7658cc282d15f81eecb3a832f6b27074ed7d145cdf6d7db9'}]}, 'timestamp': '2026-01-22 00:36:23.033043', '_unique_id': 'a4f1fab483ee46999beb059cd96d8516'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.033 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.034 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.034 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.034 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfa45ad0-3ec9-4e67-b0cb-b6e26d1de11e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap4a9813e7-d0', 'timestamp': '2026-01-22T00:36:23.034181', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap4a9813e7-d0', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:b8:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a9813e7-d0'}, 'message_id': '60a11aa2-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '422cc8fae99cccbf8f81314dcc87e937e40600b259e448f3cafab0bf6ec8fb46'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap90c66d2f-b7', 'timestamp': '2026-01-22T00:36:23.034181', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap90c66d2f-b7', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:06:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90c66d2f-b7'}, 'message_id': '60a124ac-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '8c7c4b192c306696f1f911c51531b72b805ebeca8b4ae2c7cfc66a41232c2949'}]}, 'timestamp': '2026-01-22 00:36:23.034697', '_unique_id': 'db8a2364cdba4b4e93f8d47fa4697371'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.035 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '682d91dd-0b68-495b-b236-adf904623644', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap4a9813e7-d0', 'timestamp': '2026-01-22T00:36:23.035822', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap4a9813e7-d0', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:b8:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a9813e7-d0'}, 'message_id': '60a15a4e-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': 'e406592d646b523a16b2ae96973968420bc149841eff99326b38c4f67d521273'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap90c66d2f-b7', 'timestamp': '2026-01-22T00:36:23.035822', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap90c66d2f-b7', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:06:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90c66d2f-b7'}, 'message_id': '60a162a0-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '9371312a37bd2dda09bfcb1fd71312f96b9f6fdefca9f29f6e8454231c0a73d0'}]}, 'timestamp': '2026-01-22 00:36:23.036277', '_unique_id': 'a2cd7c98c92d4375aed0aab5f08322cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.036 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.037 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.037 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.037 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1327765747>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1327765747>]
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.037 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.037 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 DEBUG ceilometer.compute.pollsters [-] 79afbcaf-6ef5-4db5-a05c-78bccd9f772a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1730cd6a-52fd-497b-9673-956cf92e4241', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap4a9813e7-d0', 'timestamp': '2026-01-22T00:36:23.037788', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap4a9813e7-d0', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:b8:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a9813e7-d0'}, 'message_id': '60a1a850-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '2fc8a2fdefb2bf62bd9c15717c978124521700c0de829b0a14fa49793c40e61f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ad-79afbcaf-6ef5-4db5-a05c-78bccd9f772a-tap90c66d2f-b7', 'timestamp': '2026-01-22T00:36:23.037788', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1327765747', 'name': 'tap90c66d2f-b7', 'instance_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:06:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90c66d2f-b7'}, 'message_id': '60a1b412-f72a-11f0-a0a4-fa163e934844', 'monotonic_time': 6755.675629848, 'message_signature': '3e3a0a2cdf97ba769fa9c571494b42b63734fa0b0ca1beec739aa1fae356ae77'}]}, 'timestamp': '2026-01-22 00:36:23.038405', '_unique_id': '2d50c84614d04af197889449a2400557'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:36:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:36:23.038 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:36:25 compute-1 nova_compute[182713]: 2026-01-22 00:36:24.999 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:27 compute-1 nova_compute[182713]: 2026-01-22 00:36:27.493 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:36:29.391 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:30 compute-1 nova_compute[182713]: 2026-01-22 00:36:30.001 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:31 compute-1 podman[241185]: 2026-01-22 00:36:31.592521392 +0000 UTC m=+0.080874051 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:36:32 compute-1 nova_compute[182713]: 2026-01-22 00:36:32.515 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:34 compute-1 podman[241204]: 2026-01-22 00:36:34.595965756 +0000 UTC m=+0.085076474 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, version=9.6, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 22 00:36:35 compute-1 nova_compute[182713]: 2026-01-22 00:36:35.003 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:37 compute-1 nova_compute[182713]: 2026-01-22 00:36:37.518 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:40 compute-1 nova_compute[182713]: 2026-01-22 00:36:40.053 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:42 compute-1 nova_compute[182713]: 2026-01-22 00:36:42.522 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:43 compute-1 podman[241227]: 2026-01-22 00:36:43.603634391 +0000 UTC m=+0.091455973 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:36:43 compute-1 podman[241226]: 2026-01-22 00:36:43.606645416 +0000 UTC m=+0.102552542 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:36:45 compute-1 nova_compute[182713]: 2026-01-22 00:36:45.056 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:47 compute-1 nova_compute[182713]: 2026-01-22 00:36:47.526 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:49 compute-1 podman[241278]: 2026-01-22 00:36:49.592392417 +0000 UTC m=+0.072973713 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:36:49 compute-1 podman[241279]: 2026-01-22 00:36:49.623738902 +0000 UTC m=+0.097762212 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:36:50 compute-1 nova_compute[182713]: 2026-01-22 00:36:50.099 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:52 compute-1 nova_compute[182713]: 2026-01-22 00:36:52.528 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:55 compute-1 nova_compute[182713]: 2026-01-22 00:36:55.101 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:57 compute-1 nova_compute[182713]: 2026-01-22 00:36:57.532 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:00 compute-1 nova_compute[182713]: 2026-01-22 00:37:00.127 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:02 compute-1 nova_compute[182713]: 2026-01-22 00:37:02.536 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:02 compute-1 podman[241325]: 2026-01-22 00:37:02.576757972 +0000 UTC m=+0.060925855 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 00:37:02 compute-1 nova_compute[182713]: 2026-01-22 00:37:02.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:02 compute-1 nova_compute[182713]: 2026-01-22 00:37:02.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:02 compute-1 nova_compute[182713]: 2026-01-22 00:37:02.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:03.049 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:03.049 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:03.050 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:03.289 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:37:03 compute-1 nova_compute[182713]: 2026-01-22 00:37:03.289 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:03.290 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:37:03 compute-1 nova_compute[182713]: 2026-01-22 00:37:03.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:03 compute-1 nova_compute[182713]: 2026-01-22 00:37:03.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:37:05 compute-1 nova_compute[182713]: 2026-01-22 00:37:05.130 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:05 compute-1 podman[241349]: 2026-01-22 00:37:05.592739466 +0000 UTC m=+0.073368846 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.437 182717 DEBUG oslo_concurrency.lockutils [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.438 182717 DEBUG oslo_concurrency.lockutils [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.439 182717 DEBUG oslo_concurrency.lockutils [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.439 182717 DEBUG oslo_concurrency.lockutils [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.440 182717 DEBUG oslo_concurrency.lockutils [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.456 182717 INFO nova.compute.manager [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Terminating instance
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.468 182717 DEBUG nova.compute.manager [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:37:06 compute-1 kernel: tap4a9813e7-d0 (unregistering): left promiscuous mode
Jan 22 00:37:06 compute-1 NetworkManager[54952]: <info>  [1769042226.4924] device (tap4a9813e7-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:37:06 compute-1 ovn_controller[94841]: 2026-01-22T00:37:06Z|00708|binding|INFO|Releasing lport 4a9813e7-d0e1-490f-a8ca-3138225442a8 from this chassis (sb_readonly=0)
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.506 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 ovn_controller[94841]: 2026-01-22T00:37:06Z|00709|binding|INFO|Setting lport 4a9813e7-d0e1-490f-a8ca-3138225442a8 down in Southbound
Jan 22 00:37:06 compute-1 ovn_controller[94841]: 2026-01-22T00:37:06Z|00710|binding|INFO|Removing iface tap4a9813e7-d0 ovn-installed in OVS
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.510 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.518 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:b8:cc 10.100.0.4'], port_security=['fa:16:3e:7f:b8:cc 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a02fb4eb-eda5-4559-8b41-ffe0af33e841', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9c9c72d4-43bc-43b5-af16-0875792fba89, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=4a9813e7-d0e1-490f-a8ca-3138225442a8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.520 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 4a9813e7-d0e1-490f-a8ca-3138225442a8 in datapath 9d3a0d92-0a01-43e0-bbe5-a677082b8f1b unbound from our chassis
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.522 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.525 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c6bf98-4d3e-470f-b54c-7293c8eb78fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.526 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b namespace which is not needed anymore
Jan 22 00:37:06 compute-1 kernel: tap90c66d2f-b7 (unregistering): left promiscuous mode
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.537 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 NetworkManager[54952]: <info>  [1769042226.5507] device (tap90c66d2f-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:37:06 compute-1 ovn_controller[94841]: 2026-01-22T00:37:06Z|00711|binding|INFO|Releasing lport 90c66d2f-b763-45aa-9b34-a83255b9d97b from this chassis (sb_readonly=0)
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.565 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 ovn_controller[94841]: 2026-01-22T00:37:06Z|00712|binding|INFO|Setting lport 90c66d2f-b763-45aa-9b34-a83255b9d97b down in Southbound
Jan 22 00:37:06 compute-1 ovn_controller[94841]: 2026-01-22T00:37:06Z|00713|binding|INFO|Removing iface tap90c66d2f-b7 ovn-installed in OVS
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.573 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:06:a1 2001:db8:0:1:f816:3eff:fee5:6a1 2001:db8::f816:3eff:fee5:6a1'], port_security=['fa:16:3e:e5:06:a1 2001:db8:0:1:f816:3eff:fee5:6a1 2001:db8::f816:3eff:fee5:6a1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fee5:6a1/64 2001:db8::f816:3eff:fee5:6a1/64', 'neutron:device_id': '79afbcaf-6ef5-4db5-a05c-78bccd9f772a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a02fb4eb-eda5-4559-8b41-ffe0af33e841', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d6cac94-5c44-44de-a872-7bf42948d910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=90c66d2f-b763-45aa-9b34-a83255b9d97b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.589 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000ad.scope: Deactivated successfully.
Jan 22 00:37:06 compute-1 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000ad.scope: Consumed 16.377s CPU time.
Jan 22 00:37:06 compute-1 systemd-machined[153970]: Machine qemu-75-instance-000000ad terminated.
Jan 22 00:37:06 compute-1 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[240942]: [NOTICE]   (240946) : haproxy version is 2.8.14-c23fe91
Jan 22 00:37:06 compute-1 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[240942]: [NOTICE]   (240946) : path to executable is /usr/sbin/haproxy
Jan 22 00:37:06 compute-1 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[240942]: [WARNING]  (240946) : Exiting Master process...
Jan 22 00:37:06 compute-1 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[240942]: [ALERT]    (240946) : Current worker (240948) exited with code 143 (Terminated)
Jan 22 00:37:06 compute-1 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[240942]: [WARNING]  (240946) : All workers exited. Exiting... (0)
Jan 22 00:37:06 compute-1 systemd[1]: libpod-ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4.scope: Deactivated successfully.
Jan 22 00:37:06 compute-1 podman[241401]: 2026-01-22 00:37:06.669398335 +0000 UTC m=+0.050227529 container died ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:37:06 compute-1 NetworkManager[54952]: <info>  [1769042226.6934] manager: (tap4a9813e7-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/337)
Jan 22 00:37:06 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4-userdata-shm.mount: Deactivated successfully.
Jan 22 00:37:06 compute-1 systemd[1]: var-lib-containers-storage-overlay-108f8f9a78426956c8ce866ece23d16367d5b61a498e5a95c7a9be8ff948d08d-merged.mount: Deactivated successfully.
Jan 22 00:37:06 compute-1 NetworkManager[54952]: <info>  [1769042226.7061] manager: (tap90c66d2f-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/338)
Jan 22 00:37:06 compute-1 podman[241401]: 2026-01-22 00:37:06.707695487 +0000 UTC m=+0.088524701 container cleanup ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:37:06 compute-1 systemd[1]: libpod-conmon-ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4.scope: Deactivated successfully.
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.744 182717 INFO nova.virt.libvirt.driver [-] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Instance destroyed successfully.
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.745 182717 DEBUG nova.objects.instance [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid 79afbcaf-6ef5-4db5-a05c-78bccd9f772a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.756 182717 DEBUG nova.virt.libvirt.vif [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:35:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1327765747',display_name='tempest-TestGettingAddress-server-1327765747',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1327765747',id=173,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:35:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-dt2fdt7m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:35:51Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=79afbcaf-6ef5-4db5-a05c-78bccd9f772a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.757 182717 DEBUG nova.network.os_vif_util [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.757 182717 DEBUG nova.network.os_vif_util [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:b8:cc,bridge_name='br-int',has_traffic_filtering=True,id=4a9813e7-d0e1-490f-a8ca-3138225442a8,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a9813e7-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.758 182717 DEBUG os_vif [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:b8:cc,bridge_name='br-int',has_traffic_filtering=True,id=4a9813e7-d0e1-490f-a8ca-3138225442a8,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a9813e7-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.759 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.759 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a9813e7-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.763 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.766 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.770 182717 INFO os_vif [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:b8:cc,bridge_name='br-int',has_traffic_filtering=True,id=4a9813e7-d0e1-490f-a8ca-3138225442a8,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a9813e7-d0')
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.771 182717 DEBUG nova.virt.libvirt.vif [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:35:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1327765747',display_name='tempest-TestGettingAddress-server-1327765747',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1327765747',id=173,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:35:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-dt2fdt7m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:35:51Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=79afbcaf-6ef5-4db5-a05c-78bccd9f772a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.771 182717 DEBUG nova.network.os_vif_util [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.772 182717 DEBUG nova.network.os_vif_util [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e5:06:a1,bridge_name='br-int',has_traffic_filtering=True,id=90c66d2f-b763-45aa-9b34-a83255b9d97b,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c66d2f-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.772 182717 DEBUG os_vif [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:06:a1,bridge_name='br-int',has_traffic_filtering=True,id=90c66d2f-b763-45aa-9b34-a83255b9d97b,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c66d2f-b7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.773 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.774 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90c66d2f-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.775 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.776 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 podman[241448]: 2026-01-22 00:37:06.776704506 +0000 UTC m=+0.041238547 container remove ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.777 182717 INFO os_vif [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:06:a1,bridge_name='br-int',has_traffic_filtering=True,id=90c66d2f-b763-45aa-9b34-a83255b9d97b,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c66d2f-b7')
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.778 182717 INFO nova.virt.libvirt.driver [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Deleting instance files /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a_del
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.778 182717 INFO nova.virt.libvirt.driver [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Deletion of /var/lib/nova/instances/79afbcaf-6ef5-4db5-a05c-78bccd9f772a_del complete
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.781 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5012c68d-4586-45fc-9b6b-89d62d8bd12f]: (4, ('Thu Jan 22 12:37:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b (ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4)\nae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4\nThu Jan 22 12:37:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b (ae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4)\nae5a9895e723c928283f82eafcd3e44f36c2feaa2a2bd22fe9bd21eef22c39e4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.782 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[59e6bd42-0b4b-455a-80c8-bb7a17a520e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.783 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d3a0d92-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.784 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 kernel: tap9d3a0d92-00: left promiscuous mode
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.799 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.806 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9213511f-eaf6-461a-b28b-fd1746f79719]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.819 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[23fd7a86-da6d-442c-95f1-394e98ab0710]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.820 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d95e80-28cb-41bc-9649-abc6d7be3494]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.836 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[acf135e3-f08f-44e9-bfa8-56e4bd1a858a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672094, 'reachable_time': 21289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241471, 'error': None, 'target': 'ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.840 182717 INFO nova.compute.manager [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 00:37:06 compute-1 systemd[1]: run-netns-ovnmeta\x2d9d3a0d92\x2d0a01\x2d43e0\x2dbbe5\x2da677082b8f1b.mount: Deactivated successfully.
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.839 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.840 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[d011d44a-5ce5-46f6-82b7-c53eb4b621cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.841 182717 DEBUG oslo.service.loopingcall [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.841 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 90c66d2f-b763-45aa-9b34-a83255b9d97b in datapath 041654ff-0c5d-4cd2-89f6-0863dbbf44a8 unbound from our chassis
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.841 182717 DEBUG nova.compute.manager [-] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.842 182717 DEBUG nova.network.neutron [-] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.843 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 041654ff-0c5d-4cd2-89f6-0863dbbf44a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.844 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2874e2c0-7b7e-443c-8a7d-87a2526eaa56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:06 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:06.845 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8 namespace which is not needed anymore
Jan 22 00:37:06 compute-1 nova_compute[182713]: 2026-01-22 00:37:06.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:06 compute-1 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[241015]: [NOTICE]   (241019) : haproxy version is 2.8.14-c23fe91
Jan 22 00:37:06 compute-1 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[241015]: [NOTICE]   (241019) : path to executable is /usr/sbin/haproxy
Jan 22 00:37:06 compute-1 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[241015]: [WARNING]  (241019) : Exiting Master process...
Jan 22 00:37:06 compute-1 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[241015]: [ALERT]    (241019) : Current worker (241021) exited with code 143 (Terminated)
Jan 22 00:37:06 compute-1 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[241015]: [WARNING]  (241019) : All workers exited. Exiting... (0)
Jan 22 00:37:06 compute-1 systemd[1]: libpod-99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111.scope: Deactivated successfully.
Jan 22 00:37:06 compute-1 podman[241489]: 2026-01-22 00:37:06.982603873 +0000 UTC m=+0.044830129 container died 99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 00:37:07 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111-userdata-shm.mount: Deactivated successfully.
Jan 22 00:37:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-db5952f42460ce485b188c5c4ad16b60f53bc8fe7442b45d62e9240739811d26-merged.mount: Deactivated successfully.
Jan 22 00:37:07 compute-1 podman[241489]: 2026-01-22 00:37:07.013236615 +0000 UTC m=+0.075462871 container cleanup 99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:37:07 compute-1 systemd[1]: libpod-conmon-99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111.scope: Deactivated successfully.
Jan 22 00:37:07 compute-1 podman[241517]: 2026-01-22 00:37:07.077717031 +0000 UTC m=+0.042226378 container remove 99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:37:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:07.085 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ccb0b058-5a01-4614-9056-032edea08468]: (4, ('Thu Jan 22 12:37:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8 (99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111)\n99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111\nThu Jan 22 12:37:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8 (99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111)\n99f354808ca4484bdd27e6dd1d4a4452ad527b6cd3762f2a4d9739d1f7342111\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:07.086 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a5624de7-834f-4bfe-84cd-0fa57d8fabea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:07.087 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap041654ff-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.088 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:07 compute-1 kernel: tap041654ff-00: left promiscuous mode
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.100 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:07.104 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[916b9b91-e29b-499a-8eae-5414bbb5f43b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:07.122 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[411e0da7-f296-405a-bd66-594b55b0984a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:07.124 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d4877dd7-061c-47e3-8651-9cb4106211d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:07.139 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7126c40b-10ed-4f34-84bb-70faeedb6100]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672194, 'reachable_time': 20815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241532, 'error': None, 'target': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:07.140 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:37:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:07.140 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[6891261b-0bdd-4595-ba8b-0a735b9074b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.484 182717 DEBUG nova.compute.manager [req-820a9905-c416-4080-968a-69a0ffdf1477 req-49750503-a753-48d9-925c-99a7f8315e1e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-vif-unplugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.485 182717 DEBUG oslo_concurrency.lockutils [req-820a9905-c416-4080-968a-69a0ffdf1477 req-49750503-a753-48d9-925c-99a7f8315e1e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.485 182717 DEBUG oslo_concurrency.lockutils [req-820a9905-c416-4080-968a-69a0ffdf1477 req-49750503-a753-48d9-925c-99a7f8315e1e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.486 182717 DEBUG oslo_concurrency.lockutils [req-820a9905-c416-4080-968a-69a0ffdf1477 req-49750503-a753-48d9-925c-99a7f8315e1e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.486 182717 DEBUG nova.compute.manager [req-820a9905-c416-4080-968a-69a0ffdf1477 req-49750503-a753-48d9-925c-99a7f8315e1e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] No waiting events found dispatching network-vif-unplugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.486 182717 DEBUG nova.compute.manager [req-820a9905-c416-4080-968a-69a0ffdf1477 req-49750503-a753-48d9-925c-99a7f8315e1e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-vif-unplugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.674 182717 DEBUG nova.compute.manager [req-367ce36f-64ee-4683-9498-1fae1b8ddd0f req-82a0f35a-d21d-481a-b3d7-d2a7b63ef121 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-changed-4a9813e7-d0e1-490f-a8ca-3138225442a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.675 182717 DEBUG nova.compute.manager [req-367ce36f-64ee-4683-9498-1fae1b8ddd0f req-82a0f35a-d21d-481a-b3d7-d2a7b63ef121 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Refreshing instance network info cache due to event network-changed-4a9813e7-d0e1-490f-a8ca-3138225442a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.676 182717 DEBUG oslo_concurrency.lockutils [req-367ce36f-64ee-4683-9498-1fae1b8ddd0f req-82a0f35a-d21d-481a-b3d7-d2a7b63ef121 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.676 182717 DEBUG oslo_concurrency.lockutils [req-367ce36f-64ee-4683-9498-1fae1b8ddd0f req-82a0f35a-d21d-481a-b3d7-d2a7b63ef121 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.676 182717 DEBUG nova.network.neutron [req-367ce36f-64ee-4683-9498-1fae1b8ddd0f req-82a0f35a-d21d-481a-b3d7-d2a7b63ef121 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Refreshing network info cache for port 4a9813e7-d0e1-490f-a8ca-3138225442a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:37:07 compute-1 systemd[1]: run-netns-ovnmeta\x2d041654ff\x2d0c5d\x2d4cd2\x2d89f6\x2d0863dbbf44a8.mount: Deactivated successfully.
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.975 182717 DEBUG nova.compute.manager [req-556ef6fc-4030-4972-815f-d281133ce719 req-f73c781b-f3fb-4182-865b-66b767d75b07 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-vif-deleted-90c66d2f-b763-45aa-9b34-a83255b9d97b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.976 182717 INFO nova.compute.manager [req-556ef6fc-4030-4972-815f-d281133ce719 req-f73c781b-f3fb-4182-865b-66b767d75b07 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Neutron deleted interface 90c66d2f-b763-45aa-9b34-a83255b9d97b; detaching it from the instance and deleting it from the info cache
Jan 22 00:37:07 compute-1 nova_compute[182713]: 2026-01-22 00:37:07.976 182717 DEBUG nova.network.neutron [req-556ef6fc-4030-4972-815f-d281133ce719 req-f73c781b-f3fb-4182-865b-66b767d75b07 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updating instance_info_cache with network_info: [{"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:37:08 compute-1 nova_compute[182713]: 2026-01-22 00:37:08.002 182717 DEBUG nova.compute.manager [req-556ef6fc-4030-4972-815f-d281133ce719 req-f73c781b-f3fb-4182-865b-66b767d75b07 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Detach interface failed, port_id=90c66d2f-b763-45aa-9b34-a83255b9d97b, reason: Instance 79afbcaf-6ef5-4db5-a05c-78bccd9f772a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:37:08 compute-1 nova_compute[182713]: 2026-01-22 00:37:08.304 182717 DEBUG nova.network.neutron [-] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:37:08 compute-1 nova_compute[182713]: 2026-01-22 00:37:08.323 182717 INFO nova.compute.manager [-] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Took 1.48 seconds to deallocate network for instance.
Jan 22 00:37:08 compute-1 nova_compute[182713]: 2026-01-22 00:37:08.413 182717 DEBUG oslo_concurrency.lockutils [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:08 compute-1 nova_compute[182713]: 2026-01-22 00:37:08.414 182717 DEBUG oslo_concurrency.lockutils [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:08 compute-1 nova_compute[182713]: 2026-01-22 00:37:08.488 182717 DEBUG nova.compute.provider_tree [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:37:08 compute-1 nova_compute[182713]: 2026-01-22 00:37:08.514 182717 DEBUG nova.scheduler.client.report [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:37:08 compute-1 nova_compute[182713]: 2026-01-22 00:37:08.543 182717 DEBUG oslo_concurrency.lockutils [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:08 compute-1 nova_compute[182713]: 2026-01-22 00:37:08.577 182717 INFO nova.scheduler.client.report [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance 79afbcaf-6ef5-4db5-a05c-78bccd9f772a
Jan 22 00:37:08 compute-1 nova_compute[182713]: 2026-01-22 00:37:08.676 182717 DEBUG oslo_concurrency.lockutils [None req-540db09a-3ca4-4836-bcb0-6a3a92215bbb a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.190 182717 DEBUG nova.network.neutron [req-367ce36f-64ee-4683-9498-1fae1b8ddd0f req-82a0f35a-d21d-481a-b3d7-d2a7b63ef121 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updated VIF entry in instance network info cache for port 4a9813e7-d0e1-490f-a8ca-3138225442a8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.191 182717 DEBUG nova.network.neutron [req-367ce36f-64ee-4683-9498-1fae1b8ddd0f req-82a0f35a-d21d-481a-b3d7-d2a7b63ef121 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Updating instance_info_cache with network_info: [{"id": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "address": "fa:16:3e:7f:b8:cc", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a9813e7-d0", "ovs_interfaceid": "4a9813e7-d0e1-490f-a8ca-3138225442a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "address": "fa:16:3e:e5:06:a1", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fee5:6a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c66d2f-b7", "ovs_interfaceid": "90c66d2f-b763-45aa-9b34-a83255b9d97b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.215 182717 DEBUG oslo_concurrency.lockutils [req-367ce36f-64ee-4683-9498-1fae1b8ddd0f req-82a0f35a-d21d-481a-b3d7-d2a7b63ef121 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-79afbcaf-6ef5-4db5-a05c-78bccd9f772a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.620 182717 DEBUG nova.compute.manager [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-vif-plugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.621 182717 DEBUG oslo_concurrency.lockutils [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.621 182717 DEBUG oslo_concurrency.lockutils [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.621 182717 DEBUG oslo_concurrency.lockutils [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.622 182717 DEBUG nova.compute.manager [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] No waiting events found dispatching network-vif-plugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.622 182717 WARNING nova.compute.manager [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received unexpected event network-vif-plugged-4a9813e7-d0e1-490f-a8ca-3138225442a8 for instance with vm_state deleted and task_state None.
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.622 182717 DEBUG nova.compute.manager [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-vif-plugged-90c66d2f-b763-45aa-9b34-a83255b9d97b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.622 182717 DEBUG oslo_concurrency.lockutils [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.623 182717 DEBUG oslo_concurrency.lockutils [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.623 182717 DEBUG oslo_concurrency.lockutils [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "79afbcaf-6ef5-4db5-a05c-78bccd9f772a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.623 182717 DEBUG nova.compute.manager [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] No waiting events found dispatching network-vif-plugged-90c66d2f-b763-45aa-9b34-a83255b9d97b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:37:09 compute-1 nova_compute[182713]: 2026-01-22 00:37:09.624 182717 WARNING nova.compute.manager [req-9f54d7aa-50d9-4074-b1f4-99c391a1cec1 req-dc3adaf0-fb17-48cb-9d19-dfe73c62517c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received unexpected event network-vif-plugged-90c66d2f-b763-45aa-9b34-a83255b9d97b for instance with vm_state deleted and task_state None.
Jan 22 00:37:10 compute-1 nova_compute[182713]: 2026-01-22 00:37:10.123 182717 DEBUG nova.compute.manager [req-d8ace9b6-efb5-401a-b69e-3b71f8708ba6 req-3ea7fda0-386f-4020-99c6-8e1df8a78177 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Received event network-vif-deleted-4a9813e7-d0e1-490f-a8ca-3138225442a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:10 compute-1 nova_compute[182713]: 2026-01-22 00:37:10.131 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:10 compute-1 nova_compute[182713]: 2026-01-22 00:37:10.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:11 compute-1 sshd-session[241533]: Invalid user solv from 92.118.39.95 port 46180
Jan 22 00:37:11 compute-1 sshd-session[241533]: Connection closed by invalid user solv 92.118.39.95 port 46180 [preauth]
Jan 22 00:37:11 compute-1 nova_compute[182713]: 2026-01-22 00:37:11.777 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:11 compute-1 nova_compute[182713]: 2026-01-22 00:37:11.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:11 compute-1 nova_compute[182713]: 2026-01-22 00:37:11.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:11 compute-1 nova_compute[182713]: 2026-01-22 00:37:11.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:11 compute-1 nova_compute[182713]: 2026-01-22 00:37:11.878 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:11 compute-1 nova_compute[182713]: 2026-01-22 00:37:11.878 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:11 compute-1 nova_compute[182713]: 2026-01-22 00:37:11.878 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:11 compute-1 nova_compute[182713]: 2026-01-22 00:37:11.879 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:37:12 compute-1 nova_compute[182713]: 2026-01-22 00:37:12.045 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:37:12 compute-1 nova_compute[182713]: 2026-01-22 00:37:12.047 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5707MB free_disk=73.19309616088867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:37:12 compute-1 nova_compute[182713]: 2026-01-22 00:37:12.047 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:12 compute-1 nova_compute[182713]: 2026-01-22 00:37:12.048 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:12 compute-1 nova_compute[182713]: 2026-01-22 00:37:12.104 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:37:12 compute-1 nova_compute[182713]: 2026-01-22 00:37:12.105 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:37:12 compute-1 nova_compute[182713]: 2026-01-22 00:37:12.131 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:37:12 compute-1 nova_compute[182713]: 2026-01-22 00:37:12.152 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:37:12 compute-1 nova_compute[182713]: 2026-01-22 00:37:12.181 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:37:12 compute-1 nova_compute[182713]: 2026-01-22 00:37:12.181 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:12.292 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:37:14 compute-1 nova_compute[182713]: 2026-01-22 00:37:14.181 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:14 compute-1 nova_compute[182713]: 2026-01-22 00:37:14.182 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:37:14 compute-1 nova_compute[182713]: 2026-01-22 00:37:14.182 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:37:14 compute-1 nova_compute[182713]: 2026-01-22 00:37:14.198 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:37:14 compute-1 nova_compute[182713]: 2026-01-22 00:37:14.442 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:14 compute-1 nova_compute[182713]: 2026-01-22 00:37:14.556 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:14 compute-1 podman[241538]: 2026-01-22 00:37:14.638817494 +0000 UTC m=+0.058523320 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:37:14 compute-1 podman[241536]: 2026-01-22 00:37:14.691978013 +0000 UTC m=+0.114734035 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:37:15 compute-1 nova_compute[182713]: 2026-01-22 00:37:15.133 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:16 compute-1 nova_compute[182713]: 2026-01-22 00:37:16.779 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:20 compute-1 nova_compute[182713]: 2026-01-22 00:37:20.134 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:20 compute-1 podman[241587]: 2026-01-22 00:37:20.559281174 +0000 UTC m=+0.047992018 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:37:20 compute-1 podman[241586]: 2026-01-22 00:37:20.572582382 +0000 UTC m=+0.056995432 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:37:21 compute-1 nova_compute[182713]: 2026-01-22 00:37:21.743 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042226.741914, 79afbcaf-6ef5-4db5-a05c-78bccd9f772a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:37:21 compute-1 nova_compute[182713]: 2026-01-22 00:37:21.743 182717 INFO nova.compute.manager [-] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] VM Stopped (Lifecycle Event)
Jan 22 00:37:21 compute-1 nova_compute[182713]: 2026-01-22 00:37:21.783 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:22 compute-1 nova_compute[182713]: 2026-01-22 00:37:22.166 182717 DEBUG nova.compute.manager [None req-c5feb399-bf37-4487-978e-acbb6289220d - - - - - -] [instance: 79afbcaf-6ef5-4db5-a05c-78bccd9f772a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:37:25 compute-1 nova_compute[182713]: 2026-01-22 00:37:25.135 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:26 compute-1 nova_compute[182713]: 2026-01-22 00:37:26.786 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:30 compute-1 nova_compute[182713]: 2026-01-22 00:37:30.137 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:31 compute-1 nova_compute[182713]: 2026-01-22 00:37:31.789 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:33 compute-1 podman[241626]: 2026-01-22 00:37:33.574385512 +0000 UTC m=+0.063290372 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:37:35 compute-1 nova_compute[182713]: 2026-01-22 00:37:35.166 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:36 compute-1 podman[241646]: 2026-01-22 00:37:36.586627801 +0000 UTC m=+0.078075791 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41)
Jan 22 00:37:36 compute-1 nova_compute[182713]: 2026-01-22 00:37:36.792 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:39.727 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:ff:76 2001:db8:0:1:f816:3eff:fe5f:ff76 2001:db8::f816:3eff:fe5f:ff76'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe5f:ff76/64 2001:db8::f816:3eff:fe5f:ff76/64', 'neutron:device_id': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eaa35b5e-130a-4933-a219-b6429231aa8c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=63ad2747-135a-46c8-90ca-ec1def31a1c2) old=Port_Binding(mac=['fa:16:3e:5f:ff:76 2001:db8::f816:3eff:fe5f:ff76'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe5f:ff76/64', 'neutron:device_id': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:37:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:39.728 104184 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 63ad2747-135a-46c8-90ca-ec1def31a1c2 in datapath 01fa8e13-9f62-4b06-88db-79f2e6ca65b8 updated
Jan 22 00:37:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:39.729 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01fa8e13-9f62-4b06-88db-79f2e6ca65b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:37:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:37:39.730 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[655433bf-6434-479f-b1c9-740216ca7376]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:40 compute-1 nova_compute[182713]: 2026-01-22 00:37:40.212 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:41 compute-1 nova_compute[182713]: 2026-01-22 00:37:41.795 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:45 compute-1 nova_compute[182713]: 2026-01-22 00:37:45.215 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:45 compute-1 podman[241669]: 2026-01-22 00:37:45.575831251 +0000 UTC m=+0.054862122 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:37:45 compute-1 podman[241668]: 2026-01-22 00:37:45.601819345 +0000 UTC m=+0.083273191 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:37:46 compute-1 nova_compute[182713]: 2026-01-22 00:37:46.797 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:50 compute-1 nova_compute[182713]: 2026-01-22 00:37:50.217 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:51 compute-1 podman[241717]: 2026-01-22 00:37:51.566627062 +0000 UTC m=+0.059482654 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:37:51 compute-1 podman[241716]: 2026-01-22 00:37:51.57656396 +0000 UTC m=+0.070010270 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 00:37:51 compute-1 nova_compute[182713]: 2026-01-22 00:37:51.799 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:55 compute-1 nova_compute[182713]: 2026-01-22 00:37:55.251 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:56 compute-1 nova_compute[182713]: 2026-01-22 00:37:56.803 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:00 compute-1 nova_compute[182713]: 2026-01-22 00:38:00.253 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:01 compute-1 nova_compute[182713]: 2026-01-22 00:38:01.807 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:01 compute-1 nova_compute[182713]: 2026-01-22 00:38:01.869 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:02 compute-1 ovn_controller[94841]: 2026-01-22T00:38:02Z|00714|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 22 00:38:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:03.050 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:03.051 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:03.051 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:03 compute-1 nova_compute[182713]: 2026-01-22 00:38:03.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:03 compute-1 nova_compute[182713]: 2026-01-22 00:38:03.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:04 compute-1 podman[241758]: 2026-01-22 00:38:04.606078386 +0000 UTC m=+0.106540953 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 00:38:05 compute-1 nova_compute[182713]: 2026-01-22 00:38:05.283 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:05.301 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:38:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:05.302 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:38:05 compute-1 nova_compute[182713]: 2026-01-22 00:38:05.303 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:05 compute-1 nova_compute[182713]: 2026-01-22 00:38:05.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:05 compute-1 nova_compute[182713]: 2026-01-22 00:38:05.868 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:05 compute-1 nova_compute[182713]: 2026-01-22 00:38:05.868 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:38:06 compute-1 nova_compute[182713]: 2026-01-22 00:38:06.810 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:07 compute-1 podman[241778]: 2026-01-22 00:38:07.586364354 +0000 UTC m=+0.067231034 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:38:07 compute-1 nova_compute[182713]: 2026-01-22 00:38:07.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:07 compute-1 nova_compute[182713]: 2026-01-22 00:38:07.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:38:07 compute-1 nova_compute[182713]: 2026-01-22 00:38:07.878 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:38:08 compute-1 nova_compute[182713]: 2026-01-22 00:38:08.878 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:10.305 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:10 compute-1 nova_compute[182713]: 2026-01-22 00:38:10.330 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:11 compute-1 nova_compute[182713]: 2026-01-22 00:38:11.813 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:11 compute-1 nova_compute[182713]: 2026-01-22 00:38:11.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:11 compute-1 nova_compute[182713]: 2026-01-22 00:38:11.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:11 compute-1 nova_compute[182713]: 2026-01-22 00:38:11.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:11 compute-1 nova_compute[182713]: 2026-01-22 00:38:11.890 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:11 compute-1 nova_compute[182713]: 2026-01-22 00:38:11.891 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:11 compute-1 nova_compute[182713]: 2026-01-22 00:38:11.892 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:11 compute-1 nova_compute[182713]: 2026-01-22 00:38:11.892 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:38:12 compute-1 nova_compute[182713]: 2026-01-22 00:38:12.068 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:38:12 compute-1 nova_compute[182713]: 2026-01-22 00:38:12.069 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5706MB free_disk=73.193359375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:38:12 compute-1 nova_compute[182713]: 2026-01-22 00:38:12.070 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:12 compute-1 nova_compute[182713]: 2026-01-22 00:38:12.070 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:12 compute-1 nova_compute[182713]: 2026-01-22 00:38:12.133 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:38:12 compute-1 nova_compute[182713]: 2026-01-22 00:38:12.133 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:38:12 compute-1 nova_compute[182713]: 2026-01-22 00:38:12.154 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:38:12 compute-1 nova_compute[182713]: 2026-01-22 00:38:12.169 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:38:12 compute-1 nova_compute[182713]: 2026-01-22 00:38:12.170 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:38:12 compute-1 nova_compute[182713]: 2026-01-22 00:38:12.171 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:12 compute-1 nova_compute[182713]: 2026-01-22 00:38:12.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:15 compute-1 nova_compute[182713]: 2026-01-22 00:38:15.333 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:15 compute-1 nova_compute[182713]: 2026-01-22 00:38:15.872 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:15 compute-1 nova_compute[182713]: 2026-01-22 00:38:15.873 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:38:15 compute-1 nova_compute[182713]: 2026-01-22 00:38:15.873 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:38:15 compute-1 nova_compute[182713]: 2026-01-22 00:38:15.892 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:38:16 compute-1 podman[241801]: 2026-01-22 00:38:16.586145722 +0000 UTC m=+0.065761899 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:38:16 compute-1 podman[241800]: 2026-01-22 00:38:16.613274842 +0000 UTC m=+0.106199141 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:38:16 compute-1 nova_compute[182713]: 2026-01-22 00:38:16.816 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:17 compute-1 nova_compute[182713]: 2026-01-22 00:38:17.863 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:17 compute-1 nova_compute[182713]: 2026-01-22 00:38:17.863 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:38:20 compute-1 nova_compute[182713]: 2026-01-22 00:38:20.374 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:21 compute-1 nova_compute[182713]: 2026-01-22 00:38:21.820 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:22 compute-1 podman[241850]: 2026-01-22 00:38:22.554037063 +0000 UTC m=+0.048748491 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 00:38:22 compute-1 podman[241851]: 2026-01-22 00:38:22.584347102 +0000 UTC m=+0.067883664 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.892 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.893 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.893 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.893 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.893 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.896 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:38:22.896 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:25 compute-1 nova_compute[182713]: 2026-01-22 00:38:25.376 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:25 compute-1 nova_compute[182713]: 2026-01-22 00:38:25.678 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:25 compute-1 nova_compute[182713]: 2026-01-22 00:38:25.678 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:25 compute-1 nova_compute[182713]: 2026-01-22 00:38:25.696 182717 DEBUG nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:38:25 compute-1 nova_compute[182713]: 2026-01-22 00:38:25.808 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:25 compute-1 nova_compute[182713]: 2026-01-22 00:38:25.809 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:25 compute-1 nova_compute[182713]: 2026-01-22 00:38:25.819 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:38:25 compute-1 nova_compute[182713]: 2026-01-22 00:38:25.819 182717 INFO nova.compute.claims [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:38:25 compute-1 nova_compute[182713]: 2026-01-22 00:38:25.982 182717 DEBUG nova.compute.provider_tree [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:38:25 compute-1 nova_compute[182713]: 2026-01-22 00:38:25.998 182717 DEBUG nova.scheduler.client.report [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.027 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.028 182717 DEBUG nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.100 182717 DEBUG nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.100 182717 DEBUG nova.network.neutron [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.121 182717 INFO nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.144 182717 DEBUG nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.262 182717 DEBUG nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.264 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.265 182717 INFO nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Creating image(s)
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.266 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.266 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.267 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.288 182717 DEBUG nova.policy [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.291 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.356 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.357 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.358 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.381 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.442 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.444 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.572 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk 1073741824" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.573 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.574 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.642 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.644 182717 DEBUG nova.virt.disk.api [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.645 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.713 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.714 182717 DEBUG nova.virt.disk.api [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.715 182717 DEBUG nova.objects.instance [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 34cfe24d-2754-4083-975b-f9775fd743b8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.735 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.735 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Ensure instance console log exists: /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.736 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.736 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.737 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:26 compute-1 nova_compute[182713]: 2026-01-22 00:38:26.823 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:27 compute-1 nova_compute[182713]: 2026-01-22 00:38:27.189 182717 DEBUG nova.network.neutron [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Successfully created port: 6e500b56-ad6b-4af0-9a28-7bd95fe3781d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:38:27 compute-1 nova_compute[182713]: 2026-01-22 00:38:27.759 182717 DEBUG nova.network.neutron [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Successfully created port: a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:38:28 compute-1 nova_compute[182713]: 2026-01-22 00:38:28.831 182717 DEBUG nova.network.neutron [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Successfully updated port: 6e500b56-ad6b-4af0-9a28-7bd95fe3781d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:38:28 compute-1 nova_compute[182713]: 2026-01-22 00:38:28.971 182717 DEBUG nova.compute.manager [req-fe9dcb3d-ba78-4713-912b-00e370a614f5 req-44635ce8-7191-4ae9-a7bc-79a0ae499fa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-changed-6e500b56-ad6b-4af0-9a28-7bd95fe3781d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:38:28 compute-1 nova_compute[182713]: 2026-01-22 00:38:28.971 182717 DEBUG nova.compute.manager [req-fe9dcb3d-ba78-4713-912b-00e370a614f5 req-44635ce8-7191-4ae9-a7bc-79a0ae499fa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Refreshing instance network info cache due to event network-changed-6e500b56-ad6b-4af0-9a28-7bd95fe3781d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:38:28 compute-1 nova_compute[182713]: 2026-01-22 00:38:28.971 182717 DEBUG oslo_concurrency.lockutils [req-fe9dcb3d-ba78-4713-912b-00e370a614f5 req-44635ce8-7191-4ae9-a7bc-79a0ae499fa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:38:28 compute-1 nova_compute[182713]: 2026-01-22 00:38:28.972 182717 DEBUG oslo_concurrency.lockutils [req-fe9dcb3d-ba78-4713-912b-00e370a614f5 req-44635ce8-7191-4ae9-a7bc-79a0ae499fa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:38:28 compute-1 nova_compute[182713]: 2026-01-22 00:38:28.972 182717 DEBUG nova.network.neutron [req-fe9dcb3d-ba78-4713-912b-00e370a614f5 req-44635ce8-7191-4ae9-a7bc-79a0ae499fa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Refreshing network info cache for port 6e500b56-ad6b-4af0-9a28-7bd95fe3781d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:38:29 compute-1 nova_compute[182713]: 2026-01-22 00:38:29.164 182717 DEBUG nova.network.neutron [req-fe9dcb3d-ba78-4713-912b-00e370a614f5 req-44635ce8-7191-4ae9-a7bc-79a0ae499fa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:38:29 compute-1 nova_compute[182713]: 2026-01-22 00:38:29.484 182717 DEBUG nova.network.neutron [req-fe9dcb3d-ba78-4713-912b-00e370a614f5 req-44635ce8-7191-4ae9-a7bc-79a0ae499fa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:38:29 compute-1 nova_compute[182713]: 2026-01-22 00:38:29.502 182717 DEBUG oslo_concurrency.lockutils [req-fe9dcb3d-ba78-4713-912b-00e370a614f5 req-44635ce8-7191-4ae9-a7bc-79a0ae499fa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:38:29 compute-1 nova_compute[182713]: 2026-01-22 00:38:29.634 182717 DEBUG nova.network.neutron [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Successfully updated port: a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:38:29 compute-1 nova_compute[182713]: 2026-01-22 00:38:29.651 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:38:29 compute-1 nova_compute[182713]: 2026-01-22 00:38:29.651 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:38:29 compute-1 nova_compute[182713]: 2026-01-22 00:38:29.651 182717 DEBUG nova.network.neutron [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:38:29 compute-1 nova_compute[182713]: 2026-01-22 00:38:29.793 182717 DEBUG nova.network.neutron [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:38:30 compute-1 nova_compute[182713]: 2026-01-22 00:38:30.378 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:31 compute-1 nova_compute[182713]: 2026-01-22 00:38:31.078 182717 DEBUG nova.compute.manager [req-dd017a30-e732-4324-83c3-8c65aa440c69 req-836fe50f-8732-45dc-9e3c-4e267f427d04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-changed-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:38:31 compute-1 nova_compute[182713]: 2026-01-22 00:38:31.079 182717 DEBUG nova.compute.manager [req-dd017a30-e732-4324-83c3-8c65aa440c69 req-836fe50f-8732-45dc-9e3c-4e267f427d04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Refreshing instance network info cache due to event network-changed-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:38:31 compute-1 nova_compute[182713]: 2026-01-22 00:38:31.079 182717 DEBUG oslo_concurrency.lockutils [req-dd017a30-e732-4324-83c3-8c65aa440c69 req-836fe50f-8732-45dc-9e3c-4e267f427d04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:38:31 compute-1 nova_compute[182713]: 2026-01-22 00:38:31.827 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:35 compute-1 nova_compute[182713]: 2026-01-22 00:38:35.614 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:35 compute-1 podman[241910]: 2026-01-22 00:38:35.698443239 +0000 UTC m=+0.052256920 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.172 182717 DEBUG nova.network.neutron [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Updating instance_info_cache with network_info: [{"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.194 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.194 182717 DEBUG nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Instance network_info: |[{"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.195 182717 DEBUG oslo_concurrency.lockutils [req-dd017a30-e732-4324-83c3-8c65aa440c69 req-836fe50f-8732-45dc-9e3c-4e267f427d04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.195 182717 DEBUG nova.network.neutron [req-dd017a30-e732-4324-83c3-8c65aa440c69 req-836fe50f-8732-45dc-9e3c-4e267f427d04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Refreshing network info cache for port a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.199 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Start _get_guest_xml network_info=[{"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.205 182717 WARNING nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.218 182717 DEBUG nova.virt.libvirt.host [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.219 182717 DEBUG nova.virt.libvirt.host [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.224 182717 DEBUG nova.virt.libvirt.host [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.225 182717 DEBUG nova.virt.libvirt.host [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.227 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.228 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.229 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.230 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.230 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.231 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.231 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.232 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.232 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.233 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.234 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.235 182717 DEBUG nova.virt.hardware [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.239 182717 DEBUG nova.virt.libvirt.vif [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1282444667',display_name='tempest-TestGettingAddress-server-1282444667',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1282444667',id=176,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jd0666ok',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:38:26Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=34cfe24d-2754-4083-975b-f9775fd743b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.240 182717 DEBUG nova.network.os_vif_util [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.241 182717 DEBUG nova.network.os_vif_util [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:b9:32,bridge_name='br-int',has_traffic_filtering=True,id=6e500b56-ad6b-4af0-9a28-7bd95fe3781d,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e500b56-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.242 182717 DEBUG nova.virt.libvirt.vif [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1282444667',display_name='tempest-TestGettingAddress-server-1282444667',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1282444667',id=176,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jd0666ok',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:38:26Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=34cfe24d-2754-4083-975b-f9775fd743b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.242 182717 DEBUG nova.network.os_vif_util [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.243 182717 DEBUG nova.network.os_vif_util [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:7d:b6,bridge_name='br-int',has_traffic_filtering=True,id=a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3b2b7b3-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.244 182717 DEBUG nova.objects.instance [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 34cfe24d-2754-4083-975b-f9775fd743b8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.257 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:38:36 compute-1 nova_compute[182713]:   <uuid>34cfe24d-2754-4083-975b-f9775fd743b8</uuid>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   <name>instance-000000b0</name>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <nova:name>tempest-TestGettingAddress-server-1282444667</nova:name>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:38:36</nova:creationTime>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:38:36 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:38:36 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:38:36 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:38:36 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:38:36 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:38:36 compute-1 nova_compute[182713]:         <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 22 00:38:36 compute-1 nova_compute[182713]:         <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:38:36 compute-1 nova_compute[182713]:         <nova:port uuid="6e500b56-ad6b-4af0-9a28-7bd95fe3781d">
Jan 22 00:38:36 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:38:36 compute-1 nova_compute[182713]:         <nova:port uuid="a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c">
Jan 22 00:38:36 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe5f:7db6" ipVersion="6"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe5f:7db6" ipVersion="6"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <system>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <entry name="serial">34cfe24d-2754-4083-975b-f9775fd743b8</entry>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <entry name="uuid">34cfe24d-2754-4083-975b-f9775fd743b8</entry>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     </system>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   <os>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   </os>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   <features>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   </features>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk.config"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:19:b9:32"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <target dev="tap6e500b56-ad"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:5f:7d:b6"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <target dev="tapa3b2b7b3-2c"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/console.log" append="off"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <video>
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     </video>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:38:36 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:38:36 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:38:36 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:38:36 compute-1 nova_compute[182713]: </domain>
Jan 22 00:38:36 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.258 182717 DEBUG nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Preparing to wait for external event network-vif-plugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.259 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.259 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.260 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.260 182717 DEBUG nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Preparing to wait for external event network-vif-plugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.260 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.260 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.261 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.262 182717 DEBUG nova.virt.libvirt.vif [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1282444667',display_name='tempest-TestGettingAddress-server-1282444667',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1282444667',id=176,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jd0666ok',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:38:26Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=34cfe24d-2754-4083-975b-f9775fd743b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.262 182717 DEBUG nova.network.os_vif_util [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.263 182717 DEBUG nova.network.os_vif_util [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:b9:32,bridge_name='br-int',has_traffic_filtering=True,id=6e500b56-ad6b-4af0-9a28-7bd95fe3781d,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e500b56-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.263 182717 DEBUG os_vif [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:b9:32,bridge_name='br-int',has_traffic_filtering=True,id=6e500b56-ad6b-4af0-9a28-7bd95fe3781d,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e500b56-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.264 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.265 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.265 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.274 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.275 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e500b56-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.276 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e500b56-ad, col_values=(('external_ids', {'iface-id': '6e500b56-ad6b-4af0-9a28-7bd95fe3781d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:b9:32', 'vm-uuid': '34cfe24d-2754-4083-975b-f9775fd743b8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:36 compute-1 NetworkManager[54952]: <info>  [1769042316.2796] manager: (tap6e500b56-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.278 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.282 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.288 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.289 182717 INFO os_vif [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:b9:32,bridge_name='br-int',has_traffic_filtering=True,id=6e500b56-ad6b-4af0-9a28-7bd95fe3781d,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e500b56-ad')
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.290 182717 DEBUG nova.virt.libvirt.vif [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1282444667',display_name='tempest-TestGettingAddress-server-1282444667',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1282444667',id=176,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jd0666ok',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:38:26Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=34cfe24d-2754-4083-975b-f9775fd743b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.291 182717 DEBUG nova.network.os_vif_util [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.292 182717 DEBUG nova.network.os_vif_util [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:7d:b6,bridge_name='br-int',has_traffic_filtering=True,id=a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3b2b7b3-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.293 182717 DEBUG os_vif [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:7d:b6,bridge_name='br-int',has_traffic_filtering=True,id=a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3b2b7b3-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.293 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.294 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.294 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.297 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.297 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3b2b7b3-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.298 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa3b2b7b3-2c, col_values=(('external_ids', {'iface-id': 'a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:7d:b6', 'vm-uuid': '34cfe24d-2754-4083-975b-f9775fd743b8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.300 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:36 compute-1 NetworkManager[54952]: <info>  [1769042316.3014] manager: (tapa3b2b7b3-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.304 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.311 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.313 182717 INFO os_vif [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:7d:b6,bridge_name='br-int',has_traffic_filtering=True,id=a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3b2b7b3-2c')
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.396 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.397 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.397 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:19:b9:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.397 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:5f:7d:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:38:36 compute-1 nova_compute[182713]: 2026-01-22 00:38:36.397 182717 INFO nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Using config drive
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.230 182717 INFO nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Creating config drive at /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk.config
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.234 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp82j9tb0i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.363 182717 DEBUG oslo_concurrency.processutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp82j9tb0i" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:38:37 compute-1 kernel: tap6e500b56-ad: entered promiscuous mode
Jan 22 00:38:37 compute-1 NetworkManager[54952]: <info>  [1769042317.4373] manager: (tap6e500b56-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/341)
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.440 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 ovn_controller[94841]: 2026-01-22T00:38:37Z|00715|binding|INFO|Claiming lport 6e500b56-ad6b-4af0-9a28-7bd95fe3781d for this chassis.
Jan 22 00:38:37 compute-1 ovn_controller[94841]: 2026-01-22T00:38:37Z|00716|binding|INFO|6e500b56-ad6b-4af0-9a28-7bd95fe3781d: Claiming fa:16:3e:19:b9:32 10.100.0.6
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.451 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 NetworkManager[54952]: <info>  [1769042317.4538] manager: (tapa3b2b7b3-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/342)
Jan 22 00:38:37 compute-1 kernel: tapa3b2b7b3-2c: entered promiscuous mode
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.456 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.467 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 ovn_controller[94841]: 2026-01-22T00:38:37Z|00717|if_status|INFO|Not updating pb chassis for a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c now as sb is readonly
Jan 22 00:38:37 compute-1 NetworkManager[54952]: <info>  [1769042317.4761] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.473 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 NetworkManager[54952]: <info>  [1769042317.4782] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/344)
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.479 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:b9:32 10.100.0.6'], port_security=['fa:16:3e:19:b9:32 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '34cfe24d-2754-4083-975b-f9775fd743b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-96576974-adfc-492e-9141-63dd99e1cb25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4381a94a-5b04-4450-b603-573605756783', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=700861ed-e604-4e52-bc1a-65ca23f63d88, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=6e500b56-ad6b-4af0-9a28-7bd95fe3781d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.480 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 6e500b56-ad6b-4af0-9a28-7bd95fe3781d in datapath 96576974-adfc-492e-9141-63dd99e1cb25 bound to our chassis
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.482 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 96576974-adfc-492e-9141-63dd99e1cb25
Jan 22 00:38:37 compute-1 systemd-udevd[241958]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:38:37 compute-1 systemd-udevd[241957]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.494 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2167e2d2-df73-4461-b051-4b97e38e8c95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.495 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap96576974-a1 in ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:38:37 compute-1 NetworkManager[54952]: <info>  [1769042317.4981] device (tap6e500b56-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:38:37 compute-1 NetworkManager[54952]: <info>  [1769042317.4986] device (tap6e500b56-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.497 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap96576974-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.497 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f7156bf6-06ef-44d7-a7b5-68fb83dad4d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.498 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8486d05c-5bcd-4630-8522-5103949b5f26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 systemd-machined[153970]: New machine qemu-76-instance-000000b0.
Jan 22 00:38:37 compute-1 NetworkManager[54952]: <info>  [1769042317.5011] device (tapa3b2b7b3-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:38:37 compute-1 NetworkManager[54952]: <info>  [1769042317.5017] device (tapa3b2b7b3-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.512 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[77df5e0f-76a2-4679-a016-a762b93a608d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 systemd[1]: Started Virtual Machine qemu-76-instance-000000b0.
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.541 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb7dde2-a8cf-4cba-9e38-aa3bb70e3214]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.566 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 ovn_controller[94841]: 2026-01-22T00:38:37Z|00718|binding|INFO|Claiming lport a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c for this chassis.
Jan 22 00:38:37 compute-1 ovn_controller[94841]: 2026-01-22T00:38:37Z|00719|binding|INFO|a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c: Claiming fa:16:3e:5f:7d:b6 2001:db8:0:1:f816:3eff:fe5f:7db6 2001:db8::f816:3eff:fe5f:7db6
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.575 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[7b1cdbf5-e08b-48c9-b76d-9fe931d2cd57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 ovn_controller[94841]: 2026-01-22T00:38:37Z|00720|binding|INFO|Setting lport 6e500b56-ad6b-4af0-9a28-7bd95fe3781d ovn-installed in OVS
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.579 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 ovn_controller[94841]: 2026-01-22T00:38:37Z|00721|binding|INFO|Setting lport 6e500b56-ad6b-4af0-9a28-7bd95fe3781d up in Southbound
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.589 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:7d:b6 2001:db8:0:1:f816:3eff:fe5f:7db6 2001:db8::f816:3eff:fe5f:7db6'], port_security=['fa:16:3e:5f:7d:b6 2001:db8:0:1:f816:3eff:fe5f:7db6 2001:db8::f816:3eff:fe5f:7db6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe5f:7db6/64 2001:db8::f816:3eff:fe5f:7db6/64', 'neutron:device_id': '34cfe24d-2754-4083-975b-f9775fd743b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4381a94a-5b04-4450-b603-573605756783', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eaa35b5e-130a-4933-a219-b6429231aa8c, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:38:37 compute-1 NetworkManager[54952]: <info>  [1769042317.5989] manager: (tap96576974-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/345)
Jan 22 00:38:37 compute-1 ovn_controller[94841]: 2026-01-22T00:38:37Z|00722|binding|INFO|Setting lport a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c ovn-installed in OVS
Jan 22 00:38:37 compute-1 ovn_controller[94841]: 2026-01-22T00:38:37Z|00723|binding|INFO|Setting lport a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c up in Southbound
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.600 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.600 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[24e12a2f-8531-4337-bf5a-71580d259540]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.641 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[5e8da00f-3d51-4226-976a-cc6471a292d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.645 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[5df58ea5-02d8-4e30-be3d-765546f26e01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 NetworkManager[54952]: <info>  [1769042317.6714] device (tap96576974-a0): carrier: link connected
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.676 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[79651e0e-0045-4595-865a-f9be9898202d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.695 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[22429ab3-bdd4-4169-b7ee-2109a88c4a7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap96576974-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:af:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689032, 'reachable_time': 44548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242001, 'error': None, 'target': 'ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.713 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0e032528-1de0-47d1-af4b-ee1dced4dba6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2d:afe7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 689032, 'tstamp': 689032}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242007, 'error': None, 'target': 'ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 podman[241985]: 2026-01-22 00:38:37.720752783 +0000 UTC m=+0.068736881 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, release=1755695350, version=9.6, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.733 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[aab37b06-ebbf-4d3d-9cf9-4d62b4d3df53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap96576974-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:af:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689032, 'reachable_time': 44548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242012, 'error': None, 'target': 'ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.764 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8e7ed015-fa80-41d7-9330-2a76e78e8b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.816 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ef8937-12f6-46f7-a94a-8c24c6efe925]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.817 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96576974-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.817 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.818 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap96576974-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.819 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 NetworkManager[54952]: <info>  [1769042317.8205] manager: (tap96576974-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Jan 22 00:38:37 compute-1 kernel: tap96576974-a0: entered promiscuous mode
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.822 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.825 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap96576974-a0, col_values=(('external_ids', {'iface-id': '4faa3c3e-65cf-4db1-ab38-d3f17011be65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.826 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.827 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 ovn_controller[94841]: 2026-01-22T00:38:37Z|00724|binding|INFO|Releasing lport 4faa3c3e-65cf-4db1-ab38-d3f17011be65 from this chassis (sb_readonly=0)
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.830 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/96576974-adfc-492e-9141-63dd99e1cb25.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/96576974-adfc-492e-9141-63dd99e1cb25.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.831 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e8ebf13f-edc2-4768-a1c6-2692e7a55734]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.831 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-96576974-adfc-492e-9141-63dd99e1cb25
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/96576974-adfc-492e-9141-63dd99e1cb25.pid.haproxy
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 96576974-adfc-492e-9141-63dd99e1cb25
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:38:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:37.832 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25', 'env', 'PROCESS_TAG=haproxy-96576974-adfc-492e-9141-63dd99e1cb25', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/96576974-adfc-492e-9141-63dd99e1cb25.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.839 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.961 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042317.9608204, 34cfe24d-2754-4083-975b-f9775fd743b8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.962 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] VM Started (Lifecycle Event)
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.986 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.991 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042317.9618287, 34cfe24d-2754-4083-975b-f9775fd743b8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:38:37 compute-1 nova_compute[182713]: 2026-01-22 00:38:37.991 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] VM Paused (Lifecycle Event)
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.008 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.014 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.040 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.270 182717 DEBUG nova.compute.manager [req-c90e969e-48c4-4178-b780-0685248c0243 req-661debf2-5ae9-4ffb-a98e-5ee428de2382 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-vif-plugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.272 182717 DEBUG oslo_concurrency.lockutils [req-c90e969e-48c4-4178-b780-0685248c0243 req-661debf2-5ae9-4ffb-a98e-5ee428de2382 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.272 182717 DEBUG oslo_concurrency.lockutils [req-c90e969e-48c4-4178-b780-0685248c0243 req-661debf2-5ae9-4ffb-a98e-5ee428de2382 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.273 182717 DEBUG oslo_concurrency.lockutils [req-c90e969e-48c4-4178-b780-0685248c0243 req-661debf2-5ae9-4ffb-a98e-5ee428de2382 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.273 182717 DEBUG nova.compute.manager [req-c90e969e-48c4-4178-b780-0685248c0243 req-661debf2-5ae9-4ffb-a98e-5ee428de2382 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Processing event network-vif-plugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:38:38 compute-1 podman[242052]: 2026-01-22 00:38:38.277882726 +0000 UTC m=+0.098873894 container create 29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.281 182717 DEBUG nova.compute.manager [req-af5aba68-b42d-4a5e-891c-c7cfe510a3c0 req-8de17f54-0024-4d57-9b3d-f62a461793a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-vif-plugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.281 182717 DEBUG oslo_concurrency.lockutils [req-af5aba68-b42d-4a5e-891c-c7cfe510a3c0 req-8de17f54-0024-4d57-9b3d-f62a461793a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.282 182717 DEBUG oslo_concurrency.lockutils [req-af5aba68-b42d-4a5e-891c-c7cfe510a3c0 req-8de17f54-0024-4d57-9b3d-f62a461793a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.282 182717 DEBUG oslo_concurrency.lockutils [req-af5aba68-b42d-4a5e-891c-c7cfe510a3c0 req-8de17f54-0024-4d57-9b3d-f62a461793a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.282 182717 DEBUG nova.compute.manager [req-af5aba68-b42d-4a5e-891c-c7cfe510a3c0 req-8de17f54-0024-4d57-9b3d-f62a461793a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Processing event network-vif-plugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.283 182717 DEBUG nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.288 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042318.2877953, 34cfe24d-2754-4083-975b-f9775fd743b8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.290 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] VM Resumed (Lifecycle Event)
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.292 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.297 182717 INFO nova.virt.libvirt.driver [-] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Instance spawned successfully.
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.298 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:38:38 compute-1 podman[242052]: 2026-01-22 00:38:38.218317241 +0000 UTC m=+0.039308509 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.322 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:38:38 compute-1 systemd[1]: Started libpod-conmon-29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2.scope.
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.328 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.332 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.332 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.333 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.333 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.333 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.334 182717 DEBUG nova.virt.libvirt.driver [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:38:38 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.363 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:38:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bba8c84212294e81ce6bc6ce3ace65e7a59b470a400b47c30f11353e896a96c7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:38:38 compute-1 podman[242052]: 2026-01-22 00:38:38.383145188 +0000 UTC m=+0.204136376 container init 29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:38:38 compute-1 podman[242052]: 2026-01-22 00:38:38.388527055 +0000 UTC m=+0.209518223 container start 29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.409 182717 INFO nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Took 12.15 seconds to spawn the instance on the hypervisor.
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.410 182717 DEBUG nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:38:38 compute-1 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[242067]: [NOTICE]   (242071) : New worker (242073) forked
Jan 22 00:38:38 compute-1 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[242067]: [NOTICE]   (242071) : Loading success.
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.456 104184 INFO neutron.agent.ovn.metadata.agent [-] Port a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c in datapath 01fa8e13-9f62-4b06-88db-79f2e6ca65b8 unbound from our chassis
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.458 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 01fa8e13-9f62-4b06-88db-79f2e6ca65b8
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.467 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b9eec44d-f501-43c5-a927-a36f2cb14ba5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.468 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap01fa8e13-91 in ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.471 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap01fa8e13-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.471 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8248e580-2992-4200-8ffd-15ee4d6ddd3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.472 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2cdfb7d9-a021-4839-b1f5-04b3e1dc36a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.484 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a73293-1576-46f6-85fd-a14fbe0d172c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.499 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[50dc81c0-3b10-431b-ab2d-b2efe64ea021]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.527 182717 DEBUG nova.network.neutron [req-dd017a30-e732-4324-83c3-8c65aa440c69 req-836fe50f-8732-45dc-9e3c-4e267f427d04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Updated VIF entry in instance network info cache for port a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.528 182717 DEBUG nova.network.neutron [req-dd017a30-e732-4324-83c3-8c65aa440c69 req-836fe50f-8732-45dc-9e3c-4e267f427d04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Updating instance_info_cache with network_info: [{"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.530 182717 INFO nova.compute.manager [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Took 12.77 seconds to build instance.
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.542 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf8dc04-3a59-4acf-94de-204b94a4aef5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 systemd-udevd[241977]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:38:38 compute-1 NetworkManager[54952]: <info>  [1769042318.5525] manager: (tap01fa8e13-90): new Veth device (/org/freedesktop/NetworkManager/Devices/347)
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.552 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f6b4c2-5a3d-482a-910a-10ab417f6740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.555 182717 DEBUG oslo_concurrency.lockutils [req-dd017a30-e732-4324-83c3-8c65aa440c69 req-836fe50f-8732-45dc-9e3c-4e267f427d04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.556 182717 DEBUG oslo_concurrency.lockutils [None req-d3e3bbdd-f547-4686-815e-70606cffa687 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.878s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.603 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8dceba-e540-4fce-ba7d-3fb3dca3d89a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.608 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[1f17b44c-4dee-4fbb-bfd5-d11a79bcaeff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 NetworkManager[54952]: <info>  [1769042318.6484] device (tap01fa8e13-90): carrier: link connected
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.654 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[84356186-8b45-4c56-800a-ad6a610dc07b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.673 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bb127cc3-e1aa-40c2-ac5c-7b88aee60155]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01fa8e13-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:ff:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689130, 'reachable_time': 43458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242092, 'error': None, 'target': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.694 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[76aca50b-c48d-403e-abd3-e6aad03275c8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5f:ff76'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 689130, 'tstamp': 689130}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242093, 'error': None, 'target': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.713 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[71b0e123-a489-4be1-b160-e082bf7c8932]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01fa8e13-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:ff:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689130, 'reachable_time': 43458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242094, 'error': None, 'target': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.753 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[45c58516-6e85-418d-b333-ac632633ab07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.800 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[872dc5b6-bc0a-4975-88e1-eff3769b039b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.801 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01fa8e13-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.802 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.802 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01fa8e13-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:38 compute-1 kernel: tap01fa8e13-90: entered promiscuous mode
Jan 22 00:38:38 compute-1 NetworkManager[54952]: <info>  [1769042318.8064] manager: (tap01fa8e13-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/348)
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.806 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.809 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap01fa8e13-90, col_values=(('external_ids', {'iface-id': '63ad2747-135a-46c8-90ca-ec1def31a1c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:38 compute-1 ovn_controller[94841]: 2026-01-22T00:38:38Z|00725|binding|INFO|Releasing lport 63ad2747-135a-46c8-90ca-ec1def31a1c2 from this chassis (sb_readonly=0)
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.812 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/01fa8e13-9f62-4b06-88db-79f2e6ca65b8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/01fa8e13-9f62-4b06-88db-79f2e6ca65b8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.812 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.813 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[60c80680-1dd7-4ba0-98a5-b0e49b1a5957]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.814 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-01fa8e13-9f62-4b06-88db-79f2e6ca65b8
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/01fa8e13-9f62-4b06-88db-79f2e6ca65b8.pid.haproxy
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 01fa8e13-9f62-4b06-88db-79f2e6ca65b8
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:38:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:38:38.815 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'env', 'PROCESS_TAG=haproxy-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/01fa8e13-9f62-4b06-88db-79f2e6ca65b8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:38:38 compute-1 nova_compute[182713]: 2026-01-22 00:38:38.824 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:39 compute-1 podman[242124]: 2026-01-22 00:38:39.234447387 +0000 UTC m=+0.082160958 container create cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:38:39 compute-1 podman[242124]: 2026-01-22 00:38:39.176932545 +0000 UTC m=+0.024646156 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:38:39 compute-1 systemd[1]: Started libpod-conmon-cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1.scope.
Jan 22 00:38:39 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:38:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/134116da2f97004a08a09ae03729bbecb48b4b08aeb445da51f6e676835fa1b5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:38:39 compute-1 podman[242124]: 2026-01-22 00:38:39.352945939 +0000 UTC m=+0.200659540 container init cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:38:39 compute-1 podman[242124]: 2026-01-22 00:38:39.365775546 +0000 UTC m=+0.213489117 container start cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 00:38:39 compute-1 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[242139]: [NOTICE]   (242143) : New worker (242145) forked
Jan 22 00:38:39 compute-1 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[242139]: [NOTICE]   (242143) : Loading success.
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.371 182717 DEBUG nova.compute.manager [req-fc7ffad3-d958-4dc0-a0a4-94cfde58fe4d req-2c37f088-1203-4e7f-939e-4e602c758b1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-vif-plugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.372 182717 DEBUG oslo_concurrency.lockutils [req-fc7ffad3-d958-4dc0-a0a4-94cfde58fe4d req-2c37f088-1203-4e7f-939e-4e602c758b1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.373 182717 DEBUG oslo_concurrency.lockutils [req-fc7ffad3-d958-4dc0-a0a4-94cfde58fe4d req-2c37f088-1203-4e7f-939e-4e602c758b1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.373 182717 DEBUG oslo_concurrency.lockutils [req-fc7ffad3-d958-4dc0-a0a4-94cfde58fe4d req-2c37f088-1203-4e7f-939e-4e602c758b1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.373 182717 DEBUG nova.compute.manager [req-fc7ffad3-d958-4dc0-a0a4-94cfde58fe4d req-2c37f088-1203-4e7f-939e-4e602c758b1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] No waiting events found dispatching network-vif-plugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.374 182717 WARNING nova.compute.manager [req-fc7ffad3-d958-4dc0-a0a4-94cfde58fe4d req-2c37f088-1203-4e7f-939e-4e602c758b1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received unexpected event network-vif-plugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c for instance with vm_state active and task_state None.
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.375 182717 DEBUG nova.compute.manager [req-52234063-43ed-4b69-a709-98bc524e8072 req-492a60d4-d5c3-459f-aadf-62340a2c414b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-vif-plugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.375 182717 DEBUG oslo_concurrency.lockutils [req-52234063-43ed-4b69-a709-98bc524e8072 req-492a60d4-d5c3-459f-aadf-62340a2c414b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.375 182717 DEBUG oslo_concurrency.lockutils [req-52234063-43ed-4b69-a709-98bc524e8072 req-492a60d4-d5c3-459f-aadf-62340a2c414b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.375 182717 DEBUG oslo_concurrency.lockutils [req-52234063-43ed-4b69-a709-98bc524e8072 req-492a60d4-d5c3-459f-aadf-62340a2c414b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.376 182717 DEBUG nova.compute.manager [req-52234063-43ed-4b69-a709-98bc524e8072 req-492a60d4-d5c3-459f-aadf-62340a2c414b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] No waiting events found dispatching network-vif-plugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.376 182717 WARNING nova.compute.manager [req-52234063-43ed-4b69-a709-98bc524e8072 req-492a60d4-d5c3-459f-aadf-62340a2c414b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received unexpected event network-vif-plugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d for instance with vm_state active and task_state None.
Jan 22 00:38:40 compute-1 nova_compute[182713]: 2026-01-22 00:38:40.616 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:41 compute-1 nova_compute[182713]: 2026-01-22 00:38:41.301 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:41 compute-1 nova_compute[182713]: 2026-01-22 00:38:41.882 182717 DEBUG nova.compute.manager [req-067bc129-64be-48a6-937c-6ae9c5d9d37d req-c420f599-e865-4518-a943-8218ecc7664d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-changed-6e500b56-ad6b-4af0-9a28-7bd95fe3781d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:38:41 compute-1 nova_compute[182713]: 2026-01-22 00:38:41.883 182717 DEBUG nova.compute.manager [req-067bc129-64be-48a6-937c-6ae9c5d9d37d req-c420f599-e865-4518-a943-8218ecc7664d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Refreshing instance network info cache due to event network-changed-6e500b56-ad6b-4af0-9a28-7bd95fe3781d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:38:41 compute-1 nova_compute[182713]: 2026-01-22 00:38:41.884 182717 DEBUG oslo_concurrency.lockutils [req-067bc129-64be-48a6-937c-6ae9c5d9d37d req-c420f599-e865-4518-a943-8218ecc7664d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:38:41 compute-1 nova_compute[182713]: 2026-01-22 00:38:41.884 182717 DEBUG oslo_concurrency.lockutils [req-067bc129-64be-48a6-937c-6ae9c5d9d37d req-c420f599-e865-4518-a943-8218ecc7664d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:38:41 compute-1 nova_compute[182713]: 2026-01-22 00:38:41.885 182717 DEBUG nova.network.neutron [req-067bc129-64be-48a6-937c-6ae9c5d9d37d req-c420f599-e865-4518-a943-8218ecc7664d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Refreshing network info cache for port 6e500b56-ad6b-4af0-9a28-7bd95fe3781d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:38:43 compute-1 nova_compute[182713]: 2026-01-22 00:38:43.864 182717 DEBUG nova.network.neutron [req-067bc129-64be-48a6-937c-6ae9c5d9d37d req-c420f599-e865-4518-a943-8218ecc7664d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Updated VIF entry in instance network info cache for port 6e500b56-ad6b-4af0-9a28-7bd95fe3781d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:38:43 compute-1 nova_compute[182713]: 2026-01-22 00:38:43.865 182717 DEBUG nova.network.neutron [req-067bc129-64be-48a6-937c-6ae9c5d9d37d req-c420f599-e865-4518-a943-8218ecc7664d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Updating instance_info_cache with network_info: [{"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:38:43 compute-1 nova_compute[182713]: 2026-01-22 00:38:43.890 182717 DEBUG oslo_concurrency.lockutils [req-067bc129-64be-48a6-937c-6ae9c5d9d37d req-c420f599-e865-4518-a943-8218ecc7664d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:38:45 compute-1 nova_compute[182713]: 2026-01-22 00:38:45.620 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:46 compute-1 nova_compute[182713]: 2026-01-22 00:38:46.303 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:47 compute-1 podman[242155]: 2026-01-22 00:38:47.567600388 +0000 UTC m=+0.054707976 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:38:47 compute-1 podman[242154]: 2026-01-22 00:38:47.638866697 +0000 UTC m=+0.109900607 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 00:38:50 compute-1 nova_compute[182713]: 2026-01-22 00:38:50.624 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:51 compute-1 nova_compute[182713]: 2026-01-22 00:38:51.305 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:51 compute-1 ovn_controller[94841]: 2026-01-22T00:38:51Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:b9:32 10.100.0.6
Jan 22 00:38:51 compute-1 ovn_controller[94841]: 2026-01-22T00:38:51Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:b9:32 10.100.0.6
Jan 22 00:38:53 compute-1 podman[242222]: 2026-01-22 00:38:53.560459084 +0000 UTC m=+0.053927562 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:38:53 compute-1 podman[242223]: 2026-01-22 00:38:53.56870942 +0000 UTC m=+0.053610603 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:38:55 compute-1 nova_compute[182713]: 2026-01-22 00:38:55.628 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:56 compute-1 nova_compute[182713]: 2026-01-22 00:38:56.257 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:56 compute-1 nova_compute[182713]: 2026-01-22 00:38:56.280 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Triggering sync for uuid 34cfe24d-2754-4083-975b-f9775fd743b8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 00:38:56 compute-1 nova_compute[182713]: 2026-01-22 00:38:56.281 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:56 compute-1 nova_compute[182713]: 2026-01-22 00:38:56.281 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "34cfe24d-2754-4083-975b-f9775fd743b8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:56 compute-1 nova_compute[182713]: 2026-01-22 00:38:56.308 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:56 compute-1 nova_compute[182713]: 2026-01-22 00:38:56.314 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "34cfe24d-2754-4083-975b-f9775fd743b8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:00 compute-1 nova_compute[182713]: 2026-01-22 00:39:00.632 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.311 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.565 182717 DEBUG nova.compute.manager [req-2fe7cc8f-a170-49fe-b05f-cab6bf4265b2 req-34dc5019-1b03-4ada-b238-ef6522b9c15e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-changed-6e500b56-ad6b-4af0-9a28-7bd95fe3781d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.566 182717 DEBUG nova.compute.manager [req-2fe7cc8f-a170-49fe-b05f-cab6bf4265b2 req-34dc5019-1b03-4ada-b238-ef6522b9c15e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Refreshing instance network info cache due to event network-changed-6e500b56-ad6b-4af0-9a28-7bd95fe3781d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.566 182717 DEBUG oslo_concurrency.lockutils [req-2fe7cc8f-a170-49fe-b05f-cab6bf4265b2 req-34dc5019-1b03-4ada-b238-ef6522b9c15e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.566 182717 DEBUG oslo_concurrency.lockutils [req-2fe7cc8f-a170-49fe-b05f-cab6bf4265b2 req-34dc5019-1b03-4ada-b238-ef6522b9c15e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.567 182717 DEBUG nova.network.neutron [req-2fe7cc8f-a170-49fe-b05f-cab6bf4265b2 req-34dc5019-1b03-4ada-b238-ef6522b9c15e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Refreshing network info cache for port 6e500b56-ad6b-4af0-9a28-7bd95fe3781d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.675 182717 DEBUG oslo_concurrency.lockutils [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.676 182717 DEBUG oslo_concurrency.lockutils [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.676 182717 DEBUG oslo_concurrency.lockutils [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.676 182717 DEBUG oslo_concurrency.lockutils [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.677 182717 DEBUG oslo_concurrency.lockutils [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.694 182717 INFO nova.compute.manager [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Terminating instance
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.705 182717 DEBUG nova.compute.manager [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:39:01 compute-1 kernel: tap6e500b56-ad (unregistering): left promiscuous mode
Jan 22 00:39:01 compute-1 NetworkManager[54952]: <info>  [1769042341.7345] device (tap6e500b56-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:39:01 compute-1 ovn_controller[94841]: 2026-01-22T00:39:01Z|00726|binding|INFO|Releasing lport 6e500b56-ad6b-4af0-9a28-7bd95fe3781d from this chassis (sb_readonly=0)
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.745 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:01 compute-1 ovn_controller[94841]: 2026-01-22T00:39:01Z|00727|binding|INFO|Setting lport 6e500b56-ad6b-4af0-9a28-7bd95fe3781d down in Southbound
Jan 22 00:39:01 compute-1 kernel: tapa3b2b7b3-2c (unregistering): left promiscuous mode
Jan 22 00:39:01 compute-1 ovn_controller[94841]: 2026-01-22T00:39:01Z|00728|binding|INFO|Removing iface tap6e500b56-ad ovn-installed in OVS
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.751 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:01 compute-1 NetworkManager[54952]: <info>  [1769042341.7561] device (tapa3b2b7b3-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:39:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:01.759 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:b9:32 10.100.0.6'], port_security=['fa:16:3e:19:b9:32 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '34cfe24d-2754-4083-975b-f9775fd743b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-96576974-adfc-492e-9141-63dd99e1cb25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4381a94a-5b04-4450-b603-573605756783', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=700861ed-e604-4e52-bc1a-65ca23f63d88, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=6e500b56-ad6b-4af0-9a28-7bd95fe3781d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:39:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:01.763 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 6e500b56-ad6b-4af0-9a28-7bd95fe3781d in datapath 96576974-adfc-492e-9141-63dd99e1cb25 unbound from our chassis
Jan 22 00:39:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:01.765 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 96576974-adfc-492e-9141-63dd99e1cb25, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:39:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:01.767 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[70f01506-6b6e-49d5-875f-269df11082a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:01.768 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25 namespace which is not needed anymore
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.796 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:01 compute-1 ovn_controller[94841]: 2026-01-22T00:39:01Z|00729|binding|INFO|Releasing lport a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c from this chassis (sb_readonly=0)
Jan 22 00:39:01 compute-1 ovn_controller[94841]: 2026-01-22T00:39:01Z|00730|binding|INFO|Setting lport a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c down in Southbound
Jan 22 00:39:01 compute-1 ovn_controller[94841]: 2026-01-22T00:39:01Z|00731|binding|INFO|Removing iface tapa3b2b7b3-2c ovn-installed in OVS
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.801 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:01 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:01.809 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:7d:b6 2001:db8:0:1:f816:3eff:fe5f:7db6 2001:db8::f816:3eff:fe5f:7db6'], port_security=['fa:16:3e:5f:7d:b6 2001:db8:0:1:f816:3eff:fe5f:7db6 2001:db8::f816:3eff:fe5f:7db6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe5f:7db6/64 2001:db8::f816:3eff:fe5f:7db6/64', 'neutron:device_id': '34cfe24d-2754-4083-975b-f9775fd743b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4381a94a-5b04-4450-b603-573605756783', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eaa35b5e-130a-4933-a219-b6429231aa8c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:39:01 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.828 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:01 compute-1 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000b0.scope: Deactivated successfully.
Jan 22 00:39:01 compute-1 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000b0.scope: Consumed 13.897s CPU time.
Jan 22 00:39:01 compute-1 systemd-machined[153970]: Machine qemu-76-instance-000000b0 terminated.
Jan 22 00:39:01 compute-1 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[242067]: [NOTICE]   (242071) : haproxy version is 2.8.14-c23fe91
Jan 22 00:39:01 compute-1 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[242067]: [NOTICE]   (242071) : path to executable is /usr/sbin/haproxy
Jan 22 00:39:01 compute-1 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[242067]: [WARNING]  (242071) : Exiting Master process...
Jan 22 00:39:01 compute-1 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[242067]: [ALERT]    (242071) : Current worker (242073) exited with code 143 (Terminated)
Jan 22 00:39:01 compute-1 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[242067]: [WARNING]  (242071) : All workers exited. Exiting... (0)
Jan 22 00:39:01 compute-1 systemd[1]: libpod-29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2.scope: Deactivated successfully.
Jan 22 00:39:01 compute-1 NetworkManager[54952]: <info>  [1769042341.9528] manager: (tapa3b2b7b3-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/349)
Jan 22 00:39:01 compute-1 podman[242294]: 2026-01-22 00:39:01.956996871 +0000 UTC m=+0.053527769 container died 29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:39:01 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2-userdata-shm.mount: Deactivated successfully.
Jan 22 00:39:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-bba8c84212294e81ce6bc6ce3ace65e7a59b470a400b47c30f11353e896a96c7-merged.mount: Deactivated successfully.
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:01.999 182717 INFO nova.virt.libvirt.driver [-] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Instance destroyed successfully.
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.000 182717 DEBUG nova.objects.instance [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid 34cfe24d-2754-4083-975b-f9775fd743b8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:39:02 compute-1 podman[242294]: 2026-01-22 00:39:02.004558835 +0000 UTC m=+0.101089723 container cleanup 29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:39:02 compute-1 systemd[1]: libpod-conmon-29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2.scope: Deactivated successfully.
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.016 182717 DEBUG nova.virt.libvirt.vif [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1282444667',display_name='tempest-TestGettingAddress-server-1282444667',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1282444667',id=176,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:38:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jd0666ok',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:38:38Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=34cfe24d-2754-4083-975b-f9775fd743b8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.016 182717 DEBUG nova.network.os_vif_util [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.018 182717 DEBUG nova.network.os_vif_util [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:19:b9:32,bridge_name='br-int',has_traffic_filtering=True,id=6e500b56-ad6b-4af0-9a28-7bd95fe3781d,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e500b56-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.018 182717 DEBUG os_vif [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:b9:32,bridge_name='br-int',has_traffic_filtering=True,id=6e500b56-ad6b-4af0-9a28-7bd95fe3781d,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e500b56-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.020 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.020 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e500b56-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.021 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.024 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.026 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.032 182717 INFO os_vif [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:b9:32,bridge_name='br-int',has_traffic_filtering=True,id=6e500b56-ad6b-4af0-9a28-7bd95fe3781d,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e500b56-ad')
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.032 182717 DEBUG nova.virt.libvirt.vif [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1282444667',display_name='tempest-TestGettingAddress-server-1282444667',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1282444667',id=176,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:38:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jd0666ok',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:38:38Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=34cfe24d-2754-4083-975b-f9775fd743b8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.033 182717 DEBUG nova.network.os_vif_util [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.034 182717 DEBUG nova.network.os_vif_util [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:7d:b6,bridge_name='br-int',has_traffic_filtering=True,id=a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3b2b7b3-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.034 182717 DEBUG os_vif [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:7d:b6,bridge_name='br-int',has_traffic_filtering=True,id=a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3b2b7b3-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.035 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.035 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3b2b7b3-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.037 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.039 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.041 182717 INFO os_vif [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:7d:b6,bridge_name='br-int',has_traffic_filtering=True,id=a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3b2b7b3-2c')
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.042 182717 INFO nova.virt.libvirt.driver [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Deleting instance files /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8_del
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.043 182717 INFO nova.virt.libvirt.driver [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Deletion of /var/lib/nova/instances/34cfe24d-2754-4083-975b-f9775fd743b8_del complete
Jan 22 00:39:02 compute-1 podman[242350]: 2026-01-22 00:39:02.080770596 +0000 UTC m=+0.052132736 container remove 29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.087 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a68ecd18-2fae-4d90-9e40-04a5239f9ea7]: (4, ('Thu Jan 22 12:39:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25 (29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2)\n29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2\nThu Jan 22 12:39:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25 (29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2)\n29837a2f8595f39956ae6e57392fdc8ccbffec27fdb56ac6baca9cfbd340bfd2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.089 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ae186a4d-006b-4299-8a24-471f3da1945d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.092 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96576974-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.094 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:02 compute-1 kernel: tap96576974-a0: left promiscuous mode
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.096 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.101 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[36ccf276-b2fb-4263-ba2b-b70f3269949b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.106 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.122 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[87354de8-a5fd-49b2-84f8-cc46c9d37851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.124 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[df1cf565-415b-400d-940b-ec387d2ad6d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.136 182717 INFO nova.compute.manager [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.137 182717 DEBUG oslo.service.loopingcall [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.137 182717 DEBUG nova.compute.manager [-] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.137 182717 DEBUG nova.network.neutron [-] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.141 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7d39963b-160e-447b-bce2-ff7586c50081]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689022, 'reachable_time': 27683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242366, 'error': None, 'target': 'ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.144 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.145 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[0334b4a7-f97a-454b-a0c6-f5f0c6b4247a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.145 104184 INFO neutron.agent.ovn.metadata.agent [-] Port a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c in datapath 01fa8e13-9f62-4b06-88db-79f2e6ca65b8 unbound from our chassis
Jan 22 00:39:02 compute-1 systemd[1]: run-netns-ovnmeta\x2d96576974\x2dadfc\x2d492e\x2d9141\x2d63dd99e1cb25.mount: Deactivated successfully.
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.147 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01fa8e13-9f62-4b06-88db-79f2e6ca65b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.147 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[845e87f1-96f7-404d-8d17-8e3fe779d3fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.148 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8 namespace which is not needed anymore
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.303 182717 DEBUG nova.compute.manager [req-838328e0-e2fa-430b-b299-70590e6a9865 req-45d5df20-7fdd-440c-9bd3-8d5079b8eb77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-vif-unplugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.304 182717 DEBUG oslo_concurrency.lockutils [req-838328e0-e2fa-430b-b299-70590e6a9865 req-45d5df20-7fdd-440c-9bd3-8d5079b8eb77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.304 182717 DEBUG oslo_concurrency.lockutils [req-838328e0-e2fa-430b-b299-70590e6a9865 req-45d5df20-7fdd-440c-9bd3-8d5079b8eb77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.305 182717 DEBUG oslo_concurrency.lockutils [req-838328e0-e2fa-430b-b299-70590e6a9865 req-45d5df20-7fdd-440c-9bd3-8d5079b8eb77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.305 182717 DEBUG nova.compute.manager [req-838328e0-e2fa-430b-b299-70590e6a9865 req-45d5df20-7fdd-440c-9bd3-8d5079b8eb77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] No waiting events found dispatching network-vif-unplugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.305 182717 DEBUG nova.compute.manager [req-838328e0-e2fa-430b-b299-70590e6a9865 req-45d5df20-7fdd-440c-9bd3-8d5079b8eb77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-vif-unplugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:39:02 compute-1 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[242139]: [NOTICE]   (242143) : haproxy version is 2.8.14-c23fe91
Jan 22 00:39:02 compute-1 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[242139]: [NOTICE]   (242143) : path to executable is /usr/sbin/haproxy
Jan 22 00:39:02 compute-1 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[242139]: [WARNING]  (242143) : Exiting Master process...
Jan 22 00:39:02 compute-1 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[242139]: [ALERT]    (242143) : Current worker (242145) exited with code 143 (Terminated)
Jan 22 00:39:02 compute-1 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[242139]: [WARNING]  (242143) : All workers exited. Exiting... (0)
Jan 22 00:39:02 compute-1 systemd[1]: libpod-cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1.scope: Deactivated successfully.
Jan 22 00:39:02 compute-1 podman[242384]: 2026-01-22 00:39:02.408493821 +0000 UTC m=+0.153271450 container died cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:39:02 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1-userdata-shm.mount: Deactivated successfully.
Jan 22 00:39:02 compute-1 systemd[1]: var-lib-containers-storage-overlay-134116da2f97004a08a09ae03729bbecb48b4b08aeb445da51f6e676835fa1b5-merged.mount: Deactivated successfully.
Jan 22 00:39:02 compute-1 podman[242384]: 2026-01-22 00:39:02.636372582 +0000 UTC m=+0.381150211 container cleanup cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:39:02 compute-1 systemd[1]: libpod-conmon-cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1.scope: Deactivated successfully.
Jan 22 00:39:02 compute-1 podman[242412]: 2026-01-22 00:39:02.722207582 +0000 UTC m=+0.053889901 container remove cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.728 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa4aef1-b3bd-4051-b883-47c98e7d79f5]: (4, ('Thu Jan 22 12:39:02 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8 (cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1)\ncdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1\nThu Jan 22 12:39:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8 (cdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1)\ncdafecea88256e85ed036288cf7711ac75785120cfb964a4e12c455c465b31d1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.729 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ca82df3c-c880-49f8-9bd6-a41fb2e3043e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.730 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01fa8e13-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.732 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:02 compute-1 kernel: tap01fa8e13-90: left promiscuous mode
Jan 22 00:39:02 compute-1 nova_compute[182713]: 2026-01-22 00:39:02.747 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.750 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c1af9d99-0bc9-4bfa-ae85-880d0aeada3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.767 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[53fe4c30-fb1a-4419-9fdc-a10811e493aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.768 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[371854d9-8c83-4594-995a-b7797522c761]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.794 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c011c821-62a4-4548-9f7d-cecef3d2a6e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689118, 'reachable_time': 23818, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242431, 'error': None, 'target': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.796 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:39:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:02.797 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[e287fa23-2592-4b20-95f6-34ef898c4138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:02 compute-1 systemd[1]: run-netns-ovnmeta\x2d01fa8e13\x2d9f62\x2d4b06\x2d88db\x2d79f2e6ca65b8.mount: Deactivated successfully.
Jan 22 00:39:03 compute-1 podman[198560]: time="2026-01-22T00:39:03Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 22 00:39:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:03.051 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:03.051 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:03.051 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:03 compute-1 podman[198560]: @ - - [22/Jan/2026:00:39:03 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 21519 "" "Go-http-client/1.1"
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.271 182717 DEBUG nova.network.neutron [req-2fe7cc8f-a170-49fe-b05f-cab6bf4265b2 req-34dc5019-1b03-4ada-b238-ef6522b9c15e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Updated VIF entry in instance network info cache for port 6e500b56-ad6b-4af0-9a28-7bd95fe3781d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.272 182717 DEBUG nova.network.neutron [req-2fe7cc8f-a170-49fe-b05f-cab6bf4265b2 req-34dc5019-1b03-4ada-b238-ef6522b9c15e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Updating instance_info_cache with network_info: [{"id": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "address": "fa:16:3e:19:b9:32", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e500b56-ad", "ovs_interfaceid": "6e500b56-ad6b-4af0-9a28-7bd95fe3781d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "address": "fa:16:3e:5f:7d:b6", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5f:7db6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3b2b7b3-2c", "ovs_interfaceid": "a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.296 182717 DEBUG oslo_concurrency.lockutils [req-2fe7cc8f-a170-49fe-b05f-cab6bf4265b2 req-34dc5019-1b03-4ada-b238-ef6522b9c15e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-34cfe24d-2754-4083-975b-f9775fd743b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.499 182717 DEBUG nova.network.neutron [-] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.522 182717 INFO nova.compute.manager [-] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Took 1.39 seconds to deallocate network for instance.
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.615 182717 DEBUG oslo_concurrency.lockutils [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.616 182717 DEBUG oslo_concurrency.lockutils [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.649 182717 DEBUG nova.compute.manager [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-vif-unplugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.649 182717 DEBUG oslo_concurrency.lockutils [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.650 182717 DEBUG oslo_concurrency.lockutils [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.650 182717 DEBUG oslo_concurrency.lockutils [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.650 182717 DEBUG nova.compute.manager [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] No waiting events found dispatching network-vif-unplugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.651 182717 WARNING nova.compute.manager [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received unexpected event network-vif-unplugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d for instance with vm_state deleted and task_state None.
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.651 182717 DEBUG nova.compute.manager [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-vif-plugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.651 182717 DEBUG oslo_concurrency.lockutils [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.651 182717 DEBUG oslo_concurrency.lockutils [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.651 182717 DEBUG oslo_concurrency.lockutils [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.652 182717 DEBUG nova.compute.manager [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] No waiting events found dispatching network-vif-plugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.652 182717 WARNING nova.compute.manager [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received unexpected event network-vif-plugged-6e500b56-ad6b-4af0-9a28-7bd95fe3781d for instance with vm_state deleted and task_state None.
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.652 182717 DEBUG nova.compute.manager [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-vif-deleted-6e500b56-ad6b-4af0-9a28-7bd95fe3781d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.652 182717 DEBUG nova.compute.manager [req-f844a704-39cf-4c61-b669-df957f75bedc req-d6aa0276-6298-49cc-ac30-89796a40f776 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-vif-deleted-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.680 182717 DEBUG nova.compute.provider_tree [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.701 182717 DEBUG nova.scheduler.client.report [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.727 182717 DEBUG oslo_concurrency.lockutils [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.757 182717 INFO nova.scheduler.client.report [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance 34cfe24d-2754-4083-975b-f9775fd743b8
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.865 182717 DEBUG oslo_concurrency.lockutils [None req-cac25a29-7053-464c-ad7a-8d7895869721 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:03 compute-1 nova_compute[182713]: 2026-01-22 00:39:03.874 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:04 compute-1 nova_compute[182713]: 2026-01-22 00:39:04.415 182717 DEBUG nova.compute.manager [req-80ac3981-1014-4af2-aba3-36bf93608e4f req-dbcf8d45-c3fc-4def-9e55-702b4a7ed0fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received event network-vif-plugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:39:04 compute-1 nova_compute[182713]: 2026-01-22 00:39:04.416 182717 DEBUG oslo_concurrency.lockutils [req-80ac3981-1014-4af2-aba3-36bf93608e4f req-dbcf8d45-c3fc-4def-9e55-702b4a7ed0fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:04 compute-1 nova_compute[182713]: 2026-01-22 00:39:04.416 182717 DEBUG oslo_concurrency.lockutils [req-80ac3981-1014-4af2-aba3-36bf93608e4f req-dbcf8d45-c3fc-4def-9e55-702b4a7ed0fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:04 compute-1 nova_compute[182713]: 2026-01-22 00:39:04.416 182717 DEBUG oslo_concurrency.lockutils [req-80ac3981-1014-4af2-aba3-36bf93608e4f req-dbcf8d45-c3fc-4def-9e55-702b4a7ed0fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "34cfe24d-2754-4083-975b-f9775fd743b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:04 compute-1 nova_compute[182713]: 2026-01-22 00:39:04.416 182717 DEBUG nova.compute.manager [req-80ac3981-1014-4af2-aba3-36bf93608e4f req-dbcf8d45-c3fc-4def-9e55-702b4a7ed0fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] No waiting events found dispatching network-vif-plugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:39:04 compute-1 nova_compute[182713]: 2026-01-22 00:39:04.417 182717 WARNING nova.compute.manager [req-80ac3981-1014-4af2-aba3-36bf93608e4f req-dbcf8d45-c3fc-4def-9e55-702b4a7ed0fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Received unexpected event network-vif-plugged-a3b2b7b3-2c20-4521-88bf-4bd738cbbb5c for instance with vm_state deleted and task_state None.
Jan 22 00:39:04 compute-1 nova_compute[182713]: 2026-01-22 00:39:04.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:05 compute-1 nova_compute[182713]: 2026-01-22 00:39:05.722 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:05 compute-1 nova_compute[182713]: 2026-01-22 00:39:05.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:06 compute-1 nova_compute[182713]: 2026-01-22 00:39:06.266 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:06 compute-1 podman[242432]: 2026-01-22 00:39:06.596095756 +0000 UTC m=+0.088287316 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 00:39:06 compute-1 nova_compute[182713]: 2026-01-22 00:39:06.859 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:06 compute-1 nova_compute[182713]: 2026-01-22 00:39:06.859 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:39:07 compute-1 nova_compute[182713]: 2026-01-22 00:39:07.039 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:07.410 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:39:07 compute-1 nova_compute[182713]: 2026-01-22 00:39:07.411 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:07.412 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:39:08 compute-1 podman[242453]: 2026-01-22 00:39:08.599080763 +0000 UTC m=+0.079807955 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git)
Jan 22 00:39:10 compute-1 nova_compute[182713]: 2026-01-22 00:39:10.724 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:10 compute-1 nova_compute[182713]: 2026-01-22 00:39:10.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:11 compute-1 nova_compute[182713]: 2026-01-22 00:39:11.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:12 compute-1 nova_compute[182713]: 2026-01-22 00:39:12.043 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:12 compute-1 nova_compute[182713]: 2026-01-22 00:39:12.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:13 compute-1 nova_compute[182713]: 2026-01-22 00:39:13.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:13 compute-1 nova_compute[182713]: 2026-01-22 00:39:13.885 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:13 compute-1 nova_compute[182713]: 2026-01-22 00:39:13.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:13 compute-1 nova_compute[182713]: 2026-01-22 00:39:13.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:13 compute-1 nova_compute[182713]: 2026-01-22 00:39:13.886 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:39:14 compute-1 nova_compute[182713]: 2026-01-22 00:39:14.084 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:39:14 compute-1 nova_compute[182713]: 2026-01-22 00:39:14.085 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5711MB free_disk=73.19327163696289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:39:14 compute-1 nova_compute[182713]: 2026-01-22 00:39:14.085 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:14 compute-1 nova_compute[182713]: 2026-01-22 00:39:14.085 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:14 compute-1 nova_compute[182713]: 2026-01-22 00:39:14.157 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:39:14 compute-1 nova_compute[182713]: 2026-01-22 00:39:14.158 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:39:14 compute-1 nova_compute[182713]: 2026-01-22 00:39:14.180 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:39:14 compute-1 nova_compute[182713]: 2026-01-22 00:39:14.195 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:39:14 compute-1 nova_compute[182713]: 2026-01-22 00:39:14.214 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:39:14 compute-1 nova_compute[182713]: 2026-01-22 00:39:14.214 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:15 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:15.415 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:15 compute-1 sshd-session[242475]: Invalid user sol from 92.118.39.95 port 45154
Jan 22 00:39:15 compute-1 sshd-session[242475]: Connection closed by invalid user sol 92.118.39.95 port 45154 [preauth]
Jan 22 00:39:15 compute-1 nova_compute[182713]: 2026-01-22 00:39:15.726 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:16 compute-1 nova_compute[182713]: 2026-01-22 00:39:16.214 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:16 compute-1 nova_compute[182713]: 2026-01-22 00:39:16.214 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:39:16 compute-1 nova_compute[182713]: 2026-01-22 00:39:16.215 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:39:16 compute-1 nova_compute[182713]: 2026-01-22 00:39:16.235 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:39:16 compute-1 nova_compute[182713]: 2026-01-22 00:39:16.998 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042341.995626, 34cfe24d-2754-4083-975b-f9775fd743b8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:39:17 compute-1 nova_compute[182713]: 2026-01-22 00:39:16.999 182717 INFO nova.compute.manager [-] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] VM Stopped (Lifecycle Event)
Jan 22 00:39:17 compute-1 nova_compute[182713]: 2026-01-22 00:39:17.030 182717 DEBUG nova.compute.manager [None req-6fffbff5-fc35-4dfb-8f6b-55ca445523a1 - - - - - -] [instance: 34cfe24d-2754-4083-975b-f9775fd743b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:39:17 compute-1 nova_compute[182713]: 2026-01-22 00:39:17.046 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:17 compute-1 nova_compute[182713]: 2026-01-22 00:39:17.604 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:17 compute-1 nova_compute[182713]: 2026-01-22 00:39:17.717 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:18 compute-1 podman[242479]: 2026-01-22 00:39:18.59923507 +0000 UTC m=+0.078472643 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:39:18 compute-1 podman[242478]: 2026-01-22 00:39:18.624801192 +0000 UTC m=+0.106164860 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:39:20 compute-1 nova_compute[182713]: 2026-01-22 00:39:20.728 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:22 compute-1 nova_compute[182713]: 2026-01-22 00:39:22.050 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:24 compute-1 podman[242524]: 2026-01-22 00:39:24.565879803 +0000 UTC m=+0.055263154 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:39:24 compute-1 podman[242523]: 2026-01-22 00:39:24.573007984 +0000 UTC m=+0.063626752 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 22 00:39:25 compute-1 nova_compute[182713]: 2026-01-22 00:39:25.730 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:27 compute-1 nova_compute[182713]: 2026-01-22 00:39:27.053 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:30 compute-1 nova_compute[182713]: 2026-01-22 00:39:30.732 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:32 compute-1 nova_compute[182713]: 2026-01-22 00:39:32.057 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:35 compute-1 nova_compute[182713]: 2026-01-22 00:39:35.735 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:37 compute-1 nova_compute[182713]: 2026-01-22 00:39:37.060 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:37 compute-1 podman[242564]: 2026-01-22 00:39:37.611792266 +0000 UTC m=+0.095482460 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:39:39 compute-1 podman[242585]: 2026-01-22 00:39:39.580646783 +0000 UTC m=+0.075987045 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.tags=minimal rhel9)
Jan 22 00:39:40 compute-1 nova_compute[182713]: 2026-01-22 00:39:40.737 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:42 compute-1 nova_compute[182713]: 2026-01-22 00:39:42.063 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:45 compute-1 nova_compute[182713]: 2026-01-22 00:39:45.739 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:47 compute-1 nova_compute[182713]: 2026-01-22 00:39:47.067 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:49 compute-1 podman[242608]: 2026-01-22 00:39:49.568688034 +0000 UTC m=+0.067782462 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:39:49 compute-1 podman[242609]: 2026-01-22 00:39:49.58728997 +0000 UTC m=+0.079895296 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:39:50 compute-1 nova_compute[182713]: 2026-01-22 00:39:50.742 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:52 compute-1 nova_compute[182713]: 2026-01-22 00:39:52.071 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:55 compute-1 podman[242660]: 2026-01-22 00:39:55.565065627 +0000 UTC m=+0.056474961 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 22 00:39:55 compute-1 podman[242661]: 2026-01-22 00:39:55.573131807 +0000 UTC m=+0.057186504 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:39:55 compute-1 nova_compute[182713]: 2026-01-22 00:39:55.744 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:57 compute-1 nova_compute[182713]: 2026-01-22 00:39:57.092 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:58.523 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:39:58 compute-1 nova_compute[182713]: 2026-01-22 00:39:58.524 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:39:58.525 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:40:00 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:00.527 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:00 compute-1 nova_compute[182713]: 2026-01-22 00:40:00.746 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:02 compute-1 nova_compute[182713]: 2026-01-22 00:40:02.094 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:03.052 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:03.053 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:03.053 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:04 compute-1 nova_compute[182713]: 2026-01-22 00:40:04.874 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:05 compute-1 nova_compute[182713]: 2026-01-22 00:40:05.748 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:05 compute-1 ovn_controller[94841]: 2026-01-22T00:40:05Z|00732|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 22 00:40:05 compute-1 nova_compute[182713]: 2026-01-22 00:40:05.850 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:05 compute-1 nova_compute[182713]: 2026-01-22 00:40:05.873 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:06 compute-1 nova_compute[182713]: 2026-01-22 00:40:06.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:06 compute-1 nova_compute[182713]: 2026-01-22 00:40:06.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:40:07 compute-1 nova_compute[182713]: 2026-01-22 00:40:07.097 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:07 compute-1 nova_compute[182713]: 2026-01-22 00:40:07.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:08 compute-1 podman[242703]: 2026-01-22 00:40:08.620794225 +0000 UTC m=+0.109892026 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:40:10 compute-1 podman[242723]: 2026-01-22 00:40:10.552761626 +0000 UTC m=+0.052115816 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 00:40:10 compute-1 nova_compute[182713]: 2026-01-22 00:40:10.750 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:12 compute-1 nova_compute[182713]: 2026-01-22 00:40:12.100 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:12 compute-1 nova_compute[182713]: 2026-01-22 00:40:12.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:13 compute-1 nova_compute[182713]: 2026-01-22 00:40:13.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:13 compute-1 nova_compute[182713]: 2026-01-22 00:40:13.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:13 compute-1 nova_compute[182713]: 2026-01-22 00:40:13.906 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:13 compute-1 nova_compute[182713]: 2026-01-22 00:40:13.907 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:13 compute-1 nova_compute[182713]: 2026-01-22 00:40:13.907 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:13 compute-1 nova_compute[182713]: 2026-01-22 00:40:13.907 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.108 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.109 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5725MB free_disk=73.19327163696289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.109 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.110 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.268 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.269 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.323 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.398 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.399 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.416 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.436 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.457 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.481 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.483 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:40:14 compute-1 nova_compute[182713]: 2026-01-22 00:40:14.483 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:15 compute-1 nova_compute[182713]: 2026-01-22 00:40:15.483 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:15 compute-1 nova_compute[182713]: 2026-01-22 00:40:15.751 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:16 compute-1 nova_compute[182713]: 2026-01-22 00:40:16.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:16 compute-1 nova_compute[182713]: 2026-01-22 00:40:16.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:40:16 compute-1 nova_compute[182713]: 2026-01-22 00:40:16.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:40:16 compute-1 nova_compute[182713]: 2026-01-22 00:40:16.876 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:40:17 compute-1 nova_compute[182713]: 2026-01-22 00:40:17.103 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:20 compute-1 podman[242745]: 2026-01-22 00:40:20.617051774 +0000 UTC m=+0.095685325 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:40:20 compute-1 podman[242744]: 2026-01-22 00:40:20.648779257 +0000 UTC m=+0.126980636 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 00:40:20 compute-1 nova_compute[182713]: 2026-01-22 00:40:20.753 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:22 compute-1 nova_compute[182713]: 2026-01-22 00:40:22.105 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.895 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.896 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.896 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.896 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.896 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.896 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.896 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.897 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.897 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.897 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.897 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.897 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.897 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.898 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.898 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.898 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.898 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.898 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.898 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.898 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:40:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:40:25 compute-1 nova_compute[182713]: 2026-01-22 00:40:25.756 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:26 compute-1 podman[242794]: 2026-01-22 00:40:26.598111924 +0000 UTC m=+0.091210588 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 00:40:26 compute-1 podman[242795]: 2026-01-22 00:40:26.604028977 +0000 UTC m=+0.085413318 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:40:27 compute-1 nova_compute[182713]: 2026-01-22 00:40:27.108 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.145 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.145 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.161 182717 DEBUG nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.318 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.319 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.327 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.327 182717 INFO nova.compute.claims [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.441 182717 DEBUG nova.compute.provider_tree [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.456 182717 DEBUG nova.scheduler.client.report [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.481 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.482 182717 DEBUG nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.549 182717 DEBUG nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.549 182717 DEBUG nova.network.neutron [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.576 182717 INFO nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.595 182717 DEBUG nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.720 182717 DEBUG nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.722 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.722 182717 INFO nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Creating image(s)
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.723 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.723 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.724 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.742 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.804 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.805 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.806 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.825 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.892 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.893 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.927 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.928 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.929 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.993 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.994 182717 DEBUG nova.virt.disk.api [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:40:28 compute-1 nova_compute[182713]: 2026-01-22 00:40:28.995 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:40:29 compute-1 nova_compute[182713]: 2026-01-22 00:40:29.050 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:40:29 compute-1 nova_compute[182713]: 2026-01-22 00:40:29.052 182717 DEBUG nova.virt.disk.api [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:40:29 compute-1 nova_compute[182713]: 2026-01-22 00:40:29.052 182717 DEBUG nova.objects.instance [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid af1fd418-ee94-4203-923f-6b4fd78c2b96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:40:29 compute-1 nova_compute[182713]: 2026-01-22 00:40:29.064 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:40:29 compute-1 nova_compute[182713]: 2026-01-22 00:40:29.064 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Ensure instance console log exists: /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:40:29 compute-1 nova_compute[182713]: 2026-01-22 00:40:29.065 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:29 compute-1 nova_compute[182713]: 2026-01-22 00:40:29.065 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:29 compute-1 nova_compute[182713]: 2026-01-22 00:40:29.066 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:29 compute-1 nova_compute[182713]: 2026-01-22 00:40:29.242 182717 DEBUG nova.policy [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:40:30 compute-1 nova_compute[182713]: 2026-01-22 00:40:30.368 182717 DEBUG nova.network.neutron [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Successfully created port: c1a311b0-55fd-43c5-9c26-1ce77ebf5300 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:40:30 compute-1 nova_compute[182713]: 2026-01-22 00:40:30.758 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:31 compute-1 nova_compute[182713]: 2026-01-22 00:40:31.264 182717 DEBUG nova.network.neutron [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Successfully created port: c45a4550-8234-4bc5-b809-77d90eb7f8fd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:40:32 compute-1 nova_compute[182713]: 2026-01-22 00:40:32.111 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:32 compute-1 nova_compute[182713]: 2026-01-22 00:40:32.401 182717 DEBUG nova.network.neutron [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Successfully updated port: c1a311b0-55fd-43c5-9c26-1ce77ebf5300 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:40:32 compute-1 nova_compute[182713]: 2026-01-22 00:40:32.525 182717 DEBUG nova.compute.manager [req-905411da-3e16-4026-9593-cb17efd94d47 req-e634626b-36e6-4e91-91bd-c40adc995ad2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-changed-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:40:32 compute-1 nova_compute[182713]: 2026-01-22 00:40:32.526 182717 DEBUG nova.compute.manager [req-905411da-3e16-4026-9593-cb17efd94d47 req-e634626b-36e6-4e91-91bd-c40adc995ad2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Refreshing instance network info cache due to event network-changed-c1a311b0-55fd-43c5-9c26-1ce77ebf5300. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:40:32 compute-1 nova_compute[182713]: 2026-01-22 00:40:32.526 182717 DEBUG oslo_concurrency.lockutils [req-905411da-3e16-4026-9593-cb17efd94d47 req-e634626b-36e6-4e91-91bd-c40adc995ad2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:40:32 compute-1 nova_compute[182713]: 2026-01-22 00:40:32.526 182717 DEBUG oslo_concurrency.lockutils [req-905411da-3e16-4026-9593-cb17efd94d47 req-e634626b-36e6-4e91-91bd-c40adc995ad2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:40:32 compute-1 nova_compute[182713]: 2026-01-22 00:40:32.527 182717 DEBUG nova.network.neutron [req-905411da-3e16-4026-9593-cb17efd94d47 req-e634626b-36e6-4e91-91bd-c40adc995ad2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Refreshing network info cache for port c1a311b0-55fd-43c5-9c26-1ce77ebf5300 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:40:32 compute-1 nova_compute[182713]: 2026-01-22 00:40:32.681 182717 DEBUG nova.network.neutron [req-905411da-3e16-4026-9593-cb17efd94d47 req-e634626b-36e6-4e91-91bd-c40adc995ad2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:40:33 compute-1 nova_compute[182713]: 2026-01-22 00:40:33.066 182717 DEBUG nova.network.neutron [req-905411da-3e16-4026-9593-cb17efd94d47 req-e634626b-36e6-4e91-91bd-c40adc995ad2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:40:33 compute-1 nova_compute[182713]: 2026-01-22 00:40:33.082 182717 DEBUG oslo_concurrency.lockutils [req-905411da-3e16-4026-9593-cb17efd94d47 req-e634626b-36e6-4e91-91bd-c40adc995ad2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:40:33 compute-1 nova_compute[182713]: 2026-01-22 00:40:33.108 182717 DEBUG nova.network.neutron [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Successfully updated port: c45a4550-8234-4bc5-b809-77d90eb7f8fd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:40:33 compute-1 nova_compute[182713]: 2026-01-22 00:40:33.128 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:40:33 compute-1 nova_compute[182713]: 2026-01-22 00:40:33.128 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:40:33 compute-1 nova_compute[182713]: 2026-01-22 00:40:33.129 182717 DEBUG nova.network.neutron [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:40:33 compute-1 nova_compute[182713]: 2026-01-22 00:40:33.280 182717 DEBUG nova.network.neutron [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:40:34 compute-1 nova_compute[182713]: 2026-01-22 00:40:34.696 182717 DEBUG nova.compute.manager [req-2faf7232-40a9-44b1-9140-a24f415d4b12 req-403d7493-6ac2-4d85-9f9e-50f9f91be129 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-changed-c45a4550-8234-4bc5-b809-77d90eb7f8fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:40:34 compute-1 nova_compute[182713]: 2026-01-22 00:40:34.696 182717 DEBUG nova.compute.manager [req-2faf7232-40a9-44b1-9140-a24f415d4b12 req-403d7493-6ac2-4d85-9f9e-50f9f91be129 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Refreshing instance network info cache due to event network-changed-c45a4550-8234-4bc5-b809-77d90eb7f8fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:40:34 compute-1 nova_compute[182713]: 2026-01-22 00:40:34.696 182717 DEBUG oslo_concurrency.lockutils [req-2faf7232-40a9-44b1-9140-a24f415d4b12 req-403d7493-6ac2-4d85-9f9e-50f9f91be129 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.324 182717 DEBUG nova.network.neutron [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Updating instance_info_cache with network_info: [{"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.352 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.353 182717 DEBUG nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Instance network_info: |[{"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.353 182717 DEBUG oslo_concurrency.lockutils [req-2faf7232-40a9-44b1-9140-a24f415d4b12 req-403d7493-6ac2-4d85-9f9e-50f9f91be129 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.354 182717 DEBUG nova.network.neutron [req-2faf7232-40a9-44b1-9140-a24f415d4b12 req-403d7493-6ac2-4d85-9f9e-50f9f91be129 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Refreshing network info cache for port c45a4550-8234-4bc5-b809-77d90eb7f8fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.358 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Start _get_guest_xml network_info=[{"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.365 182717 WARNING nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.370 182717 DEBUG nova.virt.libvirt.host [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.370 182717 DEBUG nova.virt.libvirt.host [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.377 182717 DEBUG nova.virt.libvirt.host [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.378 182717 DEBUG nova.virt.libvirt.host [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.380 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.380 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.381 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.381 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.381 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.382 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.382 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.382 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.382 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.383 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.383 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.383 182717 DEBUG nova.virt.hardware [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.389 182717 DEBUG nova.virt.libvirt.vif [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:40:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2036429228',display_name='tempest-TestGettingAddress-server-2036429228',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2036429228',id=178,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jwgbkrts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:40:28Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=af1fd418-ee94-4203-923f-6b4fd78c2b96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.389 182717 DEBUG nova.network.os_vif_util [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.390 182717 DEBUG nova.network.os_vif_util [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:06:2d,bridge_name='br-int',has_traffic_filtering=True,id=c1a311b0-55fd-43c5-9c26-1ce77ebf5300,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1a311b0-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.391 182717 DEBUG nova.virt.libvirt.vif [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:40:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2036429228',display_name='tempest-TestGettingAddress-server-2036429228',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2036429228',id=178,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jwgbkrts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:40:28Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=af1fd418-ee94-4203-923f-6b4fd78c2b96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.391 182717 DEBUG nova.network.os_vif_util [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.392 182717 DEBUG nova.network.os_vif_util [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:21:ba,bridge_name='br-int',has_traffic_filtering=True,id=c45a4550-8234-4bc5-b809-77d90eb7f8fd,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc45a4550-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.393 182717 DEBUG nova.objects.instance [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid af1fd418-ee94-4203-923f-6b4fd78c2b96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.408 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:40:35 compute-1 nova_compute[182713]:   <uuid>af1fd418-ee94-4203-923f-6b4fd78c2b96</uuid>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   <name>instance-000000b2</name>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <nova:name>tempest-TestGettingAddress-server-2036429228</nova:name>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:40:35</nova:creationTime>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:40:35 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:40:35 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:40:35 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:40:35 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:40:35 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:40:35 compute-1 nova_compute[182713]:         <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 22 00:40:35 compute-1 nova_compute[182713]:         <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:40:35 compute-1 nova_compute[182713]:         <nova:port uuid="c1a311b0-55fd-43c5-9c26-1ce77ebf5300">
Jan 22 00:40:35 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:40:35 compute-1 nova_compute[182713]:         <nova:port uuid="c45a4550-8234-4bc5-b809-77d90eb7f8fd">
Jan 22 00:40:35 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:feb5:21ba" ipVersion="6"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <system>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <entry name="serial">af1fd418-ee94-4203-923f-6b4fd78c2b96</entry>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <entry name="uuid">af1fd418-ee94-4203-923f-6b4fd78c2b96</entry>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     </system>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   <os>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   </os>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   <features>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   </features>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk.config"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:c0:06:2d"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <target dev="tapc1a311b0-55"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:b5:21:ba"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <target dev="tapc45a4550-82"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/console.log" append="off"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <video>
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     </video>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:40:35 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:40:35 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:40:35 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:40:35 compute-1 nova_compute[182713]: </domain>
Jan 22 00:40:35 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.410 182717 DEBUG nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Preparing to wait for external event network-vif-plugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.411 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.411 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.411 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.412 182717 DEBUG nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Preparing to wait for external event network-vif-plugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.412 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.412 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.413 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.414 182717 DEBUG nova.virt.libvirt.vif [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:40:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2036429228',display_name='tempest-TestGettingAddress-server-2036429228',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2036429228',id=178,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jwgbkrts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:40:28Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=af1fd418-ee94-4203-923f-6b4fd78c2b96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.414 182717 DEBUG nova.network.os_vif_util [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.415 182717 DEBUG nova.network.os_vif_util [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:06:2d,bridge_name='br-int',has_traffic_filtering=True,id=c1a311b0-55fd-43c5-9c26-1ce77ebf5300,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1a311b0-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.416 182717 DEBUG os_vif [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:06:2d,bridge_name='br-int',has_traffic_filtering=True,id=c1a311b0-55fd-43c5-9c26-1ce77ebf5300,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1a311b0-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.417 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.418 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.418 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.424 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.424 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1a311b0-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.425 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc1a311b0-55, col_values=(('external_ids', {'iface-id': 'c1a311b0-55fd-43c5-9c26-1ce77ebf5300', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:06:2d', 'vm-uuid': 'af1fd418-ee94-4203-923f-6b4fd78c2b96'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.426 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:35 compute-1 NetworkManager[54952]: <info>  [1769042435.4275] manager: (tapc1a311b0-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/350)
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.428 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.433 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.433 182717 INFO os_vif [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:06:2d,bridge_name='br-int',has_traffic_filtering=True,id=c1a311b0-55fd-43c5-9c26-1ce77ebf5300,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1a311b0-55')
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.434 182717 DEBUG nova.virt.libvirt.vif [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:40:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2036429228',display_name='tempest-TestGettingAddress-server-2036429228',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2036429228',id=178,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jwgbkrts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:40:28Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=af1fd418-ee94-4203-923f-6b4fd78c2b96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.434 182717 DEBUG nova.network.os_vif_util [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.435 182717 DEBUG nova.network.os_vif_util [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:21:ba,bridge_name='br-int',has_traffic_filtering=True,id=c45a4550-8234-4bc5-b809-77d90eb7f8fd,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc45a4550-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.435 182717 DEBUG os_vif [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:21:ba,bridge_name='br-int',has_traffic_filtering=True,id=c45a4550-8234-4bc5-b809-77d90eb7f8fd,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc45a4550-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.435 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.436 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.436 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.438 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.438 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc45a4550-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.438 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc45a4550-82, col_values=(('external_ids', {'iface-id': 'c45a4550-8234-4bc5-b809-77d90eb7f8fd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b5:21:ba', 'vm-uuid': 'af1fd418-ee94-4203-923f-6b4fd78c2b96'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:35 compute-1 NetworkManager[54952]: <info>  [1769042435.4401] manager: (tapc45a4550-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.439 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.441 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.446 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.447 182717 INFO os_vif [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:21:ba,bridge_name='br-int',has_traffic_filtering=True,id=c45a4550-8234-4bc5-b809-77d90eb7f8fd,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc45a4550-82')
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.508 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.509 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.509 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:c0:06:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.509 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:b5:21:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.510 182717 INFO nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Using config drive
Jan 22 00:40:35 compute-1 nova_compute[182713]: 2026-01-22 00:40:35.761 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.203 182717 INFO nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Creating config drive at /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk.config
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.208 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgpatcaqx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.340 182717 DEBUG oslo_concurrency.processutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgpatcaqx" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:40:36 compute-1 kernel: tapc1a311b0-55: entered promiscuous mode
Jan 22 00:40:36 compute-1 NetworkManager[54952]: <info>  [1769042436.4361] manager: (tapc1a311b0-55): new Tun device (/org/freedesktop/NetworkManager/Devices/352)
Jan 22 00:40:36 compute-1 ovn_controller[94841]: 2026-01-22T00:40:36Z|00733|binding|INFO|Claiming lport c1a311b0-55fd-43c5-9c26-1ce77ebf5300 for this chassis.
Jan 22 00:40:36 compute-1 ovn_controller[94841]: 2026-01-22T00:40:36Z|00734|binding|INFO|c1a311b0-55fd-43c5-9c26-1ce77ebf5300: Claiming fa:16:3e:c0:06:2d 10.100.0.12
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.440 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:36 compute-1 NetworkManager[54952]: <info>  [1769042436.4567] manager: (tapc45a4550-82): new Tun device (/org/freedesktop/NetworkManager/Devices/353)
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.457 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:36 compute-1 NetworkManager[54952]: <info>  [1769042436.4594] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 22 00:40:36 compute-1 NetworkManager[54952]: <info>  [1769042436.4613] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/355)
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.464 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:06:2d 10.100.0.12'], port_security=['fa:16:3e:c0:06:2d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'af1fd418-ee94-4203-923f-6b4fd78c2b96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d60c1e89-37d5-4a05-b566-04735ac9e501', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0b1cab7e-3d92-45e3-88fe-6266af987fc5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e88081bc-c33e-4f29-8ba8-cdfb76dc2a31, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=c1a311b0-55fd-43c5-9c26-1ce77ebf5300) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.465 104184 INFO neutron.agent.ovn.metadata.agent [-] Port c1a311b0-55fd-43c5-9c26-1ce77ebf5300 in datapath d60c1e89-37d5-4a05-b566-04735ac9e501 bound to our chassis
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.467 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d60c1e89-37d5-4a05-b566-04735ac9e501
Jan 22 00:40:36 compute-1 systemd-udevd[242879]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:40:36 compute-1 systemd-udevd[242878]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.486 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6e12ad-c26d-40e3-a537-da3f4394ff51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.487 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd60c1e89-31 in ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.490 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd60c1e89-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.490 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[73374a95-4099-432e-8d53-ae433bdbd986]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.491 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6c33991e-76d3-41b0-bfc8-8b7b5b8601d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 NetworkManager[54952]: <info>  [1769042436.5021] device (tapc1a311b0-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:40:36 compute-1 NetworkManager[54952]: <info>  [1769042436.5029] device (tapc1a311b0-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.508 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[64decff0-e70a-402f-a178-484f94fe3f2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 systemd-machined[153970]: New machine qemu-77-instance-000000b2.
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.543 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[76e41572-d408-42c6-97d2-d12849732e60]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 kernel: tapc45a4550-82: entered promiscuous mode
Jan 22 00:40:36 compute-1 NetworkManager[54952]: <info>  [1769042436.5727] device (tapc45a4550-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:40:36 compute-1 NetworkManager[54952]: <info>  [1769042436.5745] device (tapc45a4550-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.576 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:36 compute-1 systemd[1]: Started Virtual Machine qemu-77-instance-000000b2.
Jan 22 00:40:36 compute-1 ovn_controller[94841]: 2026-01-22T00:40:36Z|00735|binding|INFO|Claiming lport c45a4550-8234-4bc5-b809-77d90eb7f8fd for this chassis.
Jan 22 00:40:36 compute-1 ovn_controller[94841]: 2026-01-22T00:40:36Z|00736|binding|INFO|c45a4550-8234-4bc5-b809-77d90eb7f8fd: Claiming fa:16:3e:b5:21:ba 2001:db8::f816:3eff:feb5:21ba
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.591 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[95de1020-5fd5-4f05-90cb-1b9c89e65b85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_controller[94841]: 2026-01-22T00:40:36Z|00737|binding|INFO|Setting lport c1a311b0-55fd-43c5-9c26-1ce77ebf5300 ovn-installed in OVS
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.594 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:36 compute-1 ovn_controller[94841]: 2026-01-22T00:40:36Z|00738|binding|INFO|Setting lport c1a311b0-55fd-43c5-9c26-1ce77ebf5300 up in Southbound
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.602 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:21:ba 2001:db8::f816:3eff:feb5:21ba'], port_security=['fa:16:3e:b5:21:ba 2001:db8::f816:3eff:feb5:21ba'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb5:21ba/64', 'neutron:device_id': 'af1fd418-ee94-4203-923f-6b4fd78c2b96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0b1cab7e-3d92-45e3-88fe-6266af987fc5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29e7c9c6-d993-479c-815c-b28e8e044cd0, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=c45a4550-8234-4bc5-b809-77d90eb7f8fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.614 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[15ec4d9c-884b-4595-8211-1374a5f759d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_controller[94841]: 2026-01-22T00:40:36Z|00739|binding|INFO|Setting lport c45a4550-8234-4bc5-b809-77d90eb7f8fd ovn-installed in OVS
Jan 22 00:40:36 compute-1 ovn_controller[94841]: 2026-01-22T00:40:36Z|00740|binding|INFO|Setting lport c45a4550-8234-4bc5-b809-77d90eb7f8fd up in Southbound
Jan 22 00:40:36 compute-1 NetworkManager[54952]: <info>  [1769042436.6157] manager: (tapd60c1e89-30): new Veth device (/org/freedesktop/NetworkManager/Devices/356)
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.616 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.650 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[562bba3b-3b83-4a30-b609-219749bb9b58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.653 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[5c6476c2-3161-4923-8903-e8bc858662cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 NetworkManager[54952]: <info>  [1769042436.6739] device (tapd60c1e89-30): carrier: link connected
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.678 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[8715ec86-8737-4dcf-99ca-f272c1edc976]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.700 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[69a11003-a637-4e4f-aa56-14d1c0311e40]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd60c1e89-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:c3:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700932, 'reachable_time': 19595, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242913, 'error': None, 'target': 'ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.723 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[aba4c736-16d9-4538-9896-83eb05b8bbec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef1:c38b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 700932, 'tstamp': 700932}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242915, 'error': None, 'target': 'ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.744 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3c3e2620-3cae-4e5a-bc6d-51317335ef78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd60c1e89-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:c3:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700932, 'reachable_time': 19595, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242916, 'error': None, 'target': 'ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.785 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8993170e-58cf-43fd-a7c5-2e93128910dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.860 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[891baf4d-46b9-4fda-a6b3-d3c117a69740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.861 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd60c1e89-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.861 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.862 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd60c1e89-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:36 compute-1 NetworkManager[54952]: <info>  [1769042436.8647] manager: (tapd60c1e89-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/357)
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.864 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:36 compute-1 kernel: tapd60c1e89-30: entered promiscuous mode
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.867 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd60c1e89-30, col_values=(('external_ids', {'iface-id': 'e129da0f-abdd-47af-b02c-0b124db30d95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:36 compute-1 ovn_controller[94841]: 2026-01-22T00:40:36Z|00741|binding|INFO|Releasing lport e129da0f-abdd-47af-b02c-0b124db30d95 from this chassis (sb_readonly=0)
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.881 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d60c1e89-37d5-4a05-b566-04735ac9e501.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d60c1e89-37d5-4a05-b566-04735ac9e501.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.882 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0924913b-32ba-487a-a909-4877a903a6fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.883 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-d60c1e89-37d5-4a05-b566-04735ac9e501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/d60c1e89-37d5-4a05-b566-04735ac9e501.pid.haproxy
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID d60c1e89-37d5-4a05-b566-04735ac9e501
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:40:36 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:36.884 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501', 'env', 'PROCESS_TAG=haproxy-d60c1e89-37d5-4a05-b566-04735ac9e501', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d60c1e89-37d5-4a05-b566-04735ac9e501.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.884 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.912 182717 DEBUG nova.compute.manager [req-e51c1fe1-1896-4bc4-98ed-cec9de277cc6 req-3576a8ae-0b13-46d1-ac3d-5607b3988842 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-plugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.912 182717 DEBUG oslo_concurrency.lockutils [req-e51c1fe1-1896-4bc4-98ed-cec9de277cc6 req-3576a8ae-0b13-46d1-ac3d-5607b3988842 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.913 182717 DEBUG oslo_concurrency.lockutils [req-e51c1fe1-1896-4bc4-98ed-cec9de277cc6 req-3576a8ae-0b13-46d1-ac3d-5607b3988842 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.913 182717 DEBUG oslo_concurrency.lockutils [req-e51c1fe1-1896-4bc4-98ed-cec9de277cc6 req-3576a8ae-0b13-46d1-ac3d-5607b3988842 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.913 182717 DEBUG nova.compute.manager [req-e51c1fe1-1896-4bc4-98ed-cec9de277cc6 req-3576a8ae-0b13-46d1-ac3d-5607b3988842 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Processing event network-vif-plugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.972 182717 DEBUG nova.compute.manager [req-3dad0ba8-e6b7-4958-bf3c-64a043778181 req-82431f45-6a7b-44fe-98a5-7eaa176ae9e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-plugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.972 182717 DEBUG oslo_concurrency.lockutils [req-3dad0ba8-e6b7-4958-bf3c-64a043778181 req-82431f45-6a7b-44fe-98a5-7eaa176ae9e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.973 182717 DEBUG oslo_concurrency.lockutils [req-3dad0ba8-e6b7-4958-bf3c-64a043778181 req-82431f45-6a7b-44fe-98a5-7eaa176ae9e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.973 182717 DEBUG oslo_concurrency.lockutils [req-3dad0ba8-e6b7-4958-bf3c-64a043778181 req-82431f45-6a7b-44fe-98a5-7eaa176ae9e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:36 compute-1 nova_compute[182713]: 2026-01-22 00:40:36.973 182717 DEBUG nova.compute.manager [req-3dad0ba8-e6b7-4958-bf3c-64a043778181 req-82431f45-6a7b-44fe-98a5-7eaa176ae9e7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Processing event network-vif-plugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.198 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042437.1971145, af1fd418-ee94-4203-923f-6b4fd78c2b96 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.198 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] VM Started (Lifecycle Event)
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.202 182717 DEBUG nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.206 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.214 182717 INFO nova.virt.libvirt.driver [-] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Instance spawned successfully.
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.215 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.223 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.227 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.245 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.246 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.246 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.246 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.247 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.247 182717 DEBUG nova.virt.libvirt.driver [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.262 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.263 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042437.1973913, af1fd418-ee94-4203-923f-6b4fd78c2b96 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.263 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] VM Paused (Lifecycle Event)
Jan 22 00:40:37 compute-1 podman[242956]: 2026-01-22 00:40:37.275073342 +0000 UTC m=+0.054035186 container create 2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.291 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.299 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042437.2054396, af1fd418-ee94-4203-923f-6b4fd78c2b96 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.299 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] VM Resumed (Lifecycle Event)
Jan 22 00:40:37 compute-1 systemd[1]: Started libpod-conmon-2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6.scope.
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.329 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.332 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:40:37 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:40:37 compute-1 podman[242956]: 2026-01-22 00:40:37.247385454 +0000 UTC m=+0.026347348 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:40:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e1065a3805b46f1a89d437fc43715dee206b5d941817448332e92ea06fc639c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.360 182717 INFO nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Took 8.64 seconds to spawn the instance on the hypervisor.
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.361 182717 DEBUG nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:40:37 compute-1 podman[242956]: 2026-01-22 00:40:37.361569032 +0000 UTC m=+0.140530906 container init 2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.362 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:40:37 compute-1 podman[242956]: 2026-01-22 00:40:37.36890224 +0000 UTC m=+0.147864074 container start 2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:40:37 compute-1 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[242972]: [NOTICE]   (242976) : New worker (242978) forked
Jan 22 00:40:37 compute-1 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[242972]: [NOTICE]   (242976) : Loading success.
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.433 104184 INFO neutron.agent.ovn.metadata.agent [-] Port c45a4550-8234-4bc5-b809-77d90eb7f8fd in datapath 65bd5007-25fc-43be-bec0-20ff1d1f0a79 unbound from our chassis
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.435 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 65bd5007-25fc-43be-bec0-20ff1d1f0a79
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.446 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[349479d5-607d-461a-b660-24161875b87a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.447 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap65bd5007-21 in ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.448 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap65bd5007-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.449 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4d7790-19d3-4c83-b11f-424f30697e16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.454 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2f664164-7dfd-4cab-8a24-dbe8f859e7a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.460 182717 INFO nova.compute.manager [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Took 9.22 seconds to build instance.
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.465 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[61da4936-6790-4acc-8973-df2497cb5c6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.477 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[558d89e0-ba19-4a87-bce2-8e425ba52dae]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.488 182717 DEBUG oslo_concurrency.lockutils [None req-7517a60b-77f2-4a0d-9ee6-aab60dc5424f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.505 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e60f33-d104-48f9-88aa-4dc1712af856]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 NetworkManager[54952]: <info>  [1769042437.5129] manager: (tap65bd5007-20): new Veth device (/org/freedesktop/NetworkManager/Devices/358)
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.513 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[23d897cc-5622-4d97-a16d-4934d251df9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.547 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9bc3c7-616a-4d53-ad70-f3b50840d63a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.551 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[0d263119-dd3e-4433-8722-5b373e2f40b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 NetworkManager[54952]: <info>  [1769042437.5772] device (tap65bd5007-20): carrier: link connected
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.584 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[8079bbf5-bc6f-48af-b707-ad53c35f3e8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.613 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a2629ce7-628e-4e3d-9f0c-0a13d5d99da0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65bd5007-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:88:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 701022, 'reachable_time': 33983, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242997, 'error': None, 'target': 'ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.632 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[81cefa8e-d27f-4915-88a3-8b962e898360]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe05:88ac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 701022, 'tstamp': 701022}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242998, 'error': None, 'target': 'ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.651 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ea29cd00-2317-4f52-8e46-e5177789eb01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65bd5007-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:88:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 701022, 'reachable_time': 33983, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242999, 'error': None, 'target': 'ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.684 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a31fa672-c8c9-4ba2-b375-5210e9c223d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.719 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6a5249e9-d59b-4baf-9e29-535157a6ffb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.721 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65bd5007-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.721 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.721 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65bd5007-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:37 compute-1 NetworkManager[54952]: <info>  [1769042437.7241] manager: (tap65bd5007-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/359)
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.723 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:37 compute-1 kernel: tap65bd5007-20: entered promiscuous mode
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.727 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap65bd5007-20, col_values=(('external_ids', {'iface-id': 'e79ede92-32a1-4758-b6f9-877484a6ca25'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:37 compute-1 ovn_controller[94841]: 2026-01-22T00:40:37Z|00742|binding|INFO|Releasing lport e79ede92-32a1-4758-b6f9-877484a6ca25 from this chassis (sb_readonly=0)
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.728 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.742 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.744 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/65bd5007-25fc-43be-bec0-20ff1d1f0a79.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/65bd5007-25fc-43be-bec0-20ff1d1f0a79.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.745 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7a019a9f-1867-4ad0-bae9-fd3ebfb28f69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.746 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-65bd5007-25fc-43be-bec0-20ff1d1f0a79
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/65bd5007-25fc-43be-bec0-20ff1d1f0a79.pid.haproxy
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 65bd5007-25fc-43be-bec0-20ff1d1f0a79
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:40:37 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:40:37.747 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'env', 'PROCESS_TAG=haproxy-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/65bd5007-25fc-43be-bec0-20ff1d1f0a79.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.829 182717 DEBUG nova.network.neutron [req-2faf7232-40a9-44b1-9140-a24f415d4b12 req-403d7493-6ac2-4d85-9f9e-50f9f91be129 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Updated VIF entry in instance network info cache for port c45a4550-8234-4bc5-b809-77d90eb7f8fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.829 182717 DEBUG nova.network.neutron [req-2faf7232-40a9-44b1-9140-a24f415d4b12 req-403d7493-6ac2-4d85-9f9e-50f9f91be129 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Updating instance_info_cache with network_info: [{"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:40:37 compute-1 nova_compute[182713]: 2026-01-22 00:40:37.851 182717 DEBUG oslo_concurrency.lockutils [req-2faf7232-40a9-44b1-9140-a24f415d4b12 req-403d7493-6ac2-4d85-9f9e-50f9f91be129 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:40:38 compute-1 podman[243026]: 2026-01-22 00:40:38.179089984 +0000 UTC m=+0.064014785 container create 535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 00:40:38 compute-1 systemd[1]: Started libpod-conmon-535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923.scope.
Jan 22 00:40:38 compute-1 podman[243026]: 2026-01-22 00:40:38.143442049 +0000 UTC m=+0.028366880 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:40:38 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:40:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cab878a80edf36ecea7014aa1f6777c4ae43906a2c646f58ad7446e748ca845/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:40:38 compute-1 podman[243026]: 2026-01-22 00:40:38.268225235 +0000 UTC m=+0.153150026 container init 535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:40:38 compute-1 podman[243026]: 2026-01-22 00:40:38.274240102 +0000 UTC m=+0.159164873 container start 535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:40:38 compute-1 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[243041]: [NOTICE]   (243045) : New worker (243047) forked
Jan 22 00:40:38 compute-1 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[243041]: [NOTICE]   (243045) : Loading success.
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.020 182717 DEBUG nova.compute.manager [req-d84ed05a-9c48-4708-a391-82b3f8237145 req-44a96511-e5b4-43ea-abc0-ea0947c25978 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-plugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.021 182717 DEBUG oslo_concurrency.lockutils [req-d84ed05a-9c48-4708-a391-82b3f8237145 req-44a96511-e5b4-43ea-abc0-ea0947c25978 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.021 182717 DEBUG oslo_concurrency.lockutils [req-d84ed05a-9c48-4708-a391-82b3f8237145 req-44a96511-e5b4-43ea-abc0-ea0947c25978 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.021 182717 DEBUG oslo_concurrency.lockutils [req-d84ed05a-9c48-4708-a391-82b3f8237145 req-44a96511-e5b4-43ea-abc0-ea0947c25978 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.021 182717 DEBUG nova.compute.manager [req-d84ed05a-9c48-4708-a391-82b3f8237145 req-44a96511-e5b4-43ea-abc0-ea0947c25978 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] No waiting events found dispatching network-vif-plugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.021 182717 WARNING nova.compute.manager [req-d84ed05a-9c48-4708-a391-82b3f8237145 req-44a96511-e5b4-43ea-abc0-ea0947c25978 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received unexpected event network-vif-plugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 for instance with vm_state active and task_state None.
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.077 182717 DEBUG nova.compute.manager [req-a758f633-5add-4c55-9e86-e881a1ffc2e9 req-87b7dd6d-a2b6-4bce-9875-d28f09a075ca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-plugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.078 182717 DEBUG oslo_concurrency.lockutils [req-a758f633-5add-4c55-9e86-e881a1ffc2e9 req-87b7dd6d-a2b6-4bce-9875-d28f09a075ca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.078 182717 DEBUG oslo_concurrency.lockutils [req-a758f633-5add-4c55-9e86-e881a1ffc2e9 req-87b7dd6d-a2b6-4bce-9875-d28f09a075ca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.078 182717 DEBUG oslo_concurrency.lockutils [req-a758f633-5add-4c55-9e86-e881a1ffc2e9 req-87b7dd6d-a2b6-4bce-9875-d28f09a075ca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.078 182717 DEBUG nova.compute.manager [req-a758f633-5add-4c55-9e86-e881a1ffc2e9 req-87b7dd6d-a2b6-4bce-9875-d28f09a075ca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] No waiting events found dispatching network-vif-plugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:40:39 compute-1 nova_compute[182713]: 2026-01-22 00:40:39.078 182717 WARNING nova.compute.manager [req-a758f633-5add-4c55-9e86-e881a1ffc2e9 req-87b7dd6d-a2b6-4bce-9875-d28f09a075ca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received unexpected event network-vif-plugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd for instance with vm_state active and task_state None.
Jan 22 00:40:39 compute-1 podman[243056]: 2026-01-22 00:40:39.589425414 +0000 UTC m=+0.075524821 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 00:40:40 compute-1 nova_compute[182713]: 2026-01-22 00:40:40.441 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:40 compute-1 nova_compute[182713]: 2026-01-22 00:40:40.764 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:41 compute-1 podman[243076]: 2026-01-22 00:40:41.582382589 +0000 UTC m=+0.071779085 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 00:40:42 compute-1 nova_compute[182713]: 2026-01-22 00:40:42.062 182717 DEBUG nova.compute.manager [req-3f27d4a3-2c2a-4dee-83ea-6a8eab668a2b req-fefc317f-c8f5-42fe-ab62-e4f64ce83fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-changed-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:40:42 compute-1 nova_compute[182713]: 2026-01-22 00:40:42.063 182717 DEBUG nova.compute.manager [req-3f27d4a3-2c2a-4dee-83ea-6a8eab668a2b req-fefc317f-c8f5-42fe-ab62-e4f64ce83fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Refreshing instance network info cache due to event network-changed-c1a311b0-55fd-43c5-9c26-1ce77ebf5300. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:40:42 compute-1 nova_compute[182713]: 2026-01-22 00:40:42.063 182717 DEBUG oslo_concurrency.lockutils [req-3f27d4a3-2c2a-4dee-83ea-6a8eab668a2b req-fefc317f-c8f5-42fe-ab62-e4f64ce83fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:40:42 compute-1 nova_compute[182713]: 2026-01-22 00:40:42.063 182717 DEBUG oslo_concurrency.lockutils [req-3f27d4a3-2c2a-4dee-83ea-6a8eab668a2b req-fefc317f-c8f5-42fe-ab62-e4f64ce83fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:40:42 compute-1 nova_compute[182713]: 2026-01-22 00:40:42.064 182717 DEBUG nova.network.neutron [req-3f27d4a3-2c2a-4dee-83ea-6a8eab668a2b req-fefc317f-c8f5-42fe-ab62-e4f64ce83fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Refreshing network info cache for port c1a311b0-55fd-43c5-9c26-1ce77ebf5300 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:40:45 compute-1 nova_compute[182713]: 2026-01-22 00:40:45.445 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:45 compute-1 nova_compute[182713]: 2026-01-22 00:40:45.766 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:46 compute-1 nova_compute[182713]: 2026-01-22 00:40:46.171 182717 DEBUG nova.network.neutron [req-3f27d4a3-2c2a-4dee-83ea-6a8eab668a2b req-fefc317f-c8f5-42fe-ab62-e4f64ce83fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Updated VIF entry in instance network info cache for port c1a311b0-55fd-43c5-9c26-1ce77ebf5300. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:40:46 compute-1 nova_compute[182713]: 2026-01-22 00:40:46.172 182717 DEBUG nova.network.neutron [req-3f27d4a3-2c2a-4dee-83ea-6a8eab668a2b req-fefc317f-c8f5-42fe-ab62-e4f64ce83fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Updating instance_info_cache with network_info: [{"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:40:46 compute-1 nova_compute[182713]: 2026-01-22 00:40:46.201 182717 DEBUG oslo_concurrency.lockutils [req-3f27d4a3-2c2a-4dee-83ea-6a8eab668a2b req-fefc317f-c8f5-42fe-ab62-e4f64ce83fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:40:50 compute-1 nova_compute[182713]: 2026-01-22 00:40:50.447 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:50 compute-1 ovn_controller[94841]: 2026-01-22T00:40:50Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c0:06:2d 10.100.0.12
Jan 22 00:40:50 compute-1 ovn_controller[94841]: 2026-01-22T00:40:50Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c0:06:2d 10.100.0.12
Jan 22 00:40:50 compute-1 nova_compute[182713]: 2026-01-22 00:40:50.771 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:51 compute-1 podman[243113]: 2026-01-22 00:40:51.594538637 +0000 UTC m=+0.072243479 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:40:51 compute-1 podman[243112]: 2026-01-22 00:40:51.646472267 +0000 UTC m=+0.121938960 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:40:55 compute-1 nova_compute[182713]: 2026-01-22 00:40:55.450 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:55 compute-1 nova_compute[182713]: 2026-01-22 00:40:55.784 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:57 compute-1 podman[243162]: 2026-01-22 00:40:57.567250649 +0000 UTC m=+0.065256373 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:40:57 compute-1 podman[243163]: 2026-01-22 00:40:57.584667959 +0000 UTC m=+0.069983269 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:41:00 compute-1 nova_compute[182713]: 2026-01-22 00:41:00.455 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:00 compute-1 nova_compute[182713]: 2026-01-22 00:41:00.788 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:03.056 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:03.057 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:03.058 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:03.909 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:41:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:03.910 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:41:03 compute-1 nova_compute[182713]: 2026-01-22 00:41:03.911 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.199 182717 DEBUG nova.compute.manager [req-47de5942-0231-413b-a4ad-d2367ad4544d req-5547e7cb-4799-414c-8d09-cfea6cb61910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-changed-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.199 182717 DEBUG nova.compute.manager [req-47de5942-0231-413b-a4ad-d2367ad4544d req-5547e7cb-4799-414c-8d09-cfea6cb61910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Refreshing instance network info cache due to event network-changed-c1a311b0-55fd-43c5-9c26-1ce77ebf5300. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.200 182717 DEBUG oslo_concurrency.lockutils [req-47de5942-0231-413b-a4ad-d2367ad4544d req-5547e7cb-4799-414c-8d09-cfea6cb61910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.200 182717 DEBUG oslo_concurrency.lockutils [req-47de5942-0231-413b-a4ad-d2367ad4544d req-5547e7cb-4799-414c-8d09-cfea6cb61910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.200 182717 DEBUG nova.network.neutron [req-47de5942-0231-413b-a4ad-d2367ad4544d req-5547e7cb-4799-414c-8d09-cfea6cb61910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Refreshing network info cache for port c1a311b0-55fd-43c5-9c26-1ce77ebf5300 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.268 182717 DEBUG oslo_concurrency.lockutils [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.269 182717 DEBUG oslo_concurrency.lockutils [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.269 182717 DEBUG oslo_concurrency.lockutils [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.270 182717 DEBUG oslo_concurrency.lockutils [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.270 182717 DEBUG oslo_concurrency.lockutils [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.290 182717 INFO nova.compute.manager [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Terminating instance
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.305 182717 DEBUG nova.compute.manager [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:41:04 compute-1 kernel: tapc1a311b0-55 (unregistering): left promiscuous mode
Jan 22 00:41:04 compute-1 NetworkManager[54952]: <info>  [1769042464.3329] device (tapc1a311b0-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.343 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 ovn_controller[94841]: 2026-01-22T00:41:04Z|00743|binding|INFO|Releasing lport c1a311b0-55fd-43c5-9c26-1ce77ebf5300 from this chassis (sb_readonly=0)
Jan 22 00:41:04 compute-1 ovn_controller[94841]: 2026-01-22T00:41:04Z|00744|binding|INFO|Setting lport c1a311b0-55fd-43c5-9c26-1ce77ebf5300 down in Southbound
Jan 22 00:41:04 compute-1 ovn_controller[94841]: 2026-01-22T00:41:04Z|00745|binding|INFO|Removing iface tapc1a311b0-55 ovn-installed in OVS
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.346 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.356 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:06:2d 10.100.0.12'], port_security=['fa:16:3e:c0:06:2d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'af1fd418-ee94-4203-923f-6b4fd78c2b96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d60c1e89-37d5-4a05-b566-04735ac9e501', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0b1cab7e-3d92-45e3-88fe-6266af987fc5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e88081bc-c33e-4f29-8ba8-cdfb76dc2a31, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=c1a311b0-55fd-43c5-9c26-1ce77ebf5300) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.358 104184 INFO neutron.agent.ovn.metadata.agent [-] Port c1a311b0-55fd-43c5-9c26-1ce77ebf5300 in datapath d60c1e89-37d5-4a05-b566-04735ac9e501 unbound from our chassis
Jan 22 00:41:04 compute-1 kernel: tapc45a4550-82 (unregistering): left promiscuous mode
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.360 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d60c1e89-37d5-4a05-b566-04735ac9e501, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:41:04 compute-1 NetworkManager[54952]: <info>  [1769042464.3635] device (tapc45a4550-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.365 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.364 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d08cf428-5c36-4174-91b9-377a17a54eaa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.365 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501 namespace which is not needed anymore
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.373 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 ovn_controller[94841]: 2026-01-22T00:41:04Z|00746|binding|INFO|Releasing lport c45a4550-8234-4bc5-b809-77d90eb7f8fd from this chassis (sb_readonly=0)
Jan 22 00:41:04 compute-1 ovn_controller[94841]: 2026-01-22T00:41:04Z|00747|binding|INFO|Setting lport c45a4550-8234-4bc5-b809-77d90eb7f8fd down in Southbound
Jan 22 00:41:04 compute-1 ovn_controller[94841]: 2026-01-22T00:41:04Z|00748|binding|INFO|Removing iface tapc45a4550-82 ovn-installed in OVS
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.378 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.405 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:21:ba 2001:db8::f816:3eff:feb5:21ba'], port_security=['fa:16:3e:b5:21:ba 2001:db8::f816:3eff:feb5:21ba'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb5:21ba/64', 'neutron:device_id': 'af1fd418-ee94-4203-923f-6b4fd78c2b96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0b1cab7e-3d92-45e3-88fe-6266af987fc5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29e7c9c6-d993-479c-815c-b28e8e044cd0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=c45a4550-8234-4bc5-b809-77d90eb7f8fd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.413 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000b2.scope: Deactivated successfully.
Jan 22 00:41:04 compute-1 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000b2.scope: Consumed 14.715s CPU time.
Jan 22 00:41:04 compute-1 systemd-machined[153970]: Machine qemu-77-instance-000000b2 terminated.
Jan 22 00:41:04 compute-1 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[242972]: [NOTICE]   (242976) : haproxy version is 2.8.14-c23fe91
Jan 22 00:41:04 compute-1 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[242972]: [NOTICE]   (242976) : path to executable is /usr/sbin/haproxy
Jan 22 00:41:04 compute-1 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[242972]: [WARNING]  (242976) : Exiting Master process...
Jan 22 00:41:04 compute-1 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[242972]: [ALERT]    (242976) : Current worker (242978) exited with code 143 (Terminated)
Jan 22 00:41:04 compute-1 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[242972]: [WARNING]  (242976) : All workers exited. Exiting... (0)
Jan 22 00:41:04 compute-1 systemd[1]: libpod-2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6.scope: Deactivated successfully.
Jan 22 00:41:04 compute-1 podman[243235]: 2026-01-22 00:41:04.510045571 +0000 UTC m=+0.041995633 container died 2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:41:04 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6-userdata-shm.mount: Deactivated successfully.
Jan 22 00:41:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-5e1065a3805b46f1a89d437fc43715dee206b5d941817448332e92ea06fc639c-merged.mount: Deactivated successfully.
Jan 22 00:41:04 compute-1 NetworkManager[54952]: <info>  [1769042464.5448] manager: (tapc45a4550-82): new Tun device (/org/freedesktop/NetworkManager/Devices/360)
Jan 22 00:41:04 compute-1 podman[243235]: 2026-01-22 00:41:04.552141745 +0000 UTC m=+0.084091807 container cleanup 2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 00:41:04 compute-1 systemd[1]: libpod-conmon-2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6.scope: Deactivated successfully.
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.596 182717 INFO nova.virt.libvirt.driver [-] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Instance destroyed successfully.
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.597 182717 DEBUG nova.objects.instance [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid af1fd418-ee94-4203-923f-6b4fd78c2b96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.617 182717 DEBUG nova.virt.libvirt.vif [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:40:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2036429228',display_name='tempest-TestGettingAddress-server-2036429228',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2036429228',id=178,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:40:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jwgbkrts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:40:37Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=af1fd418-ee94-4203-923f-6b4fd78c2b96,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.618 182717 DEBUG nova.network.os_vif_util [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.619 182717 DEBUG nova.network.os_vif_util [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:06:2d,bridge_name='br-int',has_traffic_filtering=True,id=c1a311b0-55fd-43c5-9c26-1ce77ebf5300,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1a311b0-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.619 182717 DEBUG os_vif [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:06:2d,bridge_name='br-int',has_traffic_filtering=True,id=c1a311b0-55fd-43c5-9c26-1ce77ebf5300,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1a311b0-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.620 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.621 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1a311b0-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:04 compute-1 podman[243287]: 2026-01-22 00:41:04.622122403 +0000 UTC m=+0.045637765 container remove 2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.675 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.677 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.679 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4b690058-f65a-4274-a550-1d593410b430]: (4, ('Thu Jan 22 12:41:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501 (2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6)\n2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6\nThu Jan 22 12:41:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501 (2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6)\n2c110ae799d80c318d26519b09b4c61cf94e04f0275508de2c796dd8269439a6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.681 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d1069101-666e-4d0a-9894-e4d501ce9d71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.681 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.683 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd60c1e89-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.685 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 kernel: tapd60c1e89-30: left promiscuous mode
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.687 182717 INFO os_vif [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:06:2d,bridge_name='br-int',has_traffic_filtering=True,id=c1a311b0-55fd-43c5-9c26-1ce77ebf5300,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1a311b0-55')
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.688 182717 DEBUG nova.virt.libvirt.vif [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:40:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2036429228',display_name='tempest-TestGettingAddress-server-2036429228',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2036429228',id=178,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:40:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jwgbkrts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:40:37Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=af1fd418-ee94-4203-923f-6b4fd78c2b96,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.688 182717 DEBUG nova.network.os_vif_util [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.689 182717 DEBUG nova.network.os_vif_util [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:21:ba,bridge_name='br-int',has_traffic_filtering=True,id=c45a4550-8234-4bc5-b809-77d90eb7f8fd,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc45a4550-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.689 182717 DEBUG os_vif [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:21:ba,bridge_name='br-int',has_traffic_filtering=True,id=c45a4550-8234-4bc5-b809-77d90eb7f8fd,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc45a4550-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.690 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.691 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc45a4550-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.692 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.693 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.702 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.704 182717 INFO os_vif [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:21:ba,bridge_name='br-int',has_traffic_filtering=True,id=c45a4550-8234-4bc5-b809-77d90eb7f8fd,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc45a4550-82')
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.705 182717 INFO nova.virt.libvirt.driver [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Deleting instance files /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96_del
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.706 182717 INFO nova.virt.libvirt.driver [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Deletion of /var/lib/nova/instances/af1fd418-ee94-4203-923f-6b4fd78c2b96_del complete
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.706 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[767ac063-7a97-4499-8830-0cff88bfc52b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.728 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6bc6b652-7772-46d4-a8cf-81e002693f25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.729 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a7684f19-71ac-441f-95c1-8804992550c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.746 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f3e39d28-4754-4a68-a2e6-7fc5cb8232ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700923, 'reachable_time': 20620, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243310, 'error': None, 'target': 'ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:04 compute-1 systemd[1]: run-netns-ovnmeta\x2dd60c1e89\x2d37d5\x2d4a05\x2db566\x2d04735ac9e501.mount: Deactivated successfully.
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.752 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.752 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[8e7dc441-1cc1-4e3a-a858-327022865c46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.753 104184 INFO neutron.agent.ovn.metadata.agent [-] Port c45a4550-8234-4bc5-b809-77d90eb7f8fd in datapath 65bd5007-25fc-43be-bec0-20ff1d1f0a79 unbound from our chassis
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.754 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 65bd5007-25fc-43be-bec0-20ff1d1f0a79, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.755 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0c3d716d-1780-478e-86d4-b0656ec3aacd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:04 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:04.755 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79 namespace which is not needed anymore
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.779 182717 INFO nova.compute.manager [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.781 182717 DEBUG oslo.service.loopingcall [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.781 182717 DEBUG nova.compute.manager [-] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:41:04 compute-1 nova_compute[182713]: 2026-01-22 00:41:04.781 182717 DEBUG nova.network.neutron [-] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:41:04 compute-1 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[243041]: [NOTICE]   (243045) : haproxy version is 2.8.14-c23fe91
Jan 22 00:41:04 compute-1 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[243041]: [NOTICE]   (243045) : path to executable is /usr/sbin/haproxy
Jan 22 00:41:04 compute-1 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[243041]: [WARNING]  (243045) : Exiting Master process...
Jan 22 00:41:04 compute-1 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[243041]: [WARNING]  (243045) : Exiting Master process...
Jan 22 00:41:04 compute-1 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[243041]: [ALERT]    (243045) : Current worker (243047) exited with code 143 (Terminated)
Jan 22 00:41:04 compute-1 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[243041]: [WARNING]  (243045) : All workers exited. Exiting... (0)
Jan 22 00:41:04 compute-1 systemd[1]: libpod-535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923.scope: Deactivated successfully.
Jan 22 00:41:04 compute-1 podman[243327]: 2026-01-22 00:41:04.917213566 +0000 UTC m=+0.054688505 container died 535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:41:04 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923-userdata-shm.mount: Deactivated successfully.
Jan 22 00:41:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-7cab878a80edf36ecea7014aa1f6777c4ae43906a2c646f58ad7446e748ca845-merged.mount: Deactivated successfully.
Jan 22 00:41:04 compute-1 podman[243327]: 2026-01-22 00:41:04.958256369 +0000 UTC m=+0.095731278 container cleanup 535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 00:41:04 compute-1 systemd[1]: libpod-conmon-535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923.scope: Deactivated successfully.
Jan 22 00:41:05 compute-1 podman[243357]: 2026-01-22 00:41:05.021398525 +0000 UTC m=+0.043054885 container remove 535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:41:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:05.030 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f238828f-e4ab-46ee-9930-2925736e3eaa]: (4, ('Thu Jan 22 12:41:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79 (535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923)\n535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923\nThu Jan 22 12:41:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79 (535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923)\n535d32b16dba857c47da46a5af2a4d7a5fd8a50bcf85b786eaf8f89050375923\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:05.031 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe7043c-910e-496c-b669-b82fc1b43b49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:05.032 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65bd5007-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.034 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:05 compute-1 kernel: tap65bd5007-20: left promiscuous mode
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.051 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.052 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:05.053 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a319247b-7fd4-492d-a145-0cba45d526c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:05.070 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[473d00f8-782d-4a75-ad1e-f2b5bc233648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:05.071 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e222f734-4d99-4140-8984-91ea6ea8c696]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:05.086 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5c789eac-4611-4887-b037-61401809d7b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 701015, 'reachable_time': 26153, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243372, 'error': None, 'target': 'ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:05.088 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:41:05 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:05.089 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[a2cd3094-8e39-442f-9480-a4d51724d8ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.440 182717 DEBUG nova.compute.manager [req-cb1d66a0-e8e8-40d9-b5f2-d49911fa06d6 req-6c39121d-de30-4482-91fc-c0f0fea93d63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-unplugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.441 182717 DEBUG oslo_concurrency.lockutils [req-cb1d66a0-e8e8-40d9-b5f2-d49911fa06d6 req-6c39121d-de30-4482-91fc-c0f0fea93d63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.442 182717 DEBUG oslo_concurrency.lockutils [req-cb1d66a0-e8e8-40d9-b5f2-d49911fa06d6 req-6c39121d-de30-4482-91fc-c0f0fea93d63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.442 182717 DEBUG oslo_concurrency.lockutils [req-cb1d66a0-e8e8-40d9-b5f2-d49911fa06d6 req-6c39121d-de30-4482-91fc-c0f0fea93d63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.444 182717 DEBUG nova.compute.manager [req-cb1d66a0-e8e8-40d9-b5f2-d49911fa06d6 req-6c39121d-de30-4482-91fc-c0f0fea93d63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] No waiting events found dispatching network-vif-unplugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.445 182717 DEBUG nova.compute.manager [req-cb1d66a0-e8e8-40d9-b5f2-d49911fa06d6 req-6c39121d-de30-4482-91fc-c0f0fea93d63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-unplugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.534 182717 DEBUG nova.compute.manager [req-d4c43493-e433-46bc-83bf-589b11de87c9 req-d3b22456-e475-4f0a-92fc-6fe004dbeb3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-unplugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.534 182717 DEBUG oslo_concurrency.lockutils [req-d4c43493-e433-46bc-83bf-589b11de87c9 req-d3b22456-e475-4f0a-92fc-6fe004dbeb3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.535 182717 DEBUG oslo_concurrency.lockutils [req-d4c43493-e433-46bc-83bf-589b11de87c9 req-d3b22456-e475-4f0a-92fc-6fe004dbeb3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.535 182717 DEBUG oslo_concurrency.lockutils [req-d4c43493-e433-46bc-83bf-589b11de87c9 req-d3b22456-e475-4f0a-92fc-6fe004dbeb3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.535 182717 DEBUG nova.compute.manager [req-d4c43493-e433-46bc-83bf-589b11de87c9 req-d3b22456-e475-4f0a-92fc-6fe004dbeb3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] No waiting events found dispatching network-vif-unplugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.535 182717 DEBUG nova.compute.manager [req-d4c43493-e433-46bc-83bf-589b11de87c9 req-d3b22456-e475-4f0a-92fc-6fe004dbeb3a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-unplugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:41:05 compute-1 systemd[1]: run-netns-ovnmeta\x2d65bd5007\x2d25fc\x2d43be\x2dbec0\x2d20ff1d1f0a79.mount: Deactivated successfully.
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.833 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:05 compute-1 nova_compute[182713]: 2026-01-22 00:41:05.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.188 182717 DEBUG nova.network.neutron [-] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.206 182717 INFO nova.compute.manager [-] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Took 1.42 seconds to deallocate network for instance.
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.278 182717 DEBUG oslo_concurrency.lockutils [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.279 182717 DEBUG oslo_concurrency.lockutils [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.281 182717 DEBUG nova.compute.manager [req-db512ce0-792c-4bd3-b950-ba6addf29631 req-7037f517-e702-40f3-b97d-794986d45c29 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-deleted-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.282 182717 DEBUG nova.compute.manager [req-db512ce0-792c-4bd3-b950-ba6addf29631 req-7037f517-e702-40f3-b97d-794986d45c29 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-deleted-c45a4550-8234-4bc5-b809-77d90eb7f8fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.490 182717 DEBUG nova.network.neutron [req-47de5942-0231-413b-a4ad-d2367ad4544d req-5547e7cb-4799-414c-8d09-cfea6cb61910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Updated VIF entry in instance network info cache for port c1a311b0-55fd-43c5-9c26-1ce77ebf5300. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.490 182717 DEBUG nova.network.neutron [req-47de5942-0231-413b-a4ad-d2367ad4544d req-5547e7cb-4799-414c-8d09-cfea6cb61910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Updating instance_info_cache with network_info: [{"id": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "address": "fa:16:3e:c0:06:2d", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1a311b0-55", "ovs_interfaceid": "c1a311b0-55fd-43c5-9c26-1ce77ebf5300", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "address": "fa:16:3e:b5:21:ba", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb5:21ba", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc45a4550-82", "ovs_interfaceid": "c45a4550-8234-4bc5-b809-77d90eb7f8fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.525 182717 DEBUG nova.compute.provider_tree [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.530 182717 DEBUG oslo_concurrency.lockutils [req-47de5942-0231-413b-a4ad-d2367ad4544d req-5547e7cb-4799-414c-8d09-cfea6cb61910 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-af1fd418-ee94-4203-923f-6b4fd78c2b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.539 182717 DEBUG nova.scheduler.client.report [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.566 182717 DEBUG oslo_concurrency.lockutils [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.589 182717 INFO nova.scheduler.client.report [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance af1fd418-ee94-4203-923f-6b4fd78c2b96
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.671 182717 DEBUG oslo_concurrency.lockutils [None req-b94369fb-4069-4afe-b67d-0558cd8723fc a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:06 compute-1 nova_compute[182713]: 2026-01-22 00:41:06.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.543 182717 DEBUG nova.compute.manager [req-4868884d-915e-4e13-9839-e409686861cc req-04e73979-a969-4dc9-b7e2-53893e7842a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-plugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.544 182717 DEBUG oslo_concurrency.lockutils [req-4868884d-915e-4e13-9839-e409686861cc req-04e73979-a969-4dc9-b7e2-53893e7842a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.544 182717 DEBUG oslo_concurrency.lockutils [req-4868884d-915e-4e13-9839-e409686861cc req-04e73979-a969-4dc9-b7e2-53893e7842a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.544 182717 DEBUG oslo_concurrency.lockutils [req-4868884d-915e-4e13-9839-e409686861cc req-04e73979-a969-4dc9-b7e2-53893e7842a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.544 182717 DEBUG nova.compute.manager [req-4868884d-915e-4e13-9839-e409686861cc req-04e73979-a969-4dc9-b7e2-53893e7842a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] No waiting events found dispatching network-vif-plugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.545 182717 WARNING nova.compute.manager [req-4868884d-915e-4e13-9839-e409686861cc req-04e73979-a969-4dc9-b7e2-53893e7842a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received unexpected event network-vif-plugged-c45a4550-8234-4bc5-b809-77d90eb7f8fd for instance with vm_state deleted and task_state None.
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.632 182717 DEBUG nova.compute.manager [req-97548006-e8fe-47aa-91ea-7beefa8d8986 req-00a88f40-2a62-4eec-9a38-0a17ed853f99 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received event network-vif-plugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.633 182717 DEBUG oslo_concurrency.lockutils [req-97548006-e8fe-47aa-91ea-7beefa8d8986 req-00a88f40-2a62-4eec-9a38-0a17ed853f99 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.633 182717 DEBUG oslo_concurrency.lockutils [req-97548006-e8fe-47aa-91ea-7beefa8d8986 req-00a88f40-2a62-4eec-9a38-0a17ed853f99 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.633 182717 DEBUG oslo_concurrency.lockutils [req-97548006-e8fe-47aa-91ea-7beefa8d8986 req-00a88f40-2a62-4eec-9a38-0a17ed853f99 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af1fd418-ee94-4203-923f-6b4fd78c2b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.634 182717 DEBUG nova.compute.manager [req-97548006-e8fe-47aa-91ea-7beefa8d8986 req-00a88f40-2a62-4eec-9a38-0a17ed853f99 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] No waiting events found dispatching network-vif-plugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:41:07 compute-1 nova_compute[182713]: 2026-01-22 00:41:07.634 182717 WARNING nova.compute.manager [req-97548006-e8fe-47aa-91ea-7beefa8d8986 req-00a88f40-2a62-4eec-9a38-0a17ed853f99 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Received unexpected event network-vif-plugged-c1a311b0-55fd-43c5-9c26-1ce77ebf5300 for instance with vm_state deleted and task_state None.
Jan 22 00:41:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:07.913 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:08 compute-1 nova_compute[182713]: 2026-01-22 00:41:08.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:08 compute-1 nova_compute[182713]: 2026-01-22 00:41:08.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:08 compute-1 nova_compute[182713]: 2026-01-22 00:41:08.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:41:09 compute-1 nova_compute[182713]: 2026-01-22 00:41:09.693 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:10 compute-1 podman[243373]: 2026-01-22 00:41:10.586990411 +0000 UTC m=+0.072416085 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:41:10 compute-1 nova_compute[182713]: 2026-01-22 00:41:10.836 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:12 compute-1 podman[243394]: 2026-01-22 00:41:12.59462081 +0000 UTC m=+0.084065567 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Jan 22 00:41:13 compute-1 nova_compute[182713]: 2026-01-22 00:41:13.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:13 compute-1 nova_compute[182713]: 2026-01-22 00:41:13.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:13 compute-1 nova_compute[182713]: 2026-01-22 00:41:13.881 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:13 compute-1 nova_compute[182713]: 2026-01-22 00:41:13.881 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:13 compute-1 nova_compute[182713]: 2026-01-22 00:41:13.881 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:13 compute-1 nova_compute[182713]: 2026-01-22 00:41:13.881 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:41:14 compute-1 nova_compute[182713]: 2026-01-22 00:41:14.063 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:41:14 compute-1 nova_compute[182713]: 2026-01-22 00:41:14.064 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5706MB free_disk=73.17753601074219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:41:14 compute-1 nova_compute[182713]: 2026-01-22 00:41:14.065 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:14 compute-1 nova_compute[182713]: 2026-01-22 00:41:14.065 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:14 compute-1 nova_compute[182713]: 2026-01-22 00:41:14.262 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:41:14 compute-1 nova_compute[182713]: 2026-01-22 00:41:14.262 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:41:14 compute-1 nova_compute[182713]: 2026-01-22 00:41:14.288 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:41:14 compute-1 nova_compute[182713]: 2026-01-22 00:41:14.303 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:41:14 compute-1 nova_compute[182713]: 2026-01-22 00:41:14.323 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:41:14 compute-1 nova_compute[182713]: 2026-01-22 00:41:14.323 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:14 compute-1 nova_compute[182713]: 2026-01-22 00:41:14.698 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:15 compute-1 nova_compute[182713]: 2026-01-22 00:41:15.324 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:15 compute-1 nova_compute[182713]: 2026-01-22 00:41:15.838 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:15 compute-1 nova_compute[182713]: 2026-01-22 00:41:15.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:17 compute-1 sshd-session[243416]: Invalid user sol from 92.118.39.95 port 44150
Jan 22 00:41:17 compute-1 nova_compute[182713]: 2026-01-22 00:41:17.741 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:17 compute-1 sshd-session[243416]: Connection closed by invalid user sol 92.118.39.95 port 44150 [preauth]
Jan 22 00:41:17 compute-1 nova_compute[182713]: 2026-01-22 00:41:17.870 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:18 compute-1 nova_compute[182713]: 2026-01-22 00:41:18.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:18 compute-1 nova_compute[182713]: 2026-01-22 00:41:18.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:41:18 compute-1 nova_compute[182713]: 2026-01-22 00:41:18.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:41:18 compute-1 nova_compute[182713]: 2026-01-22 00:41:18.872 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:41:19 compute-1 nova_compute[182713]: 2026-01-22 00:41:19.595 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042464.5941985, af1fd418-ee94-4203-923f-6b4fd78c2b96 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:41:19 compute-1 nova_compute[182713]: 2026-01-22 00:41:19.596 182717 INFO nova.compute.manager [-] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] VM Stopped (Lifecycle Event)
Jan 22 00:41:19 compute-1 nova_compute[182713]: 2026-01-22 00:41:19.620 182717 DEBUG nova.compute.manager [None req-3e508249-8a6a-4de8-b682-6f83eb37351c - - - - - -] [instance: af1fd418-ee94-4203-923f-6b4fd78c2b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:41:19 compute-1 nova_compute[182713]: 2026-01-22 00:41:19.702 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:20 compute-1 nova_compute[182713]: 2026-01-22 00:41:20.840 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:22 compute-1 podman[243420]: 2026-01-22 00:41:22.574193569 +0000 UTC m=+0.060007630 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:41:22 compute-1 podman[243419]: 2026-01-22 00:41:22.626032806 +0000 UTC m=+0.111019511 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 00:41:24 compute-1 nova_compute[182713]: 2026-01-22 00:41:24.706 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:25 compute-1 nova_compute[182713]: 2026-01-22 00:41:25.842 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:28 compute-1 podman[243469]: 2026-01-22 00:41:28.596978882 +0000 UTC m=+0.069892667 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:41:28 compute-1 podman[243468]: 2026-01-22 00:41:28.598125268 +0000 UTC m=+0.071569569 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 22 00:41:29 compute-1 nova_compute[182713]: 2026-01-22 00:41:29.709 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:30 compute-1 nova_compute[182713]: 2026-01-22 00:41:30.845 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:34 compute-1 nova_compute[182713]: 2026-01-22 00:41:34.712 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:35 compute-1 nova_compute[182713]: 2026-01-22 00:41:35.847 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:35 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:35.851 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2 2001:db8::f816:3eff:fe5f:63f3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe5f:63f3/64', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b420faa5-5ae8-471e-9b88-5f792c3ff519, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a8aebb4-643e-4d79-9b9e-71408c2b29d3) old=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:41:35 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:35.853 104184 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2a8aebb4-643e-4d79-9b9e-71408c2b29d3 in datapath 0fbc923c-90ec-4c3d-92df-bc42843601b3 updated
Jan 22 00:41:35 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:35.854 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0fbc923c-90ec-4c3d-92df-bc42843601b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:41:35 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:35.856 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e8492a83-12f8-4507-8319-bfca40ec5e09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:39 compute-1 nova_compute[182713]: 2026-01-22 00:41:39.714 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:40 compute-1 nova_compute[182713]: 2026-01-22 00:41:40.849 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:41.553 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2 2001:db8:0:1:f816:3eff:fe5f:63f3 2001:db8::f816:3eff:fe5f:63f3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe5f:63f3/64 2001:db8::f816:3eff:fe5f:63f3/64', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b420faa5-5ae8-471e-9b88-5f792c3ff519, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a8aebb4-643e-4d79-9b9e-71408c2b29d3) old=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2 2001:db8::f816:3eff:fe5f:63f3'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe5f:63f3/64', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:41:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:41.555 104184 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2a8aebb4-643e-4d79-9b9e-71408c2b29d3 in datapath 0fbc923c-90ec-4c3d-92df-bc42843601b3 updated
Jan 22 00:41:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:41.557 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0fbc923c-90ec-4c3d-92df-bc42843601b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:41:41 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:41.558 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[83ace053-6a34-446a-b8d3-ecee576b0a34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:41 compute-1 podman[243511]: 2026-01-22 00:41:41.577253132 +0000 UTC m=+0.068170284 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:41:43 compute-1 podman[243531]: 2026-01-22 00:41:43.592793705 +0000 UTC m=+0.074434058 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter)
Jan 22 00:41:44 compute-1 nova_compute[182713]: 2026-01-22 00:41:44.717 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:45 compute-1 nova_compute[182713]: 2026-01-22 00:41:45.851 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:48 compute-1 nova_compute[182713]: 2026-01-22 00:41:48.755 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:48 compute-1 nova_compute[182713]: 2026-01-22 00:41:48.755 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:48 compute-1 nova_compute[182713]: 2026-01-22 00:41:48.770 182717 DEBUG nova.compute.manager [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:41:48 compute-1 nova_compute[182713]: 2026-01-22 00:41:48.870 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:48 compute-1 nova_compute[182713]: 2026-01-22 00:41:48.870 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:48 compute-1 nova_compute[182713]: 2026-01-22 00:41:48.879 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:41:48 compute-1 nova_compute[182713]: 2026-01-22 00:41:48.880 182717 INFO nova.compute.claims [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:41:48 compute-1 nova_compute[182713]: 2026-01-22 00:41:48.997 182717 DEBUG nova.compute.provider_tree [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.012 182717 DEBUG nova.scheduler.client.report [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.030 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.031 182717 DEBUG nova.compute.manager [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.088 182717 DEBUG nova.compute.manager [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.088 182717 DEBUG nova.network.neutron [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.115 182717 INFO nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.131 182717 DEBUG nova.compute.manager [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.230 182717 DEBUG nova.compute.manager [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.232 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.233 182717 INFO nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Creating image(s)
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.235 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.235 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.237 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.267 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.347 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.349 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.350 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.378 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.467 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.469 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.515 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.516 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.517 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.595 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.596 182717 DEBUG nova.virt.disk.api [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.596 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.655 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.656 182717 DEBUG nova.virt.disk.api [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.656 182717 DEBUG nova.objects.instance [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.672 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.672 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Ensure instance console log exists: /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.673 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.673 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.674 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:49 compute-1 nova_compute[182713]: 2026-01-22 00:41:49.720 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:50 compute-1 nova_compute[182713]: 2026-01-22 00:41:50.282 182717 DEBUG nova.policy [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:41:50 compute-1 nova_compute[182713]: 2026-01-22 00:41:50.853 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:51 compute-1 nova_compute[182713]: 2026-01-22 00:41:51.370 182717 DEBUG nova.network.neutron [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Successfully created port: 5d170032-0a8d-4284-b16f-f41c4caa4d83 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:41:52 compute-1 nova_compute[182713]: 2026-01-22 00:41:52.364 182717 DEBUG nova.network.neutron [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Successfully updated port: 5d170032-0a8d-4284-b16f-f41c4caa4d83 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:41:52 compute-1 nova_compute[182713]: 2026-01-22 00:41:52.382 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:41:52 compute-1 nova_compute[182713]: 2026-01-22 00:41:52.382 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:41:52 compute-1 nova_compute[182713]: 2026-01-22 00:41:52.382 182717 DEBUG nova.network.neutron [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:41:52 compute-1 nova_compute[182713]: 2026-01-22 00:41:52.471 182717 DEBUG nova.compute.manager [req-2afd2b6d-338e-4d43-8c3c-7f5373684b3b req-a17ef1e9-173e-47f8-951f-d10a0a306d20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Received event network-changed-5d170032-0a8d-4284-b16f-f41c4caa4d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:52 compute-1 nova_compute[182713]: 2026-01-22 00:41:52.472 182717 DEBUG nova.compute.manager [req-2afd2b6d-338e-4d43-8c3c-7f5373684b3b req-a17ef1e9-173e-47f8-951f-d10a0a306d20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Refreshing instance network info cache due to event network-changed-5d170032-0a8d-4284-b16f-f41c4caa4d83. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:41:52 compute-1 nova_compute[182713]: 2026-01-22 00:41:52.472 182717 DEBUG oslo_concurrency.lockutils [req-2afd2b6d-338e-4d43-8c3c-7f5373684b3b req-a17ef1e9-173e-47f8-951f-d10a0a306d20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:41:52 compute-1 nova_compute[182713]: 2026-01-22 00:41:52.522 182717 DEBUG nova.network.neutron [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:41:53 compute-1 podman[243570]: 2026-01-22 00:41:53.590022992 +0000 UTC m=+0.070570288 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:41:53 compute-1 podman[243569]: 2026-01-22 00:41:53.592316853 +0000 UTC m=+0.087491833 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.887 182717 DEBUG nova.network.neutron [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Updating instance_info_cache with network_info: [{"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.911 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.912 182717 DEBUG nova.compute.manager [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Instance network_info: |[{"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.913 182717 DEBUG oslo_concurrency.lockutils [req-2afd2b6d-338e-4d43-8c3c-7f5373684b3b req-a17ef1e9-173e-47f8-951f-d10a0a306d20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.914 182717 DEBUG nova.network.neutron [req-2afd2b6d-338e-4d43-8c3c-7f5373684b3b req-a17ef1e9-173e-47f8-951f-d10a0a306d20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Refreshing network info cache for port 5d170032-0a8d-4284-b16f-f41c4caa4d83 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.917 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Start _get_guest_xml network_info=[{"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.922 182717 WARNING nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.934 182717 DEBUG nova.virt.libvirt.host [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.936 182717 DEBUG nova.virt.libvirt.host [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.941 182717 DEBUG nova.virt.libvirt.host [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.942 182717 DEBUG nova.virt.libvirt.host [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.944 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.945 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.946 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.946 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.946 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.947 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.947 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.948 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.948 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.948 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.949 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.949 182717 DEBUG nova.virt.hardware [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.955 182717 DEBUG nova.virt.libvirt.vif [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:41:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1979133598',display_name='tempest-TestGettingAddress-server-1979133598',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1979133598',id=179,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfcFV8SVrYtqhuEJR2u0WnZRZ3aIYjdzcrjfLDmTvbNVw+iNWtleLlqtVUIYQyWXU5cujTqDWjA511UjJA6kRMyxPcbENHgTLoJy3T95U8C9/oslNz/OBwLaWEuXg2SRA==',key_name='tempest-TestGettingAddress-313470075',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-xjlk1541',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:41:49Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.955 182717 DEBUG nova.network.os_vif_util [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.957 182717 DEBUG nova.network.os_vif_util [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:27:b6,bridge_name='br-int',has_traffic_filtering=True,id=5d170032-0a8d-4284-b16f-f41c4caa4d83,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d170032-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.958 182717 DEBUG nova.objects.instance [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.974 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:41:53 compute-1 nova_compute[182713]:   <uuid>32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c</uuid>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   <name>instance-000000b3</name>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <nova:name>tempest-TestGettingAddress-server-1979133598</nova:name>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:41:53</nova:creationTime>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:41:53 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:41:53 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:41:53 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:41:53 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:41:53 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:41:53 compute-1 nova_compute[182713]:         <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 22 00:41:53 compute-1 nova_compute[182713]:         <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:41:53 compute-1 nova_compute[182713]:         <nova:port uuid="5d170032-0a8d-4284-b16f-f41c4caa4d83">
Jan 22 00:41:53 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe8d:27b6" ipVersion="6"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe8d:27b6" ipVersion="6"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <system>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <entry name="serial">32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c</entry>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <entry name="uuid">32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c</entry>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     </system>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   <os>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   </os>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   <features>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   </features>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.config"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:8d:27:b6"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <target dev="tap5d170032-0a"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/console.log" append="off"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <video>
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     </video>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:41:53 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:41:53 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:41:53 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:41:53 compute-1 nova_compute[182713]: </domain>
Jan 22 00:41:53 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.976 182717 DEBUG nova.compute.manager [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Preparing to wait for external event network-vif-plugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.976 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.976 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.977 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.977 182717 DEBUG nova.virt.libvirt.vif [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:41:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1979133598',display_name='tempest-TestGettingAddress-server-1979133598',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1979133598',id=179,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfcFV8SVrYtqhuEJR2u0WnZRZ3aIYjdzcrjfLDmTvbNVw+iNWtleLlqtVUIYQyWXU5cujTqDWjA511UjJA6kRMyxPcbENHgTLoJy3T95U8C9/oslNz/OBwLaWEuXg2SRA==',key_name='tempest-TestGettingAddress-313470075',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-xjlk1541',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:41:49Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.978 182717 DEBUG nova.network.os_vif_util [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.979 182717 DEBUG nova.network.os_vif_util [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:27:b6,bridge_name='br-int',has_traffic_filtering=True,id=5d170032-0a8d-4284-b16f-f41c4caa4d83,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d170032-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.980 182717 DEBUG os_vif [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:27:b6,bridge_name='br-int',has_traffic_filtering=True,id=5d170032-0a8d-4284-b16f-f41c4caa4d83,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d170032-0a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.980 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.981 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.981 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.985 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.985 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d170032-0a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.986 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5d170032-0a, col_values=(('external_ids', {'iface-id': '5d170032-0a8d-4284-b16f-f41c4caa4d83', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8d:27:b6', 'vm-uuid': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.987 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:53 compute-1 NetworkManager[54952]: <info>  [1769042513.9896] manager: (tap5d170032-0a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.991 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.995 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:53 compute-1 nova_compute[182713]: 2026-01-22 00:41:53.997 182717 INFO os_vif [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:27:b6,bridge_name='br-int',has_traffic_filtering=True,id=5d170032-0a8d-4284-b16f-f41c4caa4d83,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d170032-0a')
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.057 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.058 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.059 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:8d:27:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.060 182717 INFO nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Using config drive
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.363 182717 INFO nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Creating config drive at /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.config
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.370 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6p74uqxd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.508 182717 DEBUG oslo_concurrency.processutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6p74uqxd" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:41:54 compute-1 kernel: tap5d170032-0a: entered promiscuous mode
Jan 22 00:41:54 compute-1 NetworkManager[54952]: <info>  [1769042514.6130] manager: (tap5d170032-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/362)
Jan 22 00:41:54 compute-1 ovn_controller[94841]: 2026-01-22T00:41:54Z|00749|binding|INFO|Claiming lport 5d170032-0a8d-4284-b16f-f41c4caa4d83 for this chassis.
Jan 22 00:41:54 compute-1 ovn_controller[94841]: 2026-01-22T00:41:54Z|00750|binding|INFO|5d170032-0a8d-4284-b16f-f41c4caa4d83: Claiming fa:16:3e:8d:27:b6 10.100.0.13 2001:db8:0:1:f816:3eff:fe8d:27b6 2001:db8::f816:3eff:fe8d:27b6
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.615 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.621 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.627 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.636 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:27:b6 10.100.0.13 2001:db8:0:1:f816:3eff:fe8d:27b6 2001:db8::f816:3eff:fe8d:27b6'], port_security=['fa:16:3e:8d:27:b6 10.100.0.13 2001:db8:0:1:f816:3eff:fe8d:27b6 2001:db8::f816:3eff:fe8d:27b6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8:0:1:f816:3eff:fe8d:27b6/64 2001:db8::f816:3eff:fe8d:27b6/64', 'neutron:device_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ee79161b-ebd1-43ab-81bd-31efca053e81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b420faa5-5ae8-471e-9b88-5f792c3ff519, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=5d170032-0a8d-4284-b16f-f41c4caa4d83) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.637 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 5d170032-0a8d-4284-b16f-f41c4caa4d83 in datapath 0fbc923c-90ec-4c3d-92df-bc42843601b3 bound to our chassis
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.639 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0fbc923c-90ec-4c3d-92df-bc42843601b3
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.709 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e53c235d-9504-4395-87b7-8b52ceb4262b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.710 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0fbc923c-91 in ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.712 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0fbc923c-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.712 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[232b4210-eb0a-4e91-8106-5facb13a1c1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.713 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[61417246-7634-4056-a185-1bff1f47b73c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 systemd-machined[153970]: New machine qemu-78-instance-000000b3.
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.726 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[4ed97863-bd18-4a85-b0b2-c0bddb1d587c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 systemd-udevd[243640]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.729 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:54 compute-1 ovn_controller[94841]: 2026-01-22T00:41:54Z|00751|binding|INFO|Setting lport 5d170032-0a8d-4284-b16f-f41c4caa4d83 ovn-installed in OVS
Jan 22 00:41:54 compute-1 ovn_controller[94841]: 2026-01-22T00:41:54Z|00752|binding|INFO|Setting lport 5d170032-0a8d-4284-b16f-f41c4caa4d83 up in Southbound
Jan 22 00:41:54 compute-1 nova_compute[182713]: 2026-01-22 00:41:54.735 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:54 compute-1 systemd[1]: Started Virtual Machine qemu-78-instance-000000b3.
Jan 22 00:41:54 compute-1 NetworkManager[54952]: <info>  [1769042514.7416] device (tap5d170032-0a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:41:54 compute-1 NetworkManager[54952]: <info>  [1769042514.7423] device (tap5d170032-0a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.754 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[010da0dc-81b3-4f9c-8ce7-a4095af975bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.794 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[914e7aef-76f5-4026-88c6-70bdc299d138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 NetworkManager[54952]: <info>  [1769042514.8015] manager: (tap0fbc923c-90): new Veth device (/org/freedesktop/NetworkManager/Devices/363)
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.803 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a8530f18-a863-4bc6-9931-7eef73e14b7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.846 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[188b35d1-31fd-4a29-8591-b2960df36a02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.850 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[685f80bd-7466-4682-a6d7-8b3221d41486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 NetworkManager[54952]: <info>  [1769042514.8761] device (tap0fbc923c-90): carrier: link connected
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.887 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[777e56ba-6d1a-405a-987c-af6f542aa6b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.907 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[23dbd8d9-99a2-4f76-8d46-4bfb496db69b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0fbc923c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:63:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 708752, 'reachable_time': 37986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243671, 'error': None, 'target': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.929 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd8fca6-92cd-4c28-adb4-54af24ae055e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5f:63f3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 708752, 'tstamp': 708752}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243672, 'error': None, 'target': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.954 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[04825f40-3552-4ca9-905b-a94e9a98a20d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0fbc923c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:63:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 708752, 'reachable_time': 37986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243673, 'error': None, 'target': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:54 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:54.995 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6f02a3-fb5e-4848-a4b4-e060d1fa83f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.005 182717 DEBUG nova.compute.manager [req-09fd9c3d-89aa-45ab-8196-e70d13b5d8b5 req-8108fb28-2db8-44a0-945d-4c4d5ab07fbb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Received event network-vif-plugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.005 182717 DEBUG oslo_concurrency.lockutils [req-09fd9c3d-89aa-45ab-8196-e70d13b5d8b5 req-8108fb28-2db8-44a0-945d-4c4d5ab07fbb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.006 182717 DEBUG oslo_concurrency.lockutils [req-09fd9c3d-89aa-45ab-8196-e70d13b5d8b5 req-8108fb28-2db8-44a0-945d-4c4d5ab07fbb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.006 182717 DEBUG oslo_concurrency.lockutils [req-09fd9c3d-89aa-45ab-8196-e70d13b5d8b5 req-8108fb28-2db8-44a0-945d-4c4d5ab07fbb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.006 182717 DEBUG nova.compute.manager [req-09fd9c3d-89aa-45ab-8196-e70d13b5d8b5 req-8108fb28-2db8-44a0-945d-4c4d5ab07fbb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Processing event network-vif-plugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.067 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042515.0662107, 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.067 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] VM Started (Lifecycle Event)
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.070 182717 DEBUG nova.compute.manager [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.075 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:55.076 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e6305845-f83a-4ef0-950a-aec4e662f6d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:55.077 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0fbc923c-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:55.078 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:55.078 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0fbc923c-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.079 182717 INFO nova.virt.libvirt.driver [-] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Instance spawned successfully.
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.079 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:41:55 compute-1 NetworkManager[54952]: <info>  [1769042515.0819] manager: (tap0fbc923c-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/364)
Jan 22 00:41:55 compute-1 kernel: tap0fbc923c-90: entered promiscuous mode
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.082 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:55.086 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0fbc923c-90, col_values=(('external_ids', {'iface-id': '2a8aebb4-643e-4d79-9b9e-71408c2b29d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.088 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:55 compute-1 ovn_controller[94841]: 2026-01-22T00:41:55Z|00753|binding|INFO|Releasing lport 2a8aebb4-643e-4d79-9b9e-71408c2b29d3 from this chassis (sb_readonly=0)
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.091 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.096 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.099 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:55.101 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0fbc923c-90ec-4c3d-92df-bc42843601b3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0fbc923c-90ec-4c3d-92df-bc42843601b3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:55.103 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f2042836-4547-4a83-88ae-faa7f9f1560a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:55.104 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-0fbc923c-90ec-4c3d-92df-bc42843601b3
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/0fbc923c-90ec-4c3d-92df-bc42843601b3.pid.haproxy
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID 0fbc923c-90ec-4c3d-92df-bc42843601b3
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:41:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:41:55.105 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'env', 'PROCESS_TAG=haproxy-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0fbc923c-90ec-4c3d-92df-bc42843601b3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.110 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.110 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.112 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.114 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.115 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.116 182717 DEBUG nova.virt.libvirt.driver [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.122 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.123 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042515.0664992, 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.123 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] VM Paused (Lifecycle Event)
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.150 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.155 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042515.0735848, 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.155 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] VM Resumed (Lifecycle Event)
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.178 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.182 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.202 182717 INFO nova.compute.manager [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Took 5.97 seconds to spawn the instance on the hypervisor.
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.203 182717 DEBUG nova.compute.manager [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.208 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.333 182717 INFO nova.compute.manager [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Took 6.50 seconds to build instance.
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.354 182717 DEBUG oslo_concurrency.lockutils [None req-17578d0e-8d46-4ccb-99a9-7fd102f300d6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:55 compute-1 podman[243712]: 2026-01-22 00:41:55.542324635 +0000 UTC m=+0.071948940 container create ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 00:41:55 compute-1 podman[243712]: 2026-01-22 00:41:55.504484683 +0000 UTC m=+0.034108978 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:41:55 compute-1 systemd[1]: Started libpod-conmon-ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399.scope.
Jan 22 00:41:55 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:41:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7bbd94e3366a47f2fae544067e79c010eaaa5aa951d3cfa17689d54729ccf86/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:41:55 compute-1 podman[243712]: 2026-01-22 00:41:55.681562361 +0000 UTC m=+0.211186686 container init ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 00:41:55 compute-1 podman[243712]: 2026-01-22 00:41:55.693322754 +0000 UTC m=+0.222947049 container start ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 00:41:55 compute-1 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243727]: [NOTICE]   (243731) : New worker (243733) forked
Jan 22 00:41:55 compute-1 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243727]: [NOTICE]   (243731) : Loading success.
Jan 22 00:41:55 compute-1 nova_compute[182713]: 2026-01-22 00:41:55.856 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:56 compute-1 nova_compute[182713]: 2026-01-22 00:41:56.503 182717 DEBUG nova.network.neutron [req-2afd2b6d-338e-4d43-8c3c-7f5373684b3b req-a17ef1e9-173e-47f8-951f-d10a0a306d20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Updated VIF entry in instance network info cache for port 5d170032-0a8d-4284-b16f-f41c4caa4d83. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:41:56 compute-1 nova_compute[182713]: 2026-01-22 00:41:56.503 182717 DEBUG nova.network.neutron [req-2afd2b6d-338e-4d43-8c3c-7f5373684b3b req-a17ef1e9-173e-47f8-951f-d10a0a306d20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Updating instance_info_cache with network_info: [{"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:41:56 compute-1 nova_compute[182713]: 2026-01-22 00:41:56.520 182717 DEBUG oslo_concurrency.lockutils [req-2afd2b6d-338e-4d43-8c3c-7f5373684b3b req-a17ef1e9-173e-47f8-951f-d10a0a306d20 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:41:57 compute-1 nova_compute[182713]: 2026-01-22 00:41:57.096 182717 DEBUG nova.compute.manager [req-bf281965-07ff-4202-8338-dc0379ebe1bb req-87ab8f84-cf7c-45ec-b2bf-c208766940b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Received event network-vif-plugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:57 compute-1 nova_compute[182713]: 2026-01-22 00:41:57.097 182717 DEBUG oslo_concurrency.lockutils [req-bf281965-07ff-4202-8338-dc0379ebe1bb req-87ab8f84-cf7c-45ec-b2bf-c208766940b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:57 compute-1 nova_compute[182713]: 2026-01-22 00:41:57.097 182717 DEBUG oslo_concurrency.lockutils [req-bf281965-07ff-4202-8338-dc0379ebe1bb req-87ab8f84-cf7c-45ec-b2bf-c208766940b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:57 compute-1 nova_compute[182713]: 2026-01-22 00:41:57.097 182717 DEBUG oslo_concurrency.lockutils [req-bf281965-07ff-4202-8338-dc0379ebe1bb req-87ab8f84-cf7c-45ec-b2bf-c208766940b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:57 compute-1 nova_compute[182713]: 2026-01-22 00:41:57.097 182717 DEBUG nova.compute.manager [req-bf281965-07ff-4202-8338-dc0379ebe1bb req-87ab8f84-cf7c-45ec-b2bf-c208766940b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] No waiting events found dispatching network-vif-plugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:41:57 compute-1 nova_compute[182713]: 2026-01-22 00:41:57.097 182717 WARNING nova.compute.manager [req-bf281965-07ff-4202-8338-dc0379ebe1bb req-87ab8f84-cf7c-45ec-b2bf-c208766940b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Received unexpected event network-vif-plugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 for instance with vm_state active and task_state None.
Jan 22 00:41:58 compute-1 ovn_controller[94841]: 2026-01-22T00:41:58Z|00754|binding|INFO|Releasing lport 2a8aebb4-643e-4d79-9b9e-71408c2b29d3 from this chassis (sb_readonly=0)
Jan 22 00:41:58 compute-1 nova_compute[182713]: 2026-01-22 00:41:58.857 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:58 compute-1 NetworkManager[54952]: <info>  [1769042518.8584] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/365)
Jan 22 00:41:58 compute-1 NetworkManager[54952]: <info>  [1769042518.8603] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/366)
Jan 22 00:41:58 compute-1 ovn_controller[94841]: 2026-01-22T00:41:58Z|00755|binding|INFO|Releasing lport 2a8aebb4-643e-4d79-9b9e-71408c2b29d3 from this chassis (sb_readonly=0)
Jan 22 00:41:58 compute-1 nova_compute[182713]: 2026-01-22 00:41:58.887 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:58 compute-1 nova_compute[182713]: 2026-01-22 00:41:58.894 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:58 compute-1 nova_compute[182713]: 2026-01-22 00:41:58.988 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:59 compute-1 nova_compute[182713]: 2026-01-22 00:41:59.315 182717 DEBUG nova.compute.manager [req-443d7a17-3c0f-4a81-8eea-35bc0fddc36f req-77812029-0735-44b8-8985-77ff8b204b36 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Received event network-changed-5d170032-0a8d-4284-b16f-f41c4caa4d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:59 compute-1 nova_compute[182713]: 2026-01-22 00:41:59.316 182717 DEBUG nova.compute.manager [req-443d7a17-3c0f-4a81-8eea-35bc0fddc36f req-77812029-0735-44b8-8985-77ff8b204b36 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Refreshing instance network info cache due to event network-changed-5d170032-0a8d-4284-b16f-f41c4caa4d83. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:41:59 compute-1 nova_compute[182713]: 2026-01-22 00:41:59.316 182717 DEBUG oslo_concurrency.lockutils [req-443d7a17-3c0f-4a81-8eea-35bc0fddc36f req-77812029-0735-44b8-8985-77ff8b204b36 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:41:59 compute-1 nova_compute[182713]: 2026-01-22 00:41:59.317 182717 DEBUG oslo_concurrency.lockutils [req-443d7a17-3c0f-4a81-8eea-35bc0fddc36f req-77812029-0735-44b8-8985-77ff8b204b36 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:41:59 compute-1 nova_compute[182713]: 2026-01-22 00:41:59.317 182717 DEBUG nova.network.neutron [req-443d7a17-3c0f-4a81-8eea-35bc0fddc36f req-77812029-0735-44b8-8985-77ff8b204b36 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Refreshing network info cache for port 5d170032-0a8d-4284-b16f-f41c4caa4d83 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:41:59 compute-1 podman[243743]: 2026-01-22 00:41:59.588106559 +0000 UTC m=+0.068169844 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 00:41:59 compute-1 podman[243744]: 2026-01-22 00:41:59.589251985 +0000 UTC m=+0.069913928 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:42:00 compute-1 nova_compute[182713]: 2026-01-22 00:42:00.859 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:00 compute-1 nova_compute[182713]: 2026-01-22 00:42:00.919 182717 DEBUG nova.network.neutron [req-443d7a17-3c0f-4a81-8eea-35bc0fddc36f req-77812029-0735-44b8-8985-77ff8b204b36 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Updated VIF entry in instance network info cache for port 5d170032-0a8d-4284-b16f-f41c4caa4d83. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:42:00 compute-1 nova_compute[182713]: 2026-01-22 00:42:00.919 182717 DEBUG nova.network.neutron [req-443d7a17-3c0f-4a81-8eea-35bc0fddc36f req-77812029-0735-44b8-8985-77ff8b204b36 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Updating instance_info_cache with network_info: [{"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:42:00 compute-1 nova_compute[182713]: 2026-01-22 00:42:00.940 182717 DEBUG oslo_concurrency.lockutils [req-443d7a17-3c0f-4a81-8eea-35bc0fddc36f req-77812029-0735-44b8-8985-77ff8b204b36 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:42:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:03.057 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:42:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:03.058 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:42:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:03.058 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:42:03 compute-1 systemd[1]: Starting dnf makecache...
Jan 22 00:42:03 compute-1 dnf[243783]: Metadata cache refreshed recently.
Jan 22 00:42:03 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 22 00:42:03 compute-1 systemd[1]: Finished dnf makecache.
Jan 22 00:42:03 compute-1 nova_compute[182713]: 2026-01-22 00:42:03.991 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:05 compute-1 nova_compute[182713]: 2026-01-22 00:42:05.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:05 compute-1 nova_compute[182713]: 2026-01-22 00:42:05.862 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:06 compute-1 ovn_controller[94841]: 2026-01-22T00:42:06Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8d:27:b6 10.100.0.13
Jan 22 00:42:06 compute-1 ovn_controller[94841]: 2026-01-22T00:42:06Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8d:27:b6 10.100.0.13
Jan 22 00:42:07 compute-1 nova_compute[182713]: 2026-01-22 00:42:07.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:08 compute-1 nova_compute[182713]: 2026-01-22 00:42:08.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:08 compute-1 nova_compute[182713]: 2026-01-22 00:42:08.994 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:09 compute-1 nova_compute[182713]: 2026-01-22 00:42:09.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:10 compute-1 nova_compute[182713]: 2026-01-22 00:42:10.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:10 compute-1 nova_compute[182713]: 2026-01-22 00:42:10.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:42:10 compute-1 nova_compute[182713]: 2026-01-22 00:42:10.865 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:12 compute-1 podman[243799]: 2026-01-22 00:42:12.586701225 +0000 UTC m=+0.083947302 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:42:13 compute-1 nova_compute[182713]: 2026-01-22 00:42:13.996 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:14 compute-1 podman[243819]: 2026-01-22 00:42:14.581213417 +0000 UTC m=+0.075544191 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Jan 22 00:42:15 compute-1 nova_compute[182713]: 2026-01-22 00:42:15.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:15 compute-1 nova_compute[182713]: 2026-01-22 00:42:15.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:15 compute-1 nova_compute[182713]: 2026-01-22 00:42:15.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:15 compute-1 nova_compute[182713]: 2026-01-22 00:42:15.867 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:15 compute-1 nova_compute[182713]: 2026-01-22 00:42:15.887 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:42:15 compute-1 nova_compute[182713]: 2026-01-22 00:42:15.888 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:42:15 compute-1 nova_compute[182713]: 2026-01-22 00:42:15.888 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:42:15 compute-1 nova_compute[182713]: 2026-01-22 00:42:15.888 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:42:15 compute-1 nova_compute[182713]: 2026-01-22 00:42:15.978 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.062 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.065 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.133 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.344 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.345 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5460MB free_disk=73.14825057983398GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.346 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.346 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.417 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.418 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.418 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.466 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.484 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.506 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:42:16 compute-1 nova_compute[182713]: 2026-01-22 00:42:16.507 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:42:17 compute-1 nova_compute[182713]: 2026-01-22 00:42:17.506 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:18 compute-1 nova_compute[182713]: 2026-01-22 00:42:18.998 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:19.913 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:42:19 compute-1 nova_compute[182713]: 2026-01-22 00:42:19.913 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:19 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:19.914 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:42:20 compute-1 nova_compute[182713]: 2026-01-22 00:42:20.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:20 compute-1 nova_compute[182713]: 2026-01-22 00:42:20.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:42:20 compute-1 nova_compute[182713]: 2026-01-22 00:42:20.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:42:20 compute-1 nova_compute[182713]: 2026-01-22 00:42:20.911 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:21 compute-1 nova_compute[182713]: 2026-01-22 00:42:21.181 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:42:21 compute-1 nova_compute[182713]: 2026-01-22 00:42:21.182 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:42:21 compute-1 nova_compute[182713]: 2026-01-22 00:42:21.182 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:42:21 compute-1 nova_compute[182713]: 2026-01-22 00:42:21.183 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.911 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'name': 'tempest-TestGettingAddress-server-1979133598', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b3', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '837db8748d074b3c9179b47d30e7a1d4', 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'hostId': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.917 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c / tap5d170032-0a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:42:22 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:22.916 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.917 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cbda6b0-92bd-4c6c-bb10-6014d8b809ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b3-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-tap5d170032-0a', 'timestamp': '2026-01-22T00:42:22.913215', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'tap5d170032-0a', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8d:27:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5d170032-0a'}, 'message_id': '3722ffa0-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.620581827, 'message_signature': '5a6be79b1fe3c2d8bd75993390fd9a67a3194d13d12b2224ce05e728e3a66352'}]}, 'timestamp': '2026-01-22 00:42:22.918522', '_unique_id': '2636c0e5caf144b6be3648e41bdddc08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.921 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.962 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.read.requests volume: 1105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.963 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e456bd51-56e4-444c-8353-6e7cceb266d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1105, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-vda', 'timestamp': '2026-01-22T00:42:22.923270', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3729d302-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': 'a78299021e2a992e31505e2a85f2de5606a726c81bbc5b82795f3485a5a8e199'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-sda', 'timestamp': '2026-01-22T00:42:22.923270', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3729e2ac-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': '7ffa31a4ba5ea62e9f13137bd77af1e0699596ac3cb082a470e5561313f03a2c'}]}, 'timestamp': '2026-01-22 00:42:22.963318', '_unique_id': '0f1f13acbc4542e99ac5cac48185a970'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.964 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.965 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.965 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.965 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1979133598>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1979133598>]
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.965 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92e1a7d6-cfdb-48a5-984c-e5ca75744368', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b3-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-tap5d170032-0a', 'timestamp': '2026-01-22T00:42:22.965988', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'tap5d170032-0a', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8d:27:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5d170032-0a'}, 'message_id': '372a57aa-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.620581827, 'message_signature': 'a719cc8045b5a653001f53607a059bc7c67bb68dd5b077b09fe2cecb9d00993a'}]}, 'timestamp': '2026-01-22 00:42:22.966327', '_unique_id': '8088c15548814cd3bd90eb6770de3e50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.966 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.967 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.967 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.write.bytes volume: 72929280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71cc6765-4c0e-4100-b3a7-6132d4df6393', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72929280, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-vda', 'timestamp': '2026-01-22T00:42:22.967779', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '372a9d5a-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': 'eb22a95b0e4c0531ec04a009e255d9fdaf69ce4d74d98876bc5dbc56fb5e590b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-sda', 'timestamp': '2026-01-22T00:42:22.967779', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '372aa778-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': '0784b9e0741ecf0a3f376667ec9455b52094793d6b3a25afc161d00cb71e2343'}]}, 'timestamp': '2026-01-22 00:42:22.968352', '_unique_id': '7ee1afb095ff4369921940e050f72368'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.968 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.969 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.984 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.984 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79ab5692-adce-4be2-90b5-7a365957e508', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-vda', 'timestamp': '2026-01-22T00:42:22.969966', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '372d1eae-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.677194411, 'message_signature': 'ab9044a2a97a622033b9d0978447b46d71e8358e5554198ace760e8a527b1e79'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-sda', 'timestamp': '2026-01-22T00:42:22.969966', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '372d2926-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.677194411, 'message_signature': '9064e7e1870062d87bbd2084c6f0aa8b9ddb97226734603ac403837f2df3d633'}]}, 'timestamp': '2026-01-22 00:42:22.984769', '_unique_id': '08f90d7455794099b1139a0dea8d5eaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.985 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.986 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.986 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.986 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6176249-4fd5-47f7-a770-d0a9df6ce2d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-vda', 'timestamp': '2026-01-22T00:42:22.986512', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '372d77f0-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.677194411, 'message_signature': '71f72e17234a985229dce0ecaef6788423332d2c0c8f9aa7daec1c6e6ac1d3b4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-sda', 'timestamp': '2026-01-22T00:42:22.986512', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '372d829a-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.677194411, 'message_signature': 'c5e2b9bba4966059e727875ccfa8659cc06eef74e33cc543a8522c2a814c8183'}]}, 'timestamp': '2026-01-22 00:42:22.987060', '_unique_id': '273367d5ee1a423587a6b2f3d0fc7e09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.987 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.988 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/network.outgoing.bytes volume: 4048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb873323-f980-4766-97f4-4c25a2c2a04e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4048, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b3-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-tap5d170032-0a', 'timestamp': '2026-01-22T00:42:22.988361', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'tap5d170032-0a', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8d:27:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5d170032-0a'}, 'message_id': '372dbfa8-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.620581827, 'message_signature': 'cc8385f869bd153ed289df9f99ec7592f0e8b3c0eb1369ef6eeecae999f61dc3'}]}, 'timestamp': '2026-01-22 00:42:22.988595', '_unique_id': 'ca938611323c4397a1567bc8af4bab84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.989 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3a7f1ca-0fed-423d-9060-a8d8ce552ebc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-vda', 'timestamp': '2026-01-22T00:42:22.989633', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '372df0ea-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.677194411, 'message_signature': '61b79eeb83fc0e6f06fcca7a8586383f8a97c2966952ad8231f5baf5ceca637a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-sda', 'timestamp': '2026-01-22T00:42:22.989633', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '372df98c-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.677194411, 'message_signature': '5831b4b54265d08786eaf39f4679efc44ae83b31d257e083f01869715fbe2713'}]}, 'timestamp': '2026-01-22 00:42:22.990061', '_unique_id': '46391147ec3649b487110ac621789ad5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.990 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.write.latency volume: 3387908882 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bc38c88-3f1d-43eb-96d6-b685e9e5868b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3387908882, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-vda', 'timestamp': '2026-01-22T00:42:22.991114', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '372e2aec-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': 'afa73418b2eb43a5332979188347e1b02a578fe8ec98339b48e7ac49a2b1fea6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-sda', 'timestamp': '2026-01-22T00:42:22.991114', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '372e328a-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': '04cac3ac45703990fc8dbf1b9cb7fff9ccadf5807ccb8a20f19421a545884402'}]}, 'timestamp': '2026-01-22 00:42:22.991520', '_unique_id': '481ce47d93cf4e5a9c223c193e60b1f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.991 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.992 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.992 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/network.outgoing.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b29baa5-d508-40f8-8179-ec7227eaa84e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b3-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-tap5d170032-0a', 'timestamp': '2026-01-22T00:42:22.992658', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'tap5d170032-0a', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8d:27:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5d170032-0a'}, 'message_id': '372e673c-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.620581827, 'message_signature': '9c28d8a4d30445b67251ec46b2663c9ed47311b152da1215b0959633c4c874e2'}]}, 'timestamp': '2026-01-22 00:42:22.992899', '_unique_id': 'fc82457dd636460aacc246f6fdb2e1fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.993 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.write.requests volume: 307 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26c170a6-9a1a-4c99-87a8-34233631a460', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 307, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-vda', 'timestamp': '2026-01-22T00:42:22.993922', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '372e9874-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': '91ab41d44dbda9bc813d6407f7439bd5b5406c2a3d9671c078e74ace282b4edd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-sda', 'timestamp': '2026-01-22T00:42:22.993922', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '372ea0c6-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': 'e3f6ae2729e471113aefd35d72839c245bccad94851f49d754747e82aef2766d'}]}, 'timestamp': '2026-01-22 00:42:22.994348', '_unique_id': 'a5eef6b1f811442696242331ef1d0cfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.994 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.995 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.995 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1979133598>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1979133598>]
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.995 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edcd9fbd-48f9-4cd2-a052-be520bf2d96b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b3-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-tap5d170032-0a', 'timestamp': '2026-01-22T00:42:22.995783', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'tap5d170032-0a', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8d:27:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5d170032-0a'}, 'message_id': '372ee28e-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.620581827, 'message_signature': 'b50539f6132dd319441944ee407a074febd2244d437e25a3e6157321a13c6457'}]}, 'timestamp': '2026-01-22 00:42:22.996042', '_unique_id': 'd32fca6cbe1648f09acec49662a8dde3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.996 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3809b688-e70f-4fb4-b49a-6dfffac2e4f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b3-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-tap5d170032-0a', 'timestamp': '2026-01-22T00:42:22.997160', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'tap5d170032-0a', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8d:27:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5d170032-0a'}, 'message_id': '372f181c-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.620581827, 'message_signature': '8e7017508b160ac1a1d9661a35dbb1c5702b7cfdd97cb2f93c48db1496d97b18'}]}, 'timestamp': '2026-01-22 00:42:22.997412', '_unique_id': '976bb6939d9e468d8415ff58d2f1b995'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.997 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:22.998 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.023 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/memory.usage volume: 42.88671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b473ea4-6780-4f4f-8ea0-9350f6c73752', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.88671875, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'timestamp': '2026-01-22T00:42:22.998646', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3733137c-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.730149662, 'message_signature': 'd8bba4ce3c943c8b38975a933a5e0db58276724dbeb771503c3e9acfa02a3230'}]}, 'timestamp': '2026-01-22 00:42:23.023565', '_unique_id': '7bc2d504ef49481b990d5a6f3a8db6ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.024 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2fd2a8b-20fc-41d4-abb0-0c4d855e9717', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b3-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-tap5d170032-0a', 'timestamp': '2026-01-22T00:42:23.025204', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'tap5d170032-0a', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8d:27:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5d170032-0a'}, 'message_id': '37335f12-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.620581827, 'message_signature': '0ac248c4be7c55eb31d20a71f5854619698b021f8b0b84fcc37b8f3f123adbf0'}]}, 'timestamp': '2026-01-22 00:42:23.025447', '_unique_id': '6ebf4e7cd68e421bbdd6708fc970918c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.025 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.026 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.026 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.026 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1979133598>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1979133598>]
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.026 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.026 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/network.incoming.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22660d50-b9ca-44f8-aaff-3f2a91f3becb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b3-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-tap5d170032-0a', 'timestamp': '2026-01-22T00:42:23.026944', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'tap5d170032-0a', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8d:27:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5d170032-0a'}, 'message_id': '3733a2ce-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.620581827, 'message_signature': '338d4661821997b6f9cfa272de0e19ef1a1bd92c42b6e872f62db388b929d794'}]}, 'timestamp': '2026-01-22 00:42:23.027180', '_unique_id': 'e0639c2c7aad4ac4900d62823770e7be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.028 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.028 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1979133598>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1979133598>]
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.028 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.read.bytes volume: 30525952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '642e6f95-3d5b-42c5-ba4d-9624c1284c3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30525952, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-vda', 'timestamp': '2026-01-22T00:42:23.028737', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3733eba8-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': '39eac4104d338645644b05ed61a4860a0b4c54ee274011f567a2fce623cf63d7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-sda', 'timestamp': '2026-01-22T00:42:23.028737', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3733f45e-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': '2fc545e0f810e81f1eb503da4a185645c76d4bf19e73fa8c6911e06b5902604e'}]}, 'timestamp': '2026-01-22 00:42:23.029250', '_unique_id': '62024a1337de4cb9af32f7a2b2aecaee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.029 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.030 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.030 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/network.incoming.bytes volume: 4345 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '073eb638-ea13-4fdc-8ee1-dd0aa4eb5a6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4345, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b3-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-tap5d170032-0a', 'timestamp': '2026-01-22T00:42:23.030324', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'tap5d170032-0a', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8d:27:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5d170032-0a'}, 'message_id': '37342690-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.620581827, 'message_signature': 'ba4b8732313a97ebdd1dc4a478e13420e87fb09abda100a3179496f90948857e'}]}, 'timestamp': '2026-01-22 00:42:23.030550', '_unique_id': 'cc0c8f523e694eb492ecb379c404ea7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.read.latency volume: 167845581 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.031 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/disk.device.read.latency volume: 26461056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9eb87e11-564f-4eb0-be8d-aca4f43345ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 167845581, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-vda', 'timestamp': '2026-01-22T00:42:23.031573', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3734573c-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': '0611275dbcc11cfc005c8a7c55f97db0c488553377fcf7da8170eb138f38730b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26461056, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-sda', 'timestamp': '2026-01-22T00:42:23.031573', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '37346010-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.630507895, 'message_signature': '20a016bf4ed449395d26045a017745768b6c294af988f6069763600cb83de633'}]}, 'timestamp': '2026-01-22 00:42:23.032007', '_unique_id': '0a287366191f469aa2c3f559c11cd30f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.032 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96ce5820-1bc2-4908-9817-bcc2c28f787e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b3-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-tap5d170032-0a', 'timestamp': '2026-01-22T00:42:23.033076', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'tap5d170032-0a', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8d:27:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5d170032-0a'}, 'message_id': '373491fc-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.620581827, 'message_signature': '9f51b80476f610bfdfb3547d5a22681d1246ea3706ac008a9704617b4b2b690d'}]}, 'timestamp': '2026-01-22 00:42:23.033298', '_unique_id': 'f1939912be514abcae4524915538b9b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.033 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.034 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.034 12 DEBUG ceilometer.compute.pollsters [-] 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c/cpu volume: 11350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e7194fd-f860-4553-9dc5-82d44cda1464', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11350000000, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'timestamp': '2026-01-22T00:42:23.034506', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979133598', 'name': 'instance-000000b3', 'instance_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3734cb36-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7115.730149662, 'message_signature': 'e53c609a0692ed913537295c54bcc5244cba85fb2f47a559cb0637ebe1c954ef'}]}, 'timestamp': '2026-01-22 00:42:23.034760', '_unique_id': 'f5ad9c023f6045518d05f70f2bbd6f94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:42:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:42:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:42:24 compute-1 nova_compute[182713]: 2026-01-22 00:42:24.001 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:24 compute-1 podman[243847]: 2026-01-22 00:42:24.583094616 +0000 UTC m=+0.073958843 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:42:24 compute-1 podman[243846]: 2026-01-22 00:42:24.622698284 +0000 UTC m=+0.109203725 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 22 00:42:25 compute-1 nova_compute[182713]: 2026-01-22 00:42:25.341 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Updating instance_info_cache with network_info: [{"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:42:25 compute-1 nova_compute[182713]: 2026-01-22 00:42:25.369 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:42:25 compute-1 nova_compute[182713]: 2026-01-22 00:42:25.370 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:42:25 compute-1 nova_compute[182713]: 2026-01-22 00:42:25.913 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:29 compute-1 nova_compute[182713]: 2026-01-22 00:42:29.025 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:30 compute-1 podman[243896]: 2026-01-22 00:42:30.60302308 +0000 UTC m=+0.081733524 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:42:30 compute-1 podman[243897]: 2026-01-22 00:42:30.616442565 +0000 UTC m=+0.089954718 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:42:30 compute-1 nova_compute[182713]: 2026-01-22 00:42:30.916 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:34 compute-1 nova_compute[182713]: 2026-01-22 00:42:34.028 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:35 compute-1 nova_compute[182713]: 2026-01-22 00:42:35.918 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:39 compute-1 nova_compute[182713]: 2026-01-22 00:42:39.031 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:40 compute-1 nova_compute[182713]: 2026-01-22 00:42:40.920 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:43 compute-1 podman[243939]: 2026-01-22 00:42:43.575381773 +0000 UTC m=+0.068336678 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 00:42:44 compute-1 nova_compute[182713]: 2026-01-22 00:42:44.033 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:45 compute-1 podman[243961]: 2026-01-22 00:42:45.560094522 +0000 UTC m=+0.055501881 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Jan 22 00:42:45 compute-1 nova_compute[182713]: 2026-01-22 00:42:45.924 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:49 compute-1 nova_compute[182713]: 2026-01-22 00:42:49.035 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:50 compute-1 nova_compute[182713]: 2026-01-22 00:42:50.926 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:54 compute-1 nova_compute[182713]: 2026-01-22 00:42:54.037 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:55 compute-1 podman[243993]: 2026-01-22 00:42:55.572377535 +0000 UTC m=+0.067383900 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:42:55 compute-1 podman[243992]: 2026-01-22 00:42:55.656716168 +0000 UTC m=+0.148463001 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 22 00:42:55 compute-1 nova_compute[182713]: 2026-01-22 00:42:55.928 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:57.946 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:42:57 compute-1 nova_compute[182713]: 2026-01-22 00:42:57.947 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:57 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:57.948 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.336 182717 DEBUG oslo_concurrency.lockutils [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.336 182717 DEBUG oslo_concurrency.lockutils [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.337 182717 DEBUG oslo_concurrency.lockutils [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.338 182717 DEBUG oslo_concurrency.lockutils [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.338 182717 DEBUG oslo_concurrency.lockutils [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.358 182717 INFO nova.compute.manager [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Terminating instance
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.373 182717 DEBUG nova.compute.manager [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:42:58 compute-1 kernel: tap5d170032-0a (unregistering): left promiscuous mode
Jan 22 00:42:58 compute-1 NetworkManager[54952]: <info>  [1769042578.3948] device (tap5d170032-0a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.440 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:58 compute-1 ovn_controller[94841]: 2026-01-22T00:42:58Z|00756|binding|INFO|Releasing lport 5d170032-0a8d-4284-b16f-f41c4caa4d83 from this chassis (sb_readonly=0)
Jan 22 00:42:58 compute-1 ovn_controller[94841]: 2026-01-22T00:42:58Z|00757|binding|INFO|Setting lport 5d170032-0a8d-4284-b16f-f41c4caa4d83 down in Southbound
Jan 22 00:42:58 compute-1 ovn_controller[94841]: 2026-01-22T00:42:58Z|00758|binding|INFO|Removing iface tap5d170032-0a ovn-installed in OVS
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.444 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.453 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:27:b6 10.100.0.13 2001:db8:0:1:f816:3eff:fe8d:27b6 2001:db8::f816:3eff:fe8d:27b6'], port_security=['fa:16:3e:8d:27:b6 10.100.0.13 2001:db8:0:1:f816:3eff:fe8d:27b6 2001:db8::f816:3eff:fe8d:27b6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8:0:1:f816:3eff:fe8d:27b6/64 2001:db8::f816:3eff:fe8d:27b6/64', 'neutron:device_id': '32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ee79161b-ebd1-43ab-81bd-31efca053e81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b420faa5-5ae8-471e-9b88-5f792c3ff519, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=5d170032-0a8d-4284-b16f-f41c4caa4d83) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.455 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 5d170032-0a8d-4284-b16f-f41c4caa4d83 in datapath 0fbc923c-90ec-4c3d-92df-bc42843601b3 unbound from our chassis
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.458 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0fbc923c-90ec-4c3d-92df-bc42843601b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.460 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bc65e3f3-4448-4f3b-9610-ca759bd96710]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.461 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3 namespace which is not needed anymore
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.467 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:58 compute-1 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000b3.scope: Deactivated successfully.
Jan 22 00:42:58 compute-1 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000b3.scope: Consumed 14.893s CPU time.
Jan 22 00:42:58 compute-1 systemd-machined[153970]: Machine qemu-78-instance-000000b3 terminated.
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.616 182717 DEBUG nova.compute.manager [req-6b817550-4944-478e-ac7d-ad6e0aed6005 req-ea39fdcd-53d6-493f-b994-e87e020647e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Received event network-changed-5d170032-0a8d-4284-b16f-f41c4caa4d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.619 182717 DEBUG nova.compute.manager [req-6b817550-4944-478e-ac7d-ad6e0aed6005 req-ea39fdcd-53d6-493f-b994-e87e020647e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Refreshing instance network info cache due to event network-changed-5d170032-0a8d-4284-b16f-f41c4caa4d83. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.619 182717 DEBUG oslo_concurrency.lockutils [req-6b817550-4944-478e-ac7d-ad6e0aed6005 req-ea39fdcd-53d6-493f-b994-e87e020647e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.620 182717 DEBUG oslo_concurrency.lockutils [req-6b817550-4944-478e-ac7d-ad6e0aed6005 req-ea39fdcd-53d6-493f-b994-e87e020647e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.620 182717 DEBUG nova.network.neutron [req-6b817550-4944-478e-ac7d-ad6e0aed6005 req-ea39fdcd-53d6-493f-b994-e87e020647e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Refreshing network info cache for port 5d170032-0a8d-4284-b16f-f41c4caa4d83 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:42:58 compute-1 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243727]: [NOTICE]   (243731) : haproxy version is 2.8.14-c23fe91
Jan 22 00:42:58 compute-1 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243727]: [NOTICE]   (243731) : path to executable is /usr/sbin/haproxy
Jan 22 00:42:58 compute-1 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243727]: [WARNING]  (243731) : Exiting Master process...
Jan 22 00:42:58 compute-1 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243727]: [ALERT]    (243731) : Current worker (243733) exited with code 143 (Terminated)
Jan 22 00:42:58 compute-1 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243727]: [WARNING]  (243731) : All workers exited. Exiting... (0)
Jan 22 00:42:58 compute-1 systemd[1]: libpod-ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399.scope: Deactivated successfully.
Jan 22 00:42:58 compute-1 podman[244068]: 2026-01-22 00:42:58.656837549 +0000 UTC m=+0.070105133 container died ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.678 182717 INFO nova.virt.libvirt.driver [-] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Instance destroyed successfully.
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.679 182717 DEBUG nova.objects.instance [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.702 182717 DEBUG nova.virt.libvirt.vif [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:41:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1979133598',display_name='tempest-TestGettingAddress-server-1979133598',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1979133598',id=179,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfcFV8SVrYtqhuEJR2u0WnZRZ3aIYjdzcrjfLDmTvbNVw+iNWtleLlqtVUIYQyWXU5cujTqDWjA511UjJA6kRMyxPcbENHgTLoJy3T95U8C9/oslNz/OBwLaWEuXg2SRA==',key_name='tempest-TestGettingAddress-313470075',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:41:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-xjlk1541',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:41:55Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.703 182717 DEBUG nova.network.os_vif_util [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.704 182717 DEBUG nova.network.os_vif_util [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8d:27:b6,bridge_name='br-int',has_traffic_filtering=True,id=5d170032-0a8d-4284-b16f-f41c4caa4d83,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d170032-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.705 182717 DEBUG os_vif [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8d:27:b6,bridge_name='br-int',has_traffic_filtering=True,id=5d170032-0a8d-4284-b16f-f41c4caa4d83,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d170032-0a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:42:58 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399-userdata-shm.mount: Deactivated successfully.
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.708 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.708 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d170032-0a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:42:58 compute-1 systemd[1]: var-lib-containers-storage-overlay-d7bbd94e3366a47f2fae544067e79c010eaaa5aa951d3cfa17689d54729ccf86-merged.mount: Deactivated successfully.
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.711 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.714 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:42:58 compute-1 podman[244068]: 2026-01-22 00:42:58.71691053 +0000 UTC m=+0.130178094 container cleanup ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.719 182717 INFO os_vif [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8d:27:b6,bridge_name='br-int',has_traffic_filtering=True,id=5d170032-0a8d-4284-b16f-f41c4caa4d83,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d170032-0a')
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.719 182717 INFO nova.virt.libvirt.driver [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Deleting instance files /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c_del
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.720 182717 INFO nova.virt.libvirt.driver [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Deletion of /var/lib/nova/instances/32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c_del complete
Jan 22 00:42:58 compute-1 systemd[1]: libpod-conmon-ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399.scope: Deactivated successfully.
Jan 22 00:42:58 compute-1 podman[244115]: 2026-01-22 00:42:58.78077348 +0000 UTC m=+0.041173097 container remove ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.786 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b98b5198-d40a-447c-8b3f-5412898aa6a1]: (4, ('Thu Jan 22 12:42:58 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3 (ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399)\nee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399\nThu Jan 22 12:42:58 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3 (ee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399)\nee7208bc637debbfd53d6908469f55201e7f400f04c232b3e85a39c4c6984399\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.789 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c04dd020-0bb2-47f7-aaf8-4b1ac20cf0b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.790 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0fbc923c-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.792 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:58 compute-1 kernel: tap0fbc923c-90: left promiscuous mode
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.794 182717 INFO nova.compute.manager [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.795 182717 DEBUG oslo.service.loopingcall [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.796 182717 DEBUG nova.compute.manager [-] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.796 182717 DEBUG nova.network.neutron [-] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.811 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:58 compute-1 nova_compute[182713]: 2026-01-22 00:42:58.813 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.817 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c47353bc-e002-4dcf-bee8-2648b01d4888]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.835 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[b7dd12c8-476a-423f-acc9-f3e751342755]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.837 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe69839-98e2-49fc-b782-257b61537c1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.857 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc1dbf9-20a5-418a-980e-6367767483e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 708744, 'reachable_time': 25319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244131, 'error': None, 'target': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.862 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:42:58 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:58.862 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[aa5764bd-f8a0-4770-8e91-efc0ccf12e96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:42:58 compute-1 systemd[1]: run-netns-ovnmeta\x2d0fbc923c\x2d90ec\x2d4c3d\x2d92df\x2dbc42843601b3.mount: Deactivated successfully.
Jan 22 00:42:59 compute-1 nova_compute[182713]: 2026-01-22 00:42:59.745 182717 DEBUG nova.network.neutron [-] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:42:59 compute-1 nova_compute[182713]: 2026-01-22 00:42:59.766 182717 INFO nova.compute.manager [-] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Took 0.97 seconds to deallocate network for instance.
Jan 22 00:42:59 compute-1 nova_compute[182713]: 2026-01-22 00:42:59.837 182717 DEBUG nova.compute.manager [req-f40e6ff7-5da8-41fb-8bb2-fd32cf362b5e req-b99c567f-0c62-4823-b9e1-9a8694fbbb80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Received event network-vif-deleted-5d170032-0a8d-4284-b16f-f41c4caa4d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:42:59 compute-1 nova_compute[182713]: 2026-01-22 00:42:59.854 182717 DEBUG oslo_concurrency.lockutils [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:42:59 compute-1 nova_compute[182713]: 2026-01-22 00:42:59.855 182717 DEBUG oslo_concurrency.lockutils [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:42:59 compute-1 nova_compute[182713]: 2026-01-22 00:42:59.925 182717 DEBUG nova.compute.provider_tree [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:42:59 compute-1 nova_compute[182713]: 2026-01-22 00:42:59.949 182717 DEBUG nova.scheduler.client.report [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:42:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:42:59.950 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:42:59 compute-1 nova_compute[182713]: 2026-01-22 00:42:59.977 182717 DEBUG oslo_concurrency.lockutils [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.002 182717 INFO nova.scheduler.client.report [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.101 182717 DEBUG oslo_concurrency.lockutils [None req-778b2dd6-2cc3-4f89-a36c-bff78a5f9b12 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.792 182717 DEBUG nova.compute.manager [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Received event network-vif-unplugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.793 182717 DEBUG oslo_concurrency.lockutils [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.793 182717 DEBUG oslo_concurrency.lockutils [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.793 182717 DEBUG oslo_concurrency.lockutils [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.793 182717 DEBUG nova.compute.manager [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] No waiting events found dispatching network-vif-unplugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.794 182717 WARNING nova.compute.manager [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Received unexpected event network-vif-unplugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 for instance with vm_state deleted and task_state None.
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.794 182717 DEBUG nova.compute.manager [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Received event network-vif-plugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.794 182717 DEBUG oslo_concurrency.lockutils [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.794 182717 DEBUG oslo_concurrency.lockutils [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.794 182717 DEBUG oslo_concurrency.lockutils [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.795 182717 DEBUG nova.compute.manager [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] No waiting events found dispatching network-vif-plugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.795 182717 WARNING nova.compute.manager [req-13063571-2e7b-41d0-89ae-5dc6b2a0743f req-d201d917-5e08-47ca-b06a-879ac8d3335d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Received unexpected event network-vif-plugged-5d170032-0a8d-4284-b16f-f41c4caa4d83 for instance with vm_state deleted and task_state None.
Jan 22 00:43:00 compute-1 nova_compute[182713]: 2026-01-22 00:43:00.930 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:01 compute-1 nova_compute[182713]: 2026-01-22 00:43:01.450 182717 DEBUG nova.network.neutron [req-6b817550-4944-478e-ac7d-ad6e0aed6005 req-ea39fdcd-53d6-493f-b994-e87e020647e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Updated VIF entry in instance network info cache for port 5d170032-0a8d-4284-b16f-f41c4caa4d83. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:43:01 compute-1 nova_compute[182713]: 2026-01-22 00:43:01.451 182717 DEBUG nova.network.neutron [req-6b817550-4944-478e-ac7d-ad6e0aed6005 req-ea39fdcd-53d6-493f-b994-e87e020647e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Updating instance_info_cache with network_info: [{"id": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "address": "fa:16:3e:8d:27:b6", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8d:27b6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d170032-0a", "ovs_interfaceid": "5d170032-0a8d-4284-b16f-f41c4caa4d83", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:43:01 compute-1 nova_compute[182713]: 2026-01-22 00:43:01.473 182717 DEBUG oslo_concurrency.lockutils [req-6b817550-4944-478e-ac7d-ad6e0aed6005 req-ea39fdcd-53d6-493f-b994-e87e020647e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:43:01 compute-1 podman[244132]: 2026-01-22 00:43:01.568637815 +0000 UTC m=+0.063726666 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:43:01 compute-1 podman[244133]: 2026-01-22 00:43:01.622963217 +0000 UTC m=+0.106370037 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:43:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:03.059 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:43:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:03.060 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:43:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:03.060 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:43:03 compute-1 nova_compute[182713]: 2026-01-22 00:43:03.712 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:05 compute-1 nova_compute[182713]: 2026-01-22 00:43:05.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:05 compute-1 nova_compute[182713]: 2026-01-22 00:43:05.932 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:07 compute-1 nova_compute[182713]: 2026-01-22 00:43:07.037 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:07 compute-1 nova_compute[182713]: 2026-01-22 00:43:07.103 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:08 compute-1 nova_compute[182713]: 2026-01-22 00:43:08.715 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:09 compute-1 nova_compute[182713]: 2026-01-22 00:43:09.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:10 compute-1 nova_compute[182713]: 2026-01-22 00:43:10.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:10 compute-1 nova_compute[182713]: 2026-01-22 00:43:10.934 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:11 compute-1 nova_compute[182713]: 2026-01-22 00:43:11.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:11 compute-1 nova_compute[182713]: 2026-01-22 00:43:11.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:43:13 compute-1 nova_compute[182713]: 2026-01-22 00:43:13.660 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042578.6577654, 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:43:13 compute-1 nova_compute[182713]: 2026-01-22 00:43:13.661 182717 INFO nova.compute.manager [-] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] VM Stopped (Lifecycle Event)
Jan 22 00:43:13 compute-1 nova_compute[182713]: 2026-01-22 00:43:13.698 182717 DEBUG nova.compute.manager [None req-fd28cd22-6a8a-4ec6-94b7-5d259a574cb6 - - - - - -] [instance: 32c4eb4c-415b-40c9-9fa1-9a5f4ee3dc4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:43:13 compute-1 nova_compute[182713]: 2026-01-22 00:43:13.717 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:14 compute-1 podman[244176]: 2026-01-22 00:43:14.581456712 +0000 UTC m=+0.071480297 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Jan 22 00:43:14 compute-1 nova_compute[182713]: 2026-01-22 00:43:14.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:14 compute-1 nova_compute[182713]: 2026-01-22 00:43:14.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:43:14 compute-1 nova_compute[182713]: 2026-01-22 00:43:14.873 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:43:15 compute-1 nova_compute[182713]: 2026-01-22 00:43:15.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:15 compute-1 nova_compute[182713]: 2026-01-22 00:43:15.936 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:16 compute-1 podman[244196]: 2026-01-22 00:43:16.58778643 +0000 UTC m=+0.071300141 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Jan 22 00:43:16 compute-1 nova_compute[182713]: 2026-01-22 00:43:16.868 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:16 compute-1 nova_compute[182713]: 2026-01-22 00:43:16.898 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:43:16 compute-1 nova_compute[182713]: 2026-01-22 00:43:16.898 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:43:16 compute-1 nova_compute[182713]: 2026-01-22 00:43:16.898 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:43:16 compute-1 nova_compute[182713]: 2026-01-22 00:43:16.898 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:43:17 compute-1 nova_compute[182713]: 2026-01-22 00:43:17.095 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:43:17 compute-1 nova_compute[182713]: 2026-01-22 00:43:17.096 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5688MB free_disk=73.17750549316406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:43:17 compute-1 nova_compute[182713]: 2026-01-22 00:43:17.096 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:43:17 compute-1 nova_compute[182713]: 2026-01-22 00:43:17.096 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:43:17 compute-1 nova_compute[182713]: 2026-01-22 00:43:17.153 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:43:17 compute-1 nova_compute[182713]: 2026-01-22 00:43:17.154 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:43:17 compute-1 nova_compute[182713]: 2026-01-22 00:43:17.176 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:43:17 compute-1 nova_compute[182713]: 2026-01-22 00:43:17.193 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:43:17 compute-1 nova_compute[182713]: 2026-01-22 00:43:17.215 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:43:17 compute-1 nova_compute[182713]: 2026-01-22 00:43:17.216 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:43:18 compute-1 nova_compute[182713]: 2026-01-22 00:43:18.204 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:18 compute-1 nova_compute[182713]: 2026-01-22 00:43:18.205 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:18 compute-1 nova_compute[182713]: 2026-01-22 00:43:18.719 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:18 compute-1 nova_compute[182713]: 2026-01-22 00:43:18.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:20 compute-1 nova_compute[182713]: 2026-01-22 00:43:20.939 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:21 compute-1 nova_compute[182713]: 2026-01-22 00:43:21.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:21 compute-1 nova_compute[182713]: 2026-01-22 00:43:21.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:43:21 compute-1 nova_compute[182713]: 2026-01-22 00:43:21.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:43:21 compute-1 nova_compute[182713]: 2026-01-22 00:43:21.892 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:43:21 compute-1 nova_compute[182713]: 2026-01-22 00:43:21.892 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:21 compute-1 nova_compute[182713]: 2026-01-22 00:43:21.893 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:43:23 compute-1 nova_compute[182713]: 2026-01-22 00:43:23.722 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:24 compute-1 sshd-session[244220]: Invalid user sol from 92.118.39.95 port 43128
Jan 22 00:43:25 compute-1 sshd-session[244220]: Connection closed by invalid user sol 92.118.39.95 port 43128 [preauth]
Jan 22 00:43:25 compute-1 nova_compute[182713]: 2026-01-22 00:43:25.941 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:26.450 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2 2001:db8::f816:3eff:fee0:9fdc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee0:9fdc/64', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad392942-0b6b-462d-a3a5-d979f385a143, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e429e99d-d544-4554-bbe2-f8538fbd55b8) old=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:43:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:26.452 104184 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e429e99d-d544-4554-bbe2-f8538fbd55b8 in datapath bc173f9b-a39e-490e-b1d4-92abd1855016 updated
Jan 22 00:43:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:26.455 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc173f9b-a39e-490e-b1d4-92abd1855016, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:43:26 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:26.458 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9b7f08f3-c474-4f6d-a1ec-a6e0152dbaf0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:43:26 compute-1 podman[244223]: 2026-01-22 00:43:26.620100213 +0000 UTC m=+0.104962013 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:43:26 compute-1 podman[244222]: 2026-01-22 00:43:26.681197197 +0000 UTC m=+0.163136976 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:43:28 compute-1 nova_compute[182713]: 2026-01-22 00:43:28.724 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:30.058 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2 2001:db8:0:1:f816:3eff:fee0:9fdc 2001:db8::f816:3eff:fee0:9fdc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fee0:9fdc/64 2001:db8::f816:3eff:fee0:9fdc/64', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad392942-0b6b-462d-a3a5-d979f385a143, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e429e99d-d544-4554-bbe2-f8538fbd55b8) old=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2 2001:db8::f816:3eff:fee0:9fdc'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee0:9fdc/64', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:43:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:30.060 104184 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e429e99d-d544-4554-bbe2-f8538fbd55b8 in datapath bc173f9b-a39e-490e-b1d4-92abd1855016 updated
Jan 22 00:43:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:30.062 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc173f9b-a39e-490e-b1d4-92abd1855016, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:43:30 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:30.063 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a463b494-4c2e-4830-b518-8d41e7557f9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:43:30 compute-1 nova_compute[182713]: 2026-01-22 00:43:30.943 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:32 compute-1 podman[244271]: 2026-01-22 00:43:32.569086347 +0000 UTC m=+0.057602405 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:43:32 compute-1 podman[244270]: 2026-01-22 00:43:32.56915537 +0000 UTC m=+0.060552678 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:43:33 compute-1 nova_compute[182713]: 2026-01-22 00:43:33.727 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:35 compute-1 nova_compute[182713]: 2026-01-22 00:43:35.988 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:38.091 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:43:38 compute-1 nova_compute[182713]: 2026-01-22 00:43:38.091 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:38 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:38.093 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:43:38 compute-1 nova_compute[182713]: 2026-01-22 00:43:38.730 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:40 compute-1 nova_compute[182713]: 2026-01-22 00:43:40.990 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:42 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:43:42.096 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:43:43 compute-1 nova_compute[182713]: 2026-01-22 00:43:43.732 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:45 compute-1 podman[244311]: 2026-01-22 00:43:45.569213064 +0000 UTC m=+0.065961356 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 00:43:45 compute-1 nova_compute[182713]: 2026-01-22 00:43:45.992 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:47 compute-1 podman[244331]: 2026-01-22 00:43:47.571026352 +0000 UTC m=+0.063928812 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:43:48 compute-1 nova_compute[182713]: 2026-01-22 00:43:48.735 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:51 compute-1 nova_compute[182713]: 2026-01-22 00:43:51.022 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:53 compute-1 ovn_controller[94841]: 2026-01-22T00:43:53Z|00759|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 22 00:43:53 compute-1 nova_compute[182713]: 2026-01-22 00:43:53.736 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:56 compute-1 nova_compute[182713]: 2026-01-22 00:43:56.025 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:57 compute-1 podman[244355]: 2026-01-22 00:43:57.566014938 +0000 UTC m=+0.057782572 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:43:57 compute-1 podman[244354]: 2026-01-22 00:43:57.589285438 +0000 UTC m=+0.085843530 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:43:58 compute-1 nova_compute[182713]: 2026-01-22 00:43:58.738 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:01 compute-1 nova_compute[182713]: 2026-01-22 00:44:01.027 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:03.061 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:03.061 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:03.062 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:03 compute-1 podman[244401]: 2026-01-22 00:44:03.55741737 +0000 UTC m=+0.058250027 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 22 00:44:03 compute-1 podman[244402]: 2026-01-22 00:44:03.573187918 +0000 UTC m=+0.066584434 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:44:03 compute-1 nova_compute[182713]: 2026-01-22 00:44:03.740 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:05 compute-1 nova_compute[182713]: 2026-01-22 00:44:05.870 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.030 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.082 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.083 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.114 182717 DEBUG nova.compute.manager [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.237 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.238 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.249 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.249 182717 INFO nova.compute.claims [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.425 182717 DEBUG nova.compute.provider_tree [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.441 182717 DEBUG nova.scheduler.client.report [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.471 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.473 182717 DEBUG nova.compute.manager [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.538 182717 DEBUG nova.compute.manager [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.539 182717 DEBUG nova.network.neutron [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.558 182717 INFO nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.575 182717 DEBUG nova.compute.manager [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.740 182717 DEBUG nova.compute.manager [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.742 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.743 182717 INFO nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Creating image(s)
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.744 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.745 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.746 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.775 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.843 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.845 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.846 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.873 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.973 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:44:06 compute-1 nova_compute[182713]: 2026-01-22 00:44:06.974 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.007 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.008 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.009 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.069 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.071 182717 DEBUG nova.virt.disk.api [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.072 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.128 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.129 182717 DEBUG nova.virt.disk.api [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.129 182717 DEBUG nova.objects.instance [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.338 182717 DEBUG nova.policy [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.445 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.446 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Ensure instance console log exists: /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.446 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.447 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:07 compute-1 nova_compute[182713]: 2026-01-22 00:44:07.447 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:08 compute-1 nova_compute[182713]: 2026-01-22 00:44:08.742 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:09 compute-1 nova_compute[182713]: 2026-01-22 00:44:09.263 182717 DEBUG nova.network.neutron [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Successfully created port: 4ab490b8-61a1-4300-b85c-537002247bfe _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:44:09 compute-1 nova_compute[182713]: 2026-01-22 00:44:09.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:10 compute-1 nova_compute[182713]: 2026-01-22 00:44:10.850 182717 DEBUG nova.network.neutron [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Successfully updated port: 4ab490b8-61a1-4300-b85c-537002247bfe _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:44:10 compute-1 nova_compute[182713]: 2026-01-22 00:44:10.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:10 compute-1 nova_compute[182713]: 2026-01-22 00:44:10.876 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:44:10 compute-1 nova_compute[182713]: 2026-01-22 00:44:10.876 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:44:10 compute-1 nova_compute[182713]: 2026-01-22 00:44:10.876 182717 DEBUG nova.network.neutron [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:44:11 compute-1 nova_compute[182713]: 2026-01-22 00:44:11.002 182717 DEBUG nova.compute.manager [req-abd921aa-60ef-4ccd-bb62-6911406fa901 req-eccb4bf7-bbaa-41fb-bb29-516b49698abb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Received event network-changed-4ab490b8-61a1-4300-b85c-537002247bfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:44:11 compute-1 nova_compute[182713]: 2026-01-22 00:44:11.002 182717 DEBUG nova.compute.manager [req-abd921aa-60ef-4ccd-bb62-6911406fa901 req-eccb4bf7-bbaa-41fb-bb29-516b49698abb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Refreshing instance network info cache due to event network-changed-4ab490b8-61a1-4300-b85c-537002247bfe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:44:11 compute-1 nova_compute[182713]: 2026-01-22 00:44:11.002 182717 DEBUG oslo_concurrency.lockutils [req-abd921aa-60ef-4ccd-bb62-6911406fa901 req-eccb4bf7-bbaa-41fb-bb29-516b49698abb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:44:11 compute-1 nova_compute[182713]: 2026-01-22 00:44:11.031 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:11 compute-1 nova_compute[182713]: 2026-01-22 00:44:11.302 182717 DEBUG nova.network.neutron [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.790 182717 DEBUG nova.network.neutron [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Updating instance_info_cache with network_info: [{"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.819 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.819 182717 DEBUG nova.compute.manager [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Instance network_info: |[{"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.820 182717 DEBUG oslo_concurrency.lockutils [req-abd921aa-60ef-4ccd-bb62-6911406fa901 req-eccb4bf7-bbaa-41fb-bb29-516b49698abb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.821 182717 DEBUG nova.network.neutron [req-abd921aa-60ef-4ccd-bb62-6911406fa901 req-eccb4bf7-bbaa-41fb-bb29-516b49698abb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Refreshing network info cache for port 4ab490b8-61a1-4300-b85c-537002247bfe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.828 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Start _get_guest_xml network_info=[{"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.837 182717 WARNING nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.843 182717 DEBUG nova.virt.libvirt.host [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.844 182717 DEBUG nova.virt.libvirt.host [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.857 182717 DEBUG nova.virt.libvirt.host [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.858 182717 DEBUG nova.virt.libvirt.host [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.860 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.861 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.862 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.862 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.863 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.863 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.864 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.865 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.865 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.866 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.866 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.866 182717 DEBUG nova.virt.hardware [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.876 182717 DEBUG nova.virt.libvirt.vif [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:44:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1450704049',display_name='tempest-TestGettingAddress-server-1450704049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1450704049',id=182,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzS1Jx/APWM5vcQRL+j1JWQJn1AI5LoKxMBW97Fa2XcQxLO8wlk0d2rFNEjm5ruItcAVjf35MpAgTKTp3E/600O3yHmKIiUXb2hz3moFXrY6FueGiSaiI56sxuqhWZG5g==',key_name='tempest-TestGettingAddress-79214675',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-0t6fmbpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:44:06Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.877 182717 DEBUG nova.network.os_vif_util [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.880 182717 DEBUG nova.network.os_vif_util [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:a2:c1,bridge_name='br-int',has_traffic_filtering=True,id=4ab490b8-61a1-4300-b85c-537002247bfe,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ab490b8-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.882 182717 DEBUG nova.objects.instance [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.899 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:44:12 compute-1 nova_compute[182713]:   <uuid>8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5</uuid>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   <name>instance-000000b6</name>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <nova:name>tempest-TestGettingAddress-server-1450704049</nova:name>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:44:12</nova:creationTime>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:44:12 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:44:12 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:44:12 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:44:12 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:44:12 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:44:12 compute-1 nova_compute[182713]:         <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 22 00:44:12 compute-1 nova_compute[182713]:         <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:44:12 compute-1 nova_compute[182713]:         <nova:port uuid="4ab490b8-61a1-4300-b85c-537002247bfe">
Jan 22 00:44:12 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe20:a2c1" ipVersion="6"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe20:a2c1" ipVersion="6"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <system>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <entry name="serial">8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5</entry>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <entry name="uuid">8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5</entry>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     </system>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   <os>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   </os>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   <features>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   </features>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.config"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:20:a2:c1"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <target dev="tap4ab490b8-61"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/console.log" append="off"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <video>
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     </video>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:44:12 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:44:12 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:44:12 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:44:12 compute-1 nova_compute[182713]: </domain>
Jan 22 00:44:12 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.900 182717 DEBUG nova.compute.manager [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Preparing to wait for external event network-vif-plugged-4ab490b8-61a1-4300-b85c-537002247bfe prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.901 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.901 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.901 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.903 182717 DEBUG nova.virt.libvirt.vif [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:44:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1450704049',display_name='tempest-TestGettingAddress-server-1450704049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1450704049',id=182,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzS1Jx/APWM5vcQRL+j1JWQJn1AI5LoKxMBW97Fa2XcQxLO8wlk0d2rFNEjm5ruItcAVjf35MpAgTKTp3E/600O3yHmKIiUXb2hz3moFXrY6FueGiSaiI56sxuqhWZG5g==',key_name='tempest-TestGettingAddress-79214675',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-0t6fmbpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:44:06Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.903 182717 DEBUG nova.network.os_vif_util [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.904 182717 DEBUG nova.network.os_vif_util [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:a2:c1,bridge_name='br-int',has_traffic_filtering=True,id=4ab490b8-61a1-4300-b85c-537002247bfe,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ab490b8-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.905 182717 DEBUG os_vif [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:a2:c1,bridge_name='br-int',has_traffic_filtering=True,id=4ab490b8-61a1-4300-b85c-537002247bfe,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ab490b8-61') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.905 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.906 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.907 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.916 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.916 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ab490b8-61, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.917 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4ab490b8-61, col_values=(('external_ids', {'iface-id': '4ab490b8-61a1-4300-b85c-537002247bfe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:a2:c1', 'vm-uuid': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.920 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.924 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:44:12 compute-1 NetworkManager[54952]: <info>  [1769042652.9249] manager: (tap4ab490b8-61): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.928 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.929 182717 INFO os_vif [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:a2:c1,bridge_name='br-int',has_traffic_filtering=True,id=4ab490b8-61a1-4300-b85c-537002247bfe,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ab490b8-61')
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.988 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.989 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.990 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:20:a2:c1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:44:12 compute-1 nova_compute[182713]: 2026-01-22 00:44:12.991 182717 INFO nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Using config drive
Jan 22 00:44:13 compute-1 nova_compute[182713]: 2026-01-22 00:44:13.417 182717 INFO nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Creating config drive at /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.config
Jan 22 00:44:13 compute-1 nova_compute[182713]: 2026-01-22 00:44:13.424 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2x_vaqk3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:44:13 compute-1 nova_compute[182713]: 2026-01-22 00:44:13.555 182717 DEBUG oslo_concurrency.processutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2x_vaqk3" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:44:13 compute-1 kernel: tap4ab490b8-61: entered promiscuous mode
Jan 22 00:44:13 compute-1 NetworkManager[54952]: <info>  [1769042653.6398] manager: (tap4ab490b8-61): new Tun device (/org/freedesktop/NetworkManager/Devices/368)
Jan 22 00:44:13 compute-1 ovn_controller[94841]: 2026-01-22T00:44:13Z|00760|binding|INFO|Claiming lport 4ab490b8-61a1-4300-b85c-537002247bfe for this chassis.
Jan 22 00:44:13 compute-1 ovn_controller[94841]: 2026-01-22T00:44:13Z|00761|binding|INFO|4ab490b8-61a1-4300-b85c-537002247bfe: Claiming fa:16:3e:20:a2:c1 10.100.0.5 2001:db8:0:1:f816:3eff:fe20:a2c1 2001:db8::f816:3eff:fe20:a2c1
Jan 22 00:44:13 compute-1 nova_compute[182713]: 2026-01-22 00:44:13.640 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:13 compute-1 NetworkManager[54952]: <info>  [1769042653.6551] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/369)
Jan 22 00:44:13 compute-1 NetworkManager[54952]: <info>  [1769042653.6556] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/370)
Jan 22 00:44:13 compute-1 nova_compute[182713]: 2026-01-22 00:44:13.654 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.659 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:a2:c1 10.100.0.5 2001:db8:0:1:f816:3eff:fe20:a2c1 2001:db8::f816:3eff:fe20:a2c1'], port_security=['fa:16:3e:20:a2:c1 10.100.0.5 2001:db8:0:1:f816:3eff:fe20:a2c1 2001:db8::f816:3eff:fe20:a2c1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8:0:1:f816:3eff:fe20:a2c1/64 2001:db8::f816:3eff:fe20:a2c1/64', 'neutron:device_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '46ea8a3f-4945-4bb2-97cf-c1bd6e8fe825', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad392942-0b6b-462d-a3a5-d979f385a143, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=4ab490b8-61a1-4300-b85c-537002247bfe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.661 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 4ab490b8-61a1-4300-b85c-537002247bfe in datapath bc173f9b-a39e-490e-b1d4-92abd1855016 bound to our chassis
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.664 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bc173f9b-a39e-490e-b1d4-92abd1855016
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.686 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc24eab-a3de-44b6-af3a-c2976d62a075]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.687 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbc173f9b-a1 in ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.690 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbc173f9b-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.690 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bf534d6e-3765-40bd-a397-4980bfb6a28c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.691 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a5f3c928-6aec-4d1a-9464-b11b5e4e5a50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 systemd-udevd[244479]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:44:13 compute-1 systemd-machined[153970]: New machine qemu-79-instance-000000b6.
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.707 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9d38c2-1144-4b2b-ac74-2d8e2a678650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 NetworkManager[54952]: <info>  [1769042653.7172] device (tap4ab490b8-61): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:44:13 compute-1 NetworkManager[54952]: <info>  [1769042653.7183] device (tap4ab490b8-61): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:44:13 compute-1 systemd[1]: Started Virtual Machine qemu-79-instance-000000b6.
Jan 22 00:44:13 compute-1 nova_compute[182713]: 2026-01-22 00:44:13.720 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:13 compute-1 nova_compute[182713]: 2026-01-22 00:44:13.733 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.734 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a22b3953-4308-417b-80f1-9cf45489633c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 ovn_controller[94841]: 2026-01-22T00:44:13Z|00762|binding|INFO|Setting lport 4ab490b8-61a1-4300-b85c-537002247bfe ovn-installed in OVS
Jan 22 00:44:13 compute-1 ovn_controller[94841]: 2026-01-22T00:44:13Z|00763|binding|INFO|Setting lport 4ab490b8-61a1-4300-b85c-537002247bfe up in Southbound
Jan 22 00:44:13 compute-1 nova_compute[182713]: 2026-01-22 00:44:13.742 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.766 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c97a1434-25ed-4670-a4bc-1988ef80454c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 NetworkManager[54952]: <info>  [1769042653.7733] manager: (tapbc173f9b-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/371)
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.772 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[0af87f10-34a9-49c3-b4d6-5d7a895c07a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 systemd-udevd[244484]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.809 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[369f0d4f-e2cc-4933-8248-c2be43cee546]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.812 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[6e745377-1bbf-4580-9bf9-af4bc46691c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 NetworkManager[54952]: <info>  [1769042653.8437] device (tapbc173f9b-a0): carrier: link connected
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.848 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e9ce2f7b-6c87-4524-b569-73c264977b0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 nova_compute[182713]: 2026-01-22 00:44:13.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.866 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[9bbefb42-188e-49b2-9e99-6557e4a2ce0f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbc173f9b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:9f:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 722649, 'reachable_time': 15619, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244512, 'error': None, 'target': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 nova_compute[182713]: 2026-01-22 00:44:13.882 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:13 compute-1 nova_compute[182713]: 2026-01-22 00:44:13.882 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.883 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb27062-66ab-456d-ac04-bfd266711eb0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee0:9fdc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 722649, 'tstamp': 722649}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244513, 'error': None, 'target': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.903 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[52bce58f-9b50-4f23-8f12-822e13e21ad2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbc173f9b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:9f:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 722649, 'reachable_time': 15619, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244514, 'error': None, 'target': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:13 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:13.938 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[258613a9-1458-4758-9152-6380790fe27b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.003 182717 DEBUG nova.compute.manager [req-af0ca350-0983-497d-a0bb-cf347dbc832c req-b1fb0107-81a8-44b0-a14a-77b34cfb0956 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Received event network-vif-plugged-4ab490b8-61a1-4300-b85c-537002247bfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.005 182717 DEBUG oslo_concurrency.lockutils [req-af0ca350-0983-497d-a0bb-cf347dbc832c req-b1fb0107-81a8-44b0-a14a-77b34cfb0956 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.005 182717 DEBUG oslo_concurrency.lockutils [req-af0ca350-0983-497d-a0bb-cf347dbc832c req-b1fb0107-81a8-44b0-a14a-77b34cfb0956 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.006 182717 DEBUG oslo_concurrency.lockutils [req-af0ca350-0983-497d-a0bb-cf347dbc832c req-b1fb0107-81a8-44b0-a14a-77b34cfb0956 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.006 182717 DEBUG nova.compute.manager [req-af0ca350-0983-497d-a0bb-cf347dbc832c req-b1fb0107-81a8-44b0-a14a-77b34cfb0956 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Processing event network-vif-plugged-4ab490b8-61a1-4300-b85c-537002247bfe _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:14.026 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[3a10bf17-ea88-4d04-9235-5fd44fca9aa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:14.028 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc173f9b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:14.028 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:14.028 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc173f9b-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.030 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:14 compute-1 NetworkManager[54952]: <info>  [1769042654.0311] manager: (tapbc173f9b-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Jan 22 00:44:14 compute-1 kernel: tapbc173f9b-a0: entered promiscuous mode
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:14.033 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbc173f9b-a0, col_values=(('external_ids', {'iface-id': 'e429e99d-d544-4554-bbe2-f8538fbd55b8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:44:14 compute-1 ovn_controller[94841]: 2026-01-22T00:44:14Z|00764|binding|INFO|Releasing lport e429e99d-d544-4554-bbe2-f8538fbd55b8 from this chassis (sb_readonly=0)
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.035 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:14.035 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bc173f9b-a39e-490e-b1d4-92abd1855016.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bc173f9b-a39e-490e-b1d4-92abd1855016.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:14.036 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[57b130aa-7edf-4d69-994c-91ee31e2166f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:14.037 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-bc173f9b-a39e-490e-b1d4-92abd1855016
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/bc173f9b-a39e-490e-b1d4-92abd1855016.pid.haproxy
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID bc173f9b-a39e-490e-b1d4-92abd1855016
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:44:14 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:14.038 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'env', 'PROCESS_TAG=haproxy-bc173f9b-a39e-490e-b1d4-92abd1855016', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bc173f9b-a39e-490e-b1d4-92abd1855016.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.052 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.136 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042654.135389, 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.137 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] VM Started (Lifecycle Event)
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.141 182717 DEBUG nova.compute.manager [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.146 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.151 182717 INFO nova.virt.libvirt.driver [-] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Instance spawned successfully.
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.151 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.157 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.161 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.173 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.174 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.175 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.175 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.176 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.176 182717 DEBUG nova.virt.libvirt.driver [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.182 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.182 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042654.1358159, 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.182 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] VM Paused (Lifecycle Event)
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.209 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.213 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042654.1445432, 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.214 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] VM Resumed (Lifecycle Event)
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.234 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.238 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.248 182717 INFO nova.compute.manager [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Took 7.51 seconds to spawn the instance on the hypervisor.
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.248 182717 DEBUG nova.compute.manager [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.259 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.349 182717 INFO nova.compute.manager [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Took 8.16 seconds to build instance.
Jan 22 00:44:14 compute-1 nova_compute[182713]: 2026-01-22 00:44:14.371 182717 DEBUG oslo_concurrency.lockutils [None req-93f97d16-41a2-45c0-8bba-b80287072eb8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:14 compute-1 podman[244553]: 2026-01-22 00:44:14.512876906 +0000 UTC m=+0.082106824 container create 1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:44:14 compute-1 podman[244553]: 2026-01-22 00:44:14.465694054 +0000 UTC m=+0.034924042 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:44:14 compute-1 systemd[1]: Started libpod-conmon-1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8.scope.
Jan 22 00:44:14 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:44:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a83448e09dd14d5432f4d0254219636de9d5111446c0b2c4a364f8e3dd5781f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:44:14 compute-1 podman[244553]: 2026-01-22 00:44:14.637138737 +0000 UTC m=+0.206368675 container init 1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:44:14 compute-1 podman[244553]: 2026-01-22 00:44:14.643297068 +0000 UTC m=+0.212526966 container start 1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:44:14 compute-1 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[244568]: [NOTICE]   (244572) : New worker (244574) forked
Jan 22 00:44:14 compute-1 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[244568]: [NOTICE]   (244572) : Loading success.
Jan 22 00:44:15 compute-1 nova_compute[182713]: 2026-01-22 00:44:15.295 182717 DEBUG nova.network.neutron [req-abd921aa-60ef-4ccd-bb62-6911406fa901 req-eccb4bf7-bbaa-41fb-bb29-516b49698abb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Updated VIF entry in instance network info cache for port 4ab490b8-61a1-4300-b85c-537002247bfe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:44:15 compute-1 nova_compute[182713]: 2026-01-22 00:44:15.296 182717 DEBUG nova.network.neutron [req-abd921aa-60ef-4ccd-bb62-6911406fa901 req-eccb4bf7-bbaa-41fb-bb29-516b49698abb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Updating instance_info_cache with network_info: [{"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:44:15 compute-1 nova_compute[182713]: 2026-01-22 00:44:15.418 182717 DEBUG oslo_concurrency.lockutils [req-abd921aa-60ef-4ccd-bb62-6911406fa901 req-eccb4bf7-bbaa-41fb-bb29-516b49698abb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:44:16 compute-1 nova_compute[182713]: 2026-01-22 00:44:16.033 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:16 compute-1 nova_compute[182713]: 2026-01-22 00:44:16.483 182717 DEBUG nova.compute.manager [req-7abace08-2823-4d7c-9fd2-869aa284cf27 req-6f32c7d3-a69b-4e45-b3f7-5f016ea26e9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Received event network-vif-plugged-4ab490b8-61a1-4300-b85c-537002247bfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:44:16 compute-1 nova_compute[182713]: 2026-01-22 00:44:16.485 182717 DEBUG oslo_concurrency.lockutils [req-7abace08-2823-4d7c-9fd2-869aa284cf27 req-6f32c7d3-a69b-4e45-b3f7-5f016ea26e9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:16 compute-1 nova_compute[182713]: 2026-01-22 00:44:16.485 182717 DEBUG oslo_concurrency.lockutils [req-7abace08-2823-4d7c-9fd2-869aa284cf27 req-6f32c7d3-a69b-4e45-b3f7-5f016ea26e9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:16 compute-1 nova_compute[182713]: 2026-01-22 00:44:16.486 182717 DEBUG oslo_concurrency.lockutils [req-7abace08-2823-4d7c-9fd2-869aa284cf27 req-6f32c7d3-a69b-4e45-b3f7-5f016ea26e9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:16 compute-1 nova_compute[182713]: 2026-01-22 00:44:16.486 182717 DEBUG nova.compute.manager [req-7abace08-2823-4d7c-9fd2-869aa284cf27 req-6f32c7d3-a69b-4e45-b3f7-5f016ea26e9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] No waiting events found dispatching network-vif-plugged-4ab490b8-61a1-4300-b85c-537002247bfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:44:16 compute-1 nova_compute[182713]: 2026-01-22 00:44:16.486 182717 WARNING nova.compute.manager [req-7abace08-2823-4d7c-9fd2-869aa284cf27 req-6f32c7d3-a69b-4e45-b3f7-5f016ea26e9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Received unexpected event network-vif-plugged-4ab490b8-61a1-4300-b85c-537002247bfe for instance with vm_state active and task_state None.
Jan 22 00:44:16 compute-1 podman[244583]: 2026-01-22 00:44:16.604515648 +0000 UTC m=+0.091466805 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 00:44:17 compute-1 nova_compute[182713]: 2026-01-22 00:44:17.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:17 compute-1 nova_compute[182713]: 2026-01-22 00:44:17.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:17 compute-1 nova_compute[182713]: 2026-01-22 00:44:17.921 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:18.085 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:44:18 compute-1 nova_compute[182713]: 2026-01-22 00:44:18.086 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:18 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:18.087 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:44:18 compute-1 podman[244603]: 2026-01-22 00:44:18.595521332 +0000 UTC m=+0.088120112 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=)
Jan 22 00:44:18 compute-1 nova_compute[182713]: 2026-01-22 00:44:18.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:18 compute-1 nova_compute[182713]: 2026-01-22 00:44:18.884 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:18 compute-1 nova_compute[182713]: 2026-01-22 00:44:18.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:18 compute-1 nova_compute[182713]: 2026-01-22 00:44:18.887 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:18 compute-1 nova_compute[182713]: 2026-01-22 00:44:18.887 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:44:18 compute-1 nova_compute[182713]: 2026-01-22 00:44:18.978 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.047 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.049 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.122 182717 DEBUG oslo_concurrency.processutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.307 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.309 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5557MB free_disk=73.17657089233398GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.310 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.311 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.409 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Instance 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.410 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.411 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.459 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.479 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.504 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:44:19 compute-1 nova_compute[182713]: 2026-01-22 00:44:19.505 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:20 compute-1 nova_compute[182713]: 2026-01-22 00:44:20.506 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:20 compute-1 nova_compute[182713]: 2026-01-22 00:44:20.672 182717 DEBUG nova.compute.manager [req-a667a52d-d8b4-4cc7-89cc-31eae844363e req-45c82650-79b5-42aa-91e2-11298c871603 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Received event network-changed-4ab490b8-61a1-4300-b85c-537002247bfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:44:20 compute-1 nova_compute[182713]: 2026-01-22 00:44:20.673 182717 DEBUG nova.compute.manager [req-a667a52d-d8b4-4cc7-89cc-31eae844363e req-45c82650-79b5-42aa-91e2-11298c871603 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Refreshing instance network info cache due to event network-changed-4ab490b8-61a1-4300-b85c-537002247bfe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:44:20 compute-1 nova_compute[182713]: 2026-01-22 00:44:20.673 182717 DEBUG oslo_concurrency.lockutils [req-a667a52d-d8b4-4cc7-89cc-31eae844363e req-45c82650-79b5-42aa-91e2-11298c871603 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:44:20 compute-1 nova_compute[182713]: 2026-01-22 00:44:20.674 182717 DEBUG oslo_concurrency.lockutils [req-a667a52d-d8b4-4cc7-89cc-31eae844363e req-45c82650-79b5-42aa-91e2-11298c871603 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:44:20 compute-1 nova_compute[182713]: 2026-01-22 00:44:20.674 182717 DEBUG nova.network.neutron [req-a667a52d-d8b4-4cc7-89cc-31eae844363e req-45c82650-79b5-42aa-91e2-11298c871603 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Refreshing network info cache for port 4ab490b8-61a1-4300-b85c-537002247bfe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:44:21 compute-1 nova_compute[182713]: 2026-01-22 00:44:21.035 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:21 compute-1 nova_compute[182713]: 2026-01-22 00:44:21.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:21 compute-1 nova_compute[182713]: 2026-01-22 00:44:21.859 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:44:21 compute-1 nova_compute[182713]: 2026-01-22 00:44:21.859 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:44:22 compute-1 nova_compute[182713]: 2026-01-22 00:44:22.290 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.900 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'name': 'tempest-TestGettingAddress-server-1450704049', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b6', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '837db8748d074b3c9179b47d30e7a1d4', 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'hostId': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.923 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.924 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:22 compute-1 nova_compute[182713]: 2026-01-22 00:44:22.924 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '012cdd3b-dd82-4898-a21c-7abcf6e42df5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-vda', 'timestamp': '2026-01-22T00:44:22.902754', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7eaa6a0c-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.610203933, 'message_signature': '338f64c48271571ff9a3d287ab0ac2ba19b3e83b9b8370633676a7431dd38e36'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-sda', 'timestamp': '2026-01-22T00:44:22.902754', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7eaa9040-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.610203933, 'message_signature': '6d5cd1d3caad5458270667f71bad678cf4713c42dae147ff024075dec24d636a'}]}, 'timestamp': '2026-01-22 00:44:22.925238', '_unique_id': '06a21c735ebd40a59bbe69fd04ed8fa5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.929 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.932 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.933 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1450704049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1450704049>]
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.933 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.937 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5 / tap4ab490b8-61 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.941 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd689cfc6-9606-46d9-a699-559d88feb0f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b6-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-tap4ab490b8-61', 'timestamp': '2026-01-22T00:44:22.934118', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'tap4ab490b8-61', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:a2:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ab490b8-61'}, 'message_id': '7ead4aec-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.641392619, 'message_signature': '9f6b5779043a0f888b9000adf37c3e369d0b4d9208e1666fd805148cc850abcf'}]}, 'timestamp': '2026-01-22 00:44:22.942934', '_unique_id': '4a6ffc90bbc44cfa8cfce93a688b101e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.944 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.946 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.946 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.947 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1450704049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1450704049>]
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.947 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.947 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c40c7e2-d764-4db3-b687-56dbb5f99daf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b6-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-tap4ab490b8-61', 'timestamp': '2026-01-22T00:44:22.947909', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'tap4ab490b8-61', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:a2:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ab490b8-61'}, 'message_id': '7eae8236-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.641392619, 'message_signature': '819457c0a2d4fac6fab0854aca2c7ec2ea5c7d6b6d750bd14c7348996df0fe30'}]}, 'timestamp': '2026-01-22 00:44:22.950985', '_unique_id': 'c83c88f0358c44d08237d44c0e8a6a0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.952 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.956 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.957 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '977205cc-95f0-400a-a6ff-1db643d2bf37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b6-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-tap4ab490b8-61', 'timestamp': '2026-01-22T00:44:22.957717', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'tap4ab490b8-61', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:a2:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ab490b8-61'}, 'message_id': '7eafa3fa-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.641392619, 'message_signature': '3ec519ce448aa7bf451319eece7d7cf0251f38522117760ac4fbedcd58fa8a1d'}]}, 'timestamp': '2026-01-22 00:44:22.958137', '_unique_id': 'e812292637614e96b6020d7a30d56ae0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.958 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:22.960 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.011 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.read.latency volume: 141147315 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.012 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.read.latency volume: 1699763 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af455a55-c9d3-4292-ba55-c9e4b58e0ad1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 141147315, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-vda', 'timestamp': '2026-01-22T00:44:22.960423', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7eb7f064-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': 'a7e9c6a234505415463cbecfc19fd0c181cac99b58b6fa50596bb5b89519d3be'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1699763, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-sda', 'timestamp': '2026-01-22T00:44:22.960423', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7eb80ee6-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': '561097ff582cdfb4b6300564b7adc8325e8a737f60f0313c9dd9a2f3f6700ff3'}]}, 'timestamp': '2026-01-22 00:44:23.013383', '_unique_id': 'b7c2ac89d6034361939118a2623146c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.016 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.021 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.021 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.021 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '696a6660-fff7-47ea-baad-890b48f4a17a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-vda', 'timestamp': '2026-01-22T00:44:23.021344', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7eb95878-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': 'adef1ed645f2343e7c12c727f98d99d5631fe21ef1637452c22a389e8972fa2c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-sda', 'timestamp': '2026-01-22T00:44:23.021344', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7eb966e2-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': 'e15d31a0316aa0f8064726881363b83e58e86b37700f544a636be0923234b041'}]}, 'timestamp': '2026-01-22 00:44:23.022092', '_unique_id': '8072042f1d284081a9fb7c0def350a08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.022 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.024 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.024 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '897ca45f-28b8-4bee-8e53-503663c76a87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b6-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-tap4ab490b8-61', 'timestamp': '2026-01-22T00:44:23.024415', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'tap4ab490b8-61', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:a2:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ab490b8-61'}, 'message_id': '7eb9d014-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.641392619, 'message_signature': 'd8d71a625a9a459f0924273273cc6c9d305040f2aa1874858bc15ccb30a053c1'}]}, 'timestamp': '2026-01-22 00:44:23.024797', '_unique_id': '809dba72b2c74c57b0aa44b486a7b596'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.027 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.028 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.029 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e867a687-a60b-4800-9ce4-fb347636f662', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-vda', 'timestamp': '2026-01-22T00:44:23.028809', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7eba7e24-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': 'd192854ecd2ed7dfb6241418ba76915ee2fc8583e6a110b5fa2ccab015fbc938'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-sda', 'timestamp': '2026-01-22T00:44:23.028809', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7eba8b8a-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': 'd9e0bb13faf5af9cc180d62df70f4cd8a918a1f16f9d822a5570abbb35957db0'}]}, 'timestamp': '2026-01-22 00:44:23.029579', '_unique_id': '0597837f75f040ac8df63b72c839a1c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.030 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.031 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.032 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7999324b-d532-4a8a-ae9a-0a72e2fe1dd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b6-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-tap4ab490b8-61', 'timestamp': '2026-01-22T00:44:23.032016', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'tap4ab490b8-61', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:a2:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ab490b8-61'}, 'message_id': '7ebb1fd2-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.641392619, 'message_signature': '815a2fd1706c44ffec80e77cbbc2adc93afe2685c106b9c595c58bfe156a0905'}]}, 'timestamp': '2026-01-22 00:44:23.034885', '_unique_id': '3217f79545364238ba025020fa3d53c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.035 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.036 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.036 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.037 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f79d072f-b01f-431b-81c8-5b90adefcab9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-vda', 'timestamp': '2026-01-22T00:44:23.036954', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ebbb9ce-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.610203933, 'message_signature': '59854aebec17d36f0e1d7117fd957703175f495d503699a2783a075194430d77'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-sda', 'timestamp': '2026-01-22T00:44:23.036954', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ebbd2a6-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.610203933, 'message_signature': '0d8ab5da91b133a8831e42e6a688395d2b638ecfeafd9050868e65a4bc8f5c77'}]}, 'timestamp': '2026-01-22 00:44:23.038547', '_unique_id': '5dc10d269c8240d5a5aee33aa0a59f79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.039 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.041 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.070 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/cpu volume: 8330000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9493db6-38bb-4ebc-823d-24627ce8d328', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8330000000, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'timestamp': '2026-01-22T00:44:23.042533', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7ec0d1ac-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.777198728, 'message_signature': '564b079d464d9a7299ad8c5cfb3ed9edbe8b52a9fd8fe2d5e21c19617f2b58d3'}]}, 'timestamp': '2026-01-22 00:44:23.070759', '_unique_id': 'effcdc82b6a64c3fb60a847400af2fef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.071 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.073 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.073 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.074 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e62eb5f8-9ca9-4b4f-9743-a625aba2df4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-vda', 'timestamp': '2026-01-22T00:44:23.073692', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ec157b2-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': '546e646bc66655ba95c8b0d1535ba637b1e6bf3f3a30931d7b131deb1cb242b2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-sda', 'timestamp': '2026-01-22T00:44:23.073692', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ec166e4-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': 'a0f5d233ef213d5b423d2e3f71a79e3d233713d0cc742ceeba0ec9e79f6536a3'}]}, 'timestamp': '2026-01-22 00:44:23.074525', '_unique_id': 'e165936d80a743b3b9d2083ddbc03e4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.076 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.078 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.080 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.080 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dda1356e-4f58-4cbd-b3c2-c03ec8ea0526', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-vda', 'timestamp': '2026-01-22T00:44:23.079941', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ec25d60-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.610203933, 'message_signature': 'bb82297c9017846604d298f5c023a658bc4e60381b0569892990dc1a21685fee'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-sda', 'timestamp': '2026-01-22T00:44:23.079941', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ec26b5c-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.610203933, 'message_signature': 'b30b9874b9e2ea6fc6f9e757e93c45c3510f01df56026724c7700ba635ae062f'}]}, 'timestamp': '2026-01-22 00:44:23.081184', '_unique_id': 'daa61ea3537b4f928c4196a15865b0e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.081 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.083 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.083 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.083 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1450704049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1450704049>]
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.083 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.083 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.083 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1450704049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1450704049>]
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.084 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.084 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5: ceilometer.compute.pollsters.NoVolumeException
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.084 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b513f0f-8c5c-4e0c-b76d-54e2b9272cd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b6-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-tap4ab490b8-61', 'timestamp': '2026-01-22T00:44:23.084593', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'tap4ab490b8-61', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:a2:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ab490b8-61'}, 'message_id': '7ec2fc98-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.641392619, 'message_signature': '5d3f8113818d5fed1ce729ea2a26726bd23131b5f3442e017072e60397831ffd'}]}, 'timestamp': '2026-01-22 00:44:23.084901', '_unique_id': 'c8e2c18dc22c47308ea0a9c7a4fdc3a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.085 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.086 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.086 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.087 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:23.090 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c387b564-7db7-4c7a-ad6c-46d5f21b8b6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-vda', 'timestamp': '2026-01-22T00:44:23.086676', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ec35b84-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': '6ecbe914462ff1c02e96e403f3e4fab94055f9b39999cd0b3f82d6f22701fdbd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-sda', 'timestamp': '2026-01-22T00:44:23.086676', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ec37984-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': '2779ff132140245b43d3b62e448f11b51b5ecb98b4d5c4cb93f4a0f2f905338b'}]}, 'timestamp': '2026-01-22 00:44:23.088511', '_unique_id': 'b3c1fa6fd1cb4425b2d975635e3f7244'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.091 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.094 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd39b85a8-7d5f-4491-9e21-5636abed140c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b6-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-tap4ab490b8-61', 'timestamp': '2026-01-22T00:44:23.094233', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'tap4ab490b8-61', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:a2:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ab490b8-61'}, 'message_id': '7ec4926a-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.641392619, 'message_signature': '1112edbeabc00a775fe2affb29b3c88f115c524ee13fad657df3176ef5b0eec2'}]}, 'timestamp': '2026-01-22 00:44:23.095980', '_unique_id': '501ebe12fbbc41248887ab63a5a28709'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.096 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.097 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '198e6cff-cb5d-4687-9ed4-74d995b34db6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b6-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-tap4ab490b8-61', 'timestamp': '2026-01-22T00:44:23.097739', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'tap4ab490b8-61', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:a2:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ab490b8-61'}, 'message_id': '7ec52630-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.641392619, 'message_signature': '2fa478950527e5dd7434118d355fd0e0b9bce3fec689dcb62afa0dd031b6ebd9'}]}, 'timestamp': '2026-01-22 00:44:23.099059', '_unique_id': 'e236d013ff11483aa75052b5a71c436d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.099 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.101 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.101 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c10b4052-5105-4ddd-97e0-48e9c9b3d572', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-vda', 'timestamp': '2026-01-22T00:44:23.101097', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ec58120-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': 'dfd0f856861b06bdac01f9dbfba026c41b9c71ccaa28f783c2d088bc942986d5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-sda', 'timestamp': '2026-01-22T00:44:23.101097', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'instance-000000b6', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ec5905c-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.667636763, 'message_signature': '9f370881bd1c6f7eb7b5aeacc2d4d353268318ddfbceae22934440ee6a9bd505'}]}, 'timestamp': '2026-01-22 00:44:23.102031', '_unique_id': 'a1867372be524776967efd080fb68897'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.103 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.105 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '789da7e7-7361-4b16-8f06-fe3619ac814b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b6-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-tap4ab490b8-61', 'timestamp': '2026-01-22T00:44:23.105015', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'tap4ab490b8-61', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:a2:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ab490b8-61'}, 'message_id': '7ec61b76-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.641392619, 'message_signature': 'ef0d24faf5fe05e0fff0978518880db911c1f32a355e6aa41668ace06c101ee0'}]}, 'timestamp': '2026-01-22 00:44:23.105736', '_unique_id': '2c6a664d740d4520af8918a95851f190'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.107 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.108 12 DEBUG ceilometer.compute.pollsters [-] 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d67c64b-d926-4098-8b6e-dbc5f5da5b07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b6-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-tap4ab490b8-61', 'timestamp': '2026-01-22T00:44:23.108439', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1450704049', 'name': 'tap4ab490b8-61', 'instance_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'instance_type': 'm1.nano', 'host': '114686826ac799d79679ebf48d50469c6b568964d23812d7c3d3d1b1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:a2:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ab490b8-61'}, 'message_id': '7ec6aab4-f72b-11f0-a0a4-fa163e934844', 'monotonic_time': 7235.641392619, 'message_signature': 'eefc9242aa2056f7681ab32a90836808e424808bebcc4a8fbe030cb75a562734'}]}, 'timestamp': '2026-01-22 00:44:23.109190', '_unique_id': '2d90bf13824d45a6aa4854e0e2de1c6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:44:23 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:44:23.109 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:44:23 compute-1 nova_compute[182713]: 2026-01-22 00:44:23.369 182717 DEBUG nova.network.neutron [req-a667a52d-d8b4-4cc7-89cc-31eae844363e req-45c82650-79b5-42aa-91e2-11298c871603 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Updated VIF entry in instance network info cache for port 4ab490b8-61a1-4300-b85c-537002247bfe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:44:23 compute-1 nova_compute[182713]: 2026-01-22 00:44:23.369 182717 DEBUG nova.network.neutron [req-a667a52d-d8b4-4cc7-89cc-31eae844363e req-45c82650-79b5-42aa-91e2-11298c871603 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Updating instance_info_cache with network_info: [{"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:44:23 compute-1 nova_compute[182713]: 2026-01-22 00:44:23.396 182717 DEBUG oslo_concurrency.lockutils [req-a667a52d-d8b4-4cc7-89cc-31eae844363e req-45c82650-79b5-42aa-91e2-11298c871603 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:44:23 compute-1 nova_compute[182713]: 2026-01-22 00:44:23.397 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquired lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:44:23 compute-1 nova_compute[182713]: 2026-01-22 00:44:23.397 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:44:23 compute-1 nova_compute[182713]: 2026-01-22 00:44:23.397 182717 DEBUG nova.objects.instance [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:44:26 compute-1 nova_compute[182713]: 2026-01-22 00:44:26.038 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:26 compute-1 nova_compute[182713]: 2026-01-22 00:44:26.984 182717 DEBUG nova.network.neutron [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Updating instance_info_cache with network_info: [{"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:44:27 compute-1 nova_compute[182713]: 2026-01-22 00:44:27.010 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Releasing lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:44:27 compute-1 nova_compute[182713]: 2026-01-22 00:44:27.010 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:44:27 compute-1 ovn_controller[94841]: 2026-01-22T00:44:27Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:20:a2:c1 10.100.0.5
Jan 22 00:44:27 compute-1 ovn_controller[94841]: 2026-01-22T00:44:27Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:20:a2:c1 10.100.0.5
Jan 22 00:44:27 compute-1 nova_compute[182713]: 2026-01-22 00:44:27.928 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:28 compute-1 podman[244649]: 2026-01-22 00:44:28.633277223 +0000 UTC m=+0.100304359 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:44:28 compute-1 podman[244648]: 2026-01-22 00:44:28.700092004 +0000 UTC m=+0.174304062 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 00:44:31 compute-1 nova_compute[182713]: 2026-01-22 00:44:31.040 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:32 compute-1 nova_compute[182713]: 2026-01-22 00:44:32.931 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:34 compute-1 podman[244699]: 2026-01-22 00:44:34.589312548 +0000 UTC m=+0.085184541 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:44:34 compute-1 podman[244700]: 2026-01-22 00:44:34.609281677 +0000 UTC m=+0.089667740 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:44:36 compute-1 nova_compute[182713]: 2026-01-22 00:44:36.043 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:37 compute-1 nova_compute[182713]: 2026-01-22 00:44:37.934 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.086 182717 DEBUG nova.compute.manager [req-b1e92ae4-b745-44c8-a2e0-f2c9a81ea7cd req-bf8f26e3-6ece-46d9-b64a-0eadb42536bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Received event network-changed-4ab490b8-61a1-4300-b85c-537002247bfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.087 182717 DEBUG nova.compute.manager [req-b1e92ae4-b745-44c8-a2e0-f2c9a81ea7cd req-bf8f26e3-6ece-46d9-b64a-0eadb42536bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Refreshing instance network info cache due to event network-changed-4ab490b8-61a1-4300-b85c-537002247bfe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.087 182717 DEBUG oslo_concurrency.lockutils [req-b1e92ae4-b745-44c8-a2e0-f2c9a81ea7cd req-bf8f26e3-6ece-46d9-b64a-0eadb42536bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.087 182717 DEBUG oslo_concurrency.lockutils [req-b1e92ae4-b745-44c8-a2e0-f2c9a81ea7cd req-bf8f26e3-6ece-46d9-b64a-0eadb42536bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.087 182717 DEBUG nova.network.neutron [req-b1e92ae4-b745-44c8-a2e0-f2c9a81ea7cd req-bf8f26e3-6ece-46d9-b64a-0eadb42536bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Refreshing network info cache for port 4ab490b8-61a1-4300-b85c-537002247bfe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.267 182717 DEBUG oslo_concurrency.lockutils [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.268 182717 DEBUG oslo_concurrency.lockutils [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.268 182717 DEBUG oslo_concurrency.lockutils [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.268 182717 DEBUG oslo_concurrency.lockutils [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.268 182717 DEBUG oslo_concurrency.lockutils [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.283 182717 INFO nova.compute.manager [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Terminating instance
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.299 182717 DEBUG nova.compute.manager [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:44:39 compute-1 kernel: tap4ab490b8-61 (unregistering): left promiscuous mode
Jan 22 00:44:39 compute-1 NetworkManager[54952]: <info>  [1769042679.3290] device (tap4ab490b8-61): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:44:39 compute-1 ovn_controller[94841]: 2026-01-22T00:44:39Z|00765|binding|INFO|Releasing lport 4ab490b8-61a1-4300-b85c-537002247bfe from this chassis (sb_readonly=0)
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.340 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:39 compute-1 ovn_controller[94841]: 2026-01-22T00:44:39Z|00766|binding|INFO|Setting lport 4ab490b8-61a1-4300-b85c-537002247bfe down in Southbound
Jan 22 00:44:39 compute-1 ovn_controller[94841]: 2026-01-22T00:44:39Z|00767|binding|INFO|Removing iface tap4ab490b8-61 ovn-installed in OVS
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.345 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.355 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:a2:c1 10.100.0.5 2001:db8:0:1:f816:3eff:fe20:a2c1 2001:db8::f816:3eff:fe20:a2c1'], port_security=['fa:16:3e:20:a2:c1 10.100.0.5 2001:db8:0:1:f816:3eff:fe20:a2c1 2001:db8::f816:3eff:fe20:a2c1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8:0:1:f816:3eff:fe20:a2c1/64 2001:db8::f816:3eff:fe20:a2c1/64', 'neutron:device_id': '8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '46ea8a3f-4945-4bb2-97cf-c1bd6e8fe825', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad392942-0b6b-462d-a3a5-d979f385a143, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=4ab490b8-61a1-4300-b85c-537002247bfe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.356 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 4ab490b8-61a1-4300-b85c-537002247bfe in datapath bc173f9b-a39e-490e-b1d4-92abd1855016 unbound from our chassis
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.357 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc173f9b-a39e-490e-b1d4-92abd1855016, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.358 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.359 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e5635099-9d79-4ff2-9054-f79bfaa7cdd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.360 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016 namespace which is not needed anymore
Jan 22 00:44:39 compute-1 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000b6.scope: Deactivated successfully.
Jan 22 00:44:39 compute-1 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000b6.scope: Consumed 13.643s CPU time.
Jan 22 00:44:39 compute-1 systemd-machined[153970]: Machine qemu-79-instance-000000b6 terminated.
Jan 22 00:44:39 compute-1 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[244568]: [NOTICE]   (244572) : haproxy version is 2.8.14-c23fe91
Jan 22 00:44:39 compute-1 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[244568]: [NOTICE]   (244572) : path to executable is /usr/sbin/haproxy
Jan 22 00:44:39 compute-1 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[244568]: [WARNING]  (244572) : Exiting Master process...
Jan 22 00:44:39 compute-1 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[244568]: [WARNING]  (244572) : Exiting Master process...
Jan 22 00:44:39 compute-1 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[244568]: [ALERT]    (244572) : Current worker (244574) exited with code 143 (Terminated)
Jan 22 00:44:39 compute-1 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[244568]: [WARNING]  (244572) : All workers exited. Exiting... (0)
Jan 22 00:44:39 compute-1 systemd[1]: libpod-1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8.scope: Deactivated successfully.
Jan 22 00:44:39 compute-1 podman[244767]: 2026-01-22 00:44:39.519741552 +0000 UTC m=+0.060801614 container died 1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:44:39 compute-1 kernel: tap4ab490b8-61: entered promiscuous mode
Jan 22 00:44:39 compute-1 kernel: tap4ab490b8-61 (unregistering): left promiscuous mode
Jan 22 00:44:39 compute-1 NetworkManager[54952]: <info>  [1769042679.5350] manager: (tap4ab490b8-61): new Tun device (/org/freedesktop/NetworkManager/Devices/373)
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.543 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:39 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8-userdata-shm.mount: Deactivated successfully.
Jan 22 00:44:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-1a83448e09dd14d5432f4d0254219636de9d5111446c0b2c4a364f8e3dd5781f-merged.mount: Deactivated successfully.
Jan 22 00:44:39 compute-1 podman[244767]: 2026-01-22 00:44:39.585389456 +0000 UTC m=+0.126449488 container cleanup 1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:44:39 compute-1 systemd[1]: libpod-conmon-1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8.scope: Deactivated successfully.
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.598 182717 INFO nova.virt.libvirt.driver [-] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Instance destroyed successfully.
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.598 182717 DEBUG nova.objects.instance [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.616 182717 DEBUG nova.virt.libvirt.vif [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:44:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1450704049',display_name='tempest-TestGettingAddress-server-1450704049',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1450704049',id=182,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzS1Jx/APWM5vcQRL+j1JWQJn1AI5LoKxMBW97Fa2XcQxLO8wlk0d2rFNEjm5ruItcAVjf35MpAgTKTp3E/600O3yHmKIiUXb2hz3moFXrY6FueGiSaiI56sxuqhWZG5g==',key_name='tempest-TestGettingAddress-79214675',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:44:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-0t6fmbpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:44:14Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.617 182717 DEBUG nova.network.os_vif_util [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.618 182717 DEBUG nova.network.os_vif_util [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:a2:c1,bridge_name='br-int',has_traffic_filtering=True,id=4ab490b8-61a1-4300-b85c-537002247bfe,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ab490b8-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.619 182717 DEBUG os_vif [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:a2:c1,bridge_name='br-int',has_traffic_filtering=True,id=4ab490b8-61a1-4300-b85c-537002247bfe,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ab490b8-61') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.621 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.621 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ab490b8-61, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.623 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.627 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.632 182717 INFO os_vif [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:a2:c1,bridge_name='br-int',has_traffic_filtering=True,id=4ab490b8-61a1-4300-b85c-537002247bfe,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ab490b8-61')
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.633 182717 INFO nova.virt.libvirt.driver [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Deleting instance files /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5_del
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.634 182717 INFO nova.virt.libvirt.driver [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Deletion of /var/lib/nova/instances/8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5_del complete
Jan 22 00:44:39 compute-1 podman[244808]: 2026-01-22 00:44:39.661998021 +0000 UTC m=+0.051472607 container remove 1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.670 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c7fe57aa-b95c-4437-b742-98087296928e]: (4, ('Thu Jan 22 12:44:39 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016 (1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8)\n1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8\nThu Jan 22 12:44:39 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016 (1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8)\n1b38bb5c6ea4252c9c140ed85a694e8189532c2ac7e838590b052f414578c0c8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.672 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[e7dd8c8e-f496-406b-82ef-b66c82a3e8d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.673 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc173f9b-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:44:39 compute-1 kernel: tapbc173f9b-a0: left promiscuous mode
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.729 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.739 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.743 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[ea36e4d6-9e37-4e67-be19-3423022a88c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.766 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[92b542fa-6a88-4b26-9165-7b7d6850de35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.768 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a311e8ab-cc1f-40a5-be10-665d03b1d115]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.785 182717 INFO nova.compute.manager [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Took 0.49 seconds to destroy the instance on the hypervisor.
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.786 182717 DEBUG oslo.service.loopingcall [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.787 182717 DEBUG nova.compute.manager [-] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.787 182717 DEBUG nova.network.neutron [-] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.801 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[63528b41-9024-4c32-9918-150a25c08f0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 722641, 'reachable_time': 16869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244824, 'error': None, 'target': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.806 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:44:39 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:44:39.806 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2e67cb-4a4a-46fa-8ae4-4b7b5fc3e9dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:44:39 compute-1 systemd[1]: run-netns-ovnmeta\x2dbc173f9b\x2da39e\x2d490e\x2db1d4\x2d92abd1855016.mount: Deactivated successfully.
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.950 182717 DEBUG nova.compute.manager [req-24865e8c-149e-4e5c-ac3f-3765217b51c1 req-25739c2e-ab17-4107-8898-cd44ca664ca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Received event network-vif-unplugged-4ab490b8-61a1-4300-b85c-537002247bfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.951 182717 DEBUG oslo_concurrency.lockutils [req-24865e8c-149e-4e5c-ac3f-3765217b51c1 req-25739c2e-ab17-4107-8898-cd44ca664ca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.951 182717 DEBUG oslo_concurrency.lockutils [req-24865e8c-149e-4e5c-ac3f-3765217b51c1 req-25739c2e-ab17-4107-8898-cd44ca664ca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.952 182717 DEBUG oslo_concurrency.lockutils [req-24865e8c-149e-4e5c-ac3f-3765217b51c1 req-25739c2e-ab17-4107-8898-cd44ca664ca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.952 182717 DEBUG nova.compute.manager [req-24865e8c-149e-4e5c-ac3f-3765217b51c1 req-25739c2e-ab17-4107-8898-cd44ca664ca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] No waiting events found dispatching network-vif-unplugged-4ab490b8-61a1-4300-b85c-537002247bfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:44:39 compute-1 nova_compute[182713]: 2026-01-22 00:44:39.953 182717 DEBUG nova.compute.manager [req-24865e8c-149e-4e5c-ac3f-3765217b51c1 req-25739c2e-ab17-4107-8898-cd44ca664ca2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Received event network-vif-unplugged-4ab490b8-61a1-4300-b85c-537002247bfe for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:44:41 compute-1 nova_compute[182713]: 2026-01-22 00:44:41.045 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:41 compute-1 nova_compute[182713]: 2026-01-22 00:44:41.119 182717 DEBUG nova.network.neutron [-] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:44:41 compute-1 nova_compute[182713]: 2026-01-22 00:44:41.158 182717 INFO nova.compute.manager [-] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Took 1.37 seconds to deallocate network for instance.
Jan 22 00:44:41 compute-1 nova_compute[182713]: 2026-01-22 00:44:41.286 182717 DEBUG nova.compute.manager [req-f944af30-e5be-472a-899e-2c7fc6edf2e3 req-b89da79c-ae9d-41d1-ad84-a189700c686e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Received event network-vif-deleted-4ab490b8-61a1-4300-b85c-537002247bfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:44:41 compute-1 nova_compute[182713]: 2026-01-22 00:44:41.294 182717 DEBUG oslo_concurrency.lockutils [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:41 compute-1 nova_compute[182713]: 2026-01-22 00:44:41.294 182717 DEBUG oslo_concurrency.lockutils [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:41 compute-1 nova_compute[182713]: 2026-01-22 00:44:41.386 182717 DEBUG nova.compute.provider_tree [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:44:41 compute-1 nova_compute[182713]: 2026-01-22 00:44:41.410 182717 DEBUG nova.scheduler.client.report [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:44:41 compute-1 nova_compute[182713]: 2026-01-22 00:44:41.449 182717 DEBUG oslo_concurrency.lockutils [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:41 compute-1 nova_compute[182713]: 2026-01-22 00:44:41.503 182717 INFO nova.scheduler.client.report [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5
Jan 22 00:44:41 compute-1 nova_compute[182713]: 2026-01-22 00:44:41.608 182717 DEBUG oslo_concurrency.lockutils [None req-bfe47445-eadf-4a2a-ac35-11b68cea5862 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:42 compute-1 nova_compute[182713]: 2026-01-22 00:44:42.067 182717 DEBUG nova.compute.manager [req-3c01d25f-a5d0-4328-b73a-c7570acb97b2 req-444770c8-4c9f-4e04-b2ee-5caf70158986 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Received event network-vif-plugged-4ab490b8-61a1-4300-b85c-537002247bfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:44:42 compute-1 nova_compute[182713]: 2026-01-22 00:44:42.068 182717 DEBUG oslo_concurrency.lockutils [req-3c01d25f-a5d0-4328-b73a-c7570acb97b2 req-444770c8-4c9f-4e04-b2ee-5caf70158986 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:42 compute-1 nova_compute[182713]: 2026-01-22 00:44:42.068 182717 DEBUG oslo_concurrency.lockutils [req-3c01d25f-a5d0-4328-b73a-c7570acb97b2 req-444770c8-4c9f-4e04-b2ee-5caf70158986 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:42 compute-1 nova_compute[182713]: 2026-01-22 00:44:42.069 182717 DEBUG oslo_concurrency.lockutils [req-3c01d25f-a5d0-4328-b73a-c7570acb97b2 req-444770c8-4c9f-4e04-b2ee-5caf70158986 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:42 compute-1 nova_compute[182713]: 2026-01-22 00:44:42.069 182717 DEBUG nova.compute.manager [req-3c01d25f-a5d0-4328-b73a-c7570acb97b2 req-444770c8-4c9f-4e04-b2ee-5caf70158986 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] No waiting events found dispatching network-vif-plugged-4ab490b8-61a1-4300-b85c-537002247bfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:44:42 compute-1 nova_compute[182713]: 2026-01-22 00:44:42.069 182717 WARNING nova.compute.manager [req-3c01d25f-a5d0-4328-b73a-c7570acb97b2 req-444770c8-4c9f-4e04-b2ee-5caf70158986 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Received unexpected event network-vif-plugged-4ab490b8-61a1-4300-b85c-537002247bfe for instance with vm_state deleted and task_state None.
Jan 22 00:44:42 compute-1 nova_compute[182713]: 2026-01-22 00:44:42.097 182717 DEBUG nova.network.neutron [req-b1e92ae4-b745-44c8-a2e0-f2c9a81ea7cd req-bf8f26e3-6ece-46d9-b64a-0eadb42536bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Updated VIF entry in instance network info cache for port 4ab490b8-61a1-4300-b85c-537002247bfe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:44:42 compute-1 nova_compute[182713]: 2026-01-22 00:44:42.098 182717 DEBUG nova.network.neutron [req-b1e92ae4-b745-44c8-a2e0-f2c9a81ea7cd req-bf8f26e3-6ece-46d9-b64a-0eadb42536bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Updating instance_info_cache with network_info: [{"id": "4ab490b8-61a1-4300-b85c-537002247bfe", "address": "fa:16:3e:20:a2:c1", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:a2c1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab490b8-61", "ovs_interfaceid": "4ab490b8-61a1-4300-b85c-537002247bfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:44:42 compute-1 nova_compute[182713]: 2026-01-22 00:44:42.122 182717 DEBUG oslo_concurrency.lockutils [req-b1e92ae4-b745-44c8-a2e0-f2c9a81ea7cd req-bf8f26e3-6ece-46d9-b64a-0eadb42536bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:44:44 compute-1 nova_compute[182713]: 2026-01-22 00:44:44.625 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:46 compute-1 nova_compute[182713]: 2026-01-22 00:44:46.048 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:47 compute-1 podman[244825]: 2026-01-22 00:44:47.57911702 +0000 UTC m=+0.076635927 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:44:49 compute-1 podman[244846]: 2026-01-22 00:44:49.602466426 +0000 UTC m=+0.083405515 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=)
Jan 22 00:44:49 compute-1 nova_compute[182713]: 2026-01-22 00:44:49.628 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:51 compute-1 nova_compute[182713]: 2026-01-22 00:44:51.049 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:54 compute-1 nova_compute[182713]: 2026-01-22 00:44:54.596 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042679.5940566, 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:44:54 compute-1 nova_compute[182713]: 2026-01-22 00:44:54.597 182717 INFO nova.compute.manager [-] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] VM Stopped (Lifecycle Event)
Jan 22 00:44:54 compute-1 nova_compute[182713]: 2026-01-22 00:44:54.622 182717 DEBUG nova.compute.manager [None req-0f2c1b8a-6145-4c4d-9c74-8b0f319d0156 - - - - - -] [instance: 8ba65ca9-d3cb-4c07-bc12-d73b208ee2b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:44:54 compute-1 nova_compute[182713]: 2026-01-22 00:44:54.631 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:56 compute-1 nova_compute[182713]: 2026-01-22 00:44:56.091 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:59 compute-1 podman[244866]: 2026-01-22 00:44:59.625381678 +0000 UTC m=+0.102795816 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:44:59 compute-1 nova_compute[182713]: 2026-01-22 00:44:59.677 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:59 compute-1 podman[244865]: 2026-01-22 00:44:59.705294545 +0000 UTC m=+0.189305568 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:45:01 compute-1 nova_compute[182713]: 2026-01-22 00:45:01.093 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:01 compute-1 nova_compute[182713]: 2026-01-22 00:45:01.982 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:02 compute-1 nova_compute[182713]: 2026-01-22 00:45:02.052 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:45:03.062 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:45:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:45:03.062 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:45:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:45:03.062 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:45:04 compute-1 nova_compute[182713]: 2026-01-22 00:45:04.680 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:05 compute-1 podman[244915]: 2026-01-22 00:45:05.576763139 +0000 UTC m=+0.063004663 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:45:05 compute-1 podman[244916]: 2026-01-22 00:45:05.577232344 +0000 UTC m=+0.063140008 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:45:06 compute-1 nova_compute[182713]: 2026-01-22 00:45:06.096 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:07 compute-1 nova_compute[182713]: 2026-01-22 00:45:07.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:09 compute-1 nova_compute[182713]: 2026-01-22 00:45:09.682 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:11 compute-1 nova_compute[182713]: 2026-01-22 00:45:11.097 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:11 compute-1 nova_compute[182713]: 2026-01-22 00:45:11.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:11 compute-1 nova_compute[182713]: 2026-01-22 00:45:11.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:14 compute-1 nova_compute[182713]: 2026-01-22 00:45:14.684 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:15 compute-1 nova_compute[182713]: 2026-01-22 00:45:15.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:15 compute-1 nova_compute[182713]: 2026-01-22 00:45:15.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:45:16 compute-1 nova_compute[182713]: 2026-01-22 00:45:16.104 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:18 compute-1 podman[244959]: 2026-01-22 00:45:18.592730465 +0000 UTC m=+0.085045357 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 00:45:18 compute-1 nova_compute[182713]: 2026-01-22 00:45:18.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:19 compute-1 nova_compute[182713]: 2026-01-22 00:45:19.686 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:19 compute-1 nova_compute[182713]: 2026-01-22 00:45:19.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:19 compute-1 nova_compute[182713]: 2026-01-22 00:45:19.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:19 compute-1 nova_compute[182713]: 2026-01-22 00:45:19.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:19 compute-1 nova_compute[182713]: 2026-01-22 00:45:19.889 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:45:19 compute-1 nova_compute[182713]: 2026-01-22 00:45:19.890 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:45:19 compute-1 nova_compute[182713]: 2026-01-22 00:45:19.890 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:45:19 compute-1 nova_compute[182713]: 2026-01-22 00:45:19.890 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:45:20 compute-1 podman[244979]: 2026-01-22 00:45:20.008884515 +0000 UTC m=+0.074971595 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.081 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.083 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5689MB free_disk=73.17744445800781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.084 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.084 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.266 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.266 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.363 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.572 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.573 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.596 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.634 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.678 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.717 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.766 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:45:20 compute-1 nova_compute[182713]: 2026-01-22 00:45:20.767 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:45:21 compute-1 nova_compute[182713]: 2026-01-22 00:45:21.106 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:21 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:45:21.442 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:45:21 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:45:21.443 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:45:21 compute-1 nova_compute[182713]: 2026-01-22 00:45:21.443 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:23 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:45:23.445 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:45:23 compute-1 nova_compute[182713]: 2026-01-22 00:45:23.768 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:23 compute-1 nova_compute[182713]: 2026-01-22 00:45:23.768 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:45:23 compute-1 nova_compute[182713]: 2026-01-22 00:45:23.769 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:45:23 compute-1 nova_compute[182713]: 2026-01-22 00:45:23.790 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:45:24 compute-1 nova_compute[182713]: 2026-01-22 00:45:24.690 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:26 compute-1 nova_compute[182713]: 2026-01-22 00:45:26.110 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:45:29.044 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:5f:6e 10.100.0.2 2001:db8::f816:3eff:fee8:5f6e'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee8:5f6e/64', 'neutron:device_id': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496d15df-9baa-43c6-8bd0-ae8566291be1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4a596305-d10e-4e9e-a8ea-d94a630e8baa) old=Port_Binding(mac=['fa:16:3e:e8:5f:6e 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:45:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:45:29.047 104184 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4a596305-d10e-4e9e-a8ea-d94a630e8baa in datapath 83666af9-15ce-4344-a623-7180c9b2515a updated
Jan 22 00:45:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:45:29.049 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 83666af9-15ce-4344-a623-7180c9b2515a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:45:29 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:45:29.050 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[db16bd8d-c7a3-49bb-9ecd-7ed30a4e37c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:45:29 compute-1 nova_compute[182713]: 2026-01-22 00:45:29.692 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:30 compute-1 podman[245007]: 2026-01-22 00:45:30.577889329 +0000 UTC m=+0.066276514 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:45:30 compute-1 podman[245006]: 2026-01-22 00:45:30.647412523 +0000 UTC m=+0.131084373 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:45:30 compute-1 sshd-session[245004]: Invalid user validator from 92.118.39.95 port 42094
Jan 22 00:45:31 compute-1 sshd-session[245004]: Connection closed by invalid user validator 92.118.39.95 port 42094 [preauth]
Jan 22 00:45:31 compute-1 nova_compute[182713]: 2026-01-22 00:45:31.112 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:34 compute-1 nova_compute[182713]: 2026-01-22 00:45:34.695 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:36 compute-1 nova_compute[182713]: 2026-01-22 00:45:36.114 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:36 compute-1 podman[245056]: 2026-01-22 00:45:36.594549172 +0000 UTC m=+0.081924830 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 00:45:36 compute-1 podman[245057]: 2026-01-22 00:45:36.594592493 +0000 UTC m=+0.075511730 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:45:39 compute-1 nova_compute[182713]: 2026-01-22 00:45:39.698 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:41 compute-1 nova_compute[182713]: 2026-01-22 00:45:41.116 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:44 compute-1 nova_compute[182713]: 2026-01-22 00:45:44.700 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:46 compute-1 nova_compute[182713]: 2026-01-22 00:45:46.159 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:49 compute-1 podman[245096]: 2026-01-22 00:45:49.624319414 +0000 UTC m=+0.110865577 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 00:45:49 compute-1 nova_compute[182713]: 2026-01-22 00:45:49.704 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:50 compute-1 podman[245117]: 2026-01-22 00:45:50.583368041 +0000 UTC m=+0.071312490 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git)
Jan 22 00:45:51 compute-1 nova_compute[182713]: 2026-01-22 00:45:51.162 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:51 compute-1 ovn_controller[94841]: 2026-01-22T00:45:51Z|00768|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 22 00:45:54 compute-1 nova_compute[182713]: 2026-01-22 00:45:54.706 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:56 compute-1 nova_compute[182713]: 2026-01-22 00:45:56.163 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:59 compute-1 nova_compute[182713]: 2026-01-22 00:45:59.749 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:01 compute-1 nova_compute[182713]: 2026-01-22 00:46:01.165 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:01 compute-1 podman[245139]: 2026-01-22 00:46:01.576581759 +0000 UTC m=+0.066217523 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:46:01 compute-1 podman[245138]: 2026-01-22 00:46:01.619890801 +0000 UTC m=+0.108844544 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:46:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:46:03.063 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:46:03.064 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:46:03.064 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:04 compute-1 nova_compute[182713]: 2026-01-22 00:46:04.751 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:06 compute-1 nova_compute[182713]: 2026-01-22 00:46:06.167 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:07 compute-1 podman[245190]: 2026-01-22 00:46:07.604924154 +0000 UTC m=+0.089416741 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:46:07 compute-1 podman[245189]: 2026-01-22 00:46:07.618275198 +0000 UTC m=+0.108458852 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:46:07 compute-1 nova_compute[182713]: 2026-01-22 00:46:07.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:09 compute-1 nova_compute[182713]: 2026-01-22 00:46:09.759 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:11 compute-1 nova_compute[182713]: 2026-01-22 00:46:11.170 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:12 compute-1 nova_compute[182713]: 2026-01-22 00:46:12.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:13 compute-1 nova_compute[182713]: 2026-01-22 00:46:13.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:14 compute-1 nova_compute[182713]: 2026-01-22 00:46:14.760 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:16 compute-1 nova_compute[182713]: 2026-01-22 00:46:16.172 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:16 compute-1 nova_compute[182713]: 2026-01-22 00:46:16.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:16 compute-1 nova_compute[182713]: 2026-01-22 00:46:16.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:46:18 compute-1 nova_compute[182713]: 2026-01-22 00:46:18.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:18 compute-1 nova_compute[182713]: 2026-01-22 00:46:18.871 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:19 compute-1 nova_compute[182713]: 2026-01-22 00:46:19.762 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:19 compute-1 nova_compute[182713]: 2026-01-22 00:46:19.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:19 compute-1 nova_compute[182713]: 2026-01-22 00:46:19.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:19 compute-1 nova_compute[182713]: 2026-01-22 00:46:19.887 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:19 compute-1 nova_compute[182713]: 2026-01-22 00:46:19.887 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:19 compute-1 nova_compute[182713]: 2026-01-22 00:46:19.888 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:46:20 compute-1 nova_compute[182713]: 2026-01-22 00:46:20.138 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:46:20 compute-1 nova_compute[182713]: 2026-01-22 00:46:20.139 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5704MB free_disk=73.17744064331055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:46:20 compute-1 nova_compute[182713]: 2026-01-22 00:46:20.139 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:20 compute-1 nova_compute[182713]: 2026-01-22 00:46:20.140 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:20 compute-1 nova_compute[182713]: 2026-01-22 00:46:20.200 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:46:20 compute-1 nova_compute[182713]: 2026-01-22 00:46:20.201 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:46:20 compute-1 nova_compute[182713]: 2026-01-22 00:46:20.222 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:46:20 compute-1 nova_compute[182713]: 2026-01-22 00:46:20.240 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:46:20 compute-1 nova_compute[182713]: 2026-01-22 00:46:20.241 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:46:20 compute-1 nova_compute[182713]: 2026-01-22 00:46:20.242 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:20 compute-1 podman[245231]: 2026-01-22 00:46:20.593570324 +0000 UTC m=+0.079355981 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 00:46:20 compute-1 podman[245251]: 2026-01-22 00:46:20.725764299 +0000 UTC m=+0.092785436 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter)
Jan 22 00:46:21 compute-1 nova_compute[182713]: 2026-01-22 00:46:21.174 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:21 compute-1 nova_compute[182713]: 2026-01-22 00:46:21.243 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:21 compute-1 nova_compute[182713]: 2026-01-22 00:46:21.243 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:46:22.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-1 nova_compute[182713]: 2026-01-22 00:46:23.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:23 compute-1 nova_compute[182713]: 2026-01-22 00:46:23.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:46:23 compute-1 nova_compute[182713]: 2026-01-22 00:46:23.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:46:23 compute-1 nova_compute[182713]: 2026-01-22 00:46:23.873 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:46:24 compute-1 nova_compute[182713]: 2026-01-22 00:46:24.765 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:25 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:46:25.646 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:46:25 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:46:25.647 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:46:25 compute-1 nova_compute[182713]: 2026-01-22 00:46:25.648 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:26 compute-1 nova_compute[182713]: 2026-01-22 00:46:26.177 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:29 compute-1 nova_compute[182713]: 2026-01-22 00:46:29.766 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:31 compute-1 nova_compute[182713]: 2026-01-22 00:46:31.179 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:32 compute-1 podman[245274]: 2026-01-22 00:46:32.624726622 +0000 UTC m=+0.108254006 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:46:32 compute-1 podman[245273]: 2026-01-22 00:46:32.641784861 +0000 UTC m=+0.137626207 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 00:46:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:46:32.649 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:46:34 compute-1 nova_compute[182713]: 2026-01-22 00:46:34.768 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:36 compute-1 nova_compute[182713]: 2026-01-22 00:46:36.180 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:38 compute-1 podman[245324]: 2026-01-22 00:46:38.572829221 +0000 UTC m=+0.060251538 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:46:38 compute-1 podman[245323]: 2026-01-22 00:46:38.581725706 +0000 UTC m=+0.071890108 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 00:46:39 compute-1 nova_compute[182713]: 2026-01-22 00:46:39.768 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:41 compute-1 nova_compute[182713]: 2026-01-22 00:46:41.184 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:44 compute-1 nova_compute[182713]: 2026-01-22 00:46:44.819 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:46 compute-1 nova_compute[182713]: 2026-01-22 00:46:46.188 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:49 compute-1 nova_compute[182713]: 2026-01-22 00:46:49.821 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:51 compute-1 nova_compute[182713]: 2026-01-22 00:46:51.227 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:51 compute-1 podman[245365]: 2026-01-22 00:46:51.632005086 +0000 UTC m=+0.114187771 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 00:46:51 compute-1 podman[245366]: 2026-01-22 00:46:51.646247587 +0000 UTC m=+0.120933909 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Jan 22 00:46:54 compute-1 nova_compute[182713]: 2026-01-22 00:46:54.824 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:56 compute-1 nova_compute[182713]: 2026-01-22 00:46:56.229 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:59 compute-1 nova_compute[182713]: 2026-01-22 00:46:59.827 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:01 compute-1 nova_compute[182713]: 2026-01-22 00:47:01.232 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:03.063 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:03.065 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:03.065 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:03 compute-1 podman[245406]: 2026-01-22 00:47:03.585792506 +0000 UTC m=+0.064842880 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:47:03 compute-1 podman[245405]: 2026-01-22 00:47:03.605784826 +0000 UTC m=+0.092195628 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 00:47:04 compute-1 nova_compute[182713]: 2026-01-22 00:47:04.829 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:06 compute-1 nova_compute[182713]: 2026-01-22 00:47:06.235 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:09 compute-1 podman[245458]: 2026-01-22 00:47:09.579082047 +0000 UTC m=+0.067507073 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:47:09 compute-1 podman[245457]: 2026-01-22 00:47:09.609258582 +0000 UTC m=+0.098170083 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:47:09 compute-1 nova_compute[182713]: 2026-01-22 00:47:09.830 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:09 compute-1 nova_compute[182713]: 2026-01-22 00:47:09.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:09 compute-1 sshd-session[245455]: Invalid user admin from 139.19.117.129 port 41342
Jan 22 00:47:09 compute-1 sshd-session[245455]: userauth_pubkey: signature algorithm ssh-rsa not in PubkeyAcceptedAlgorithms [preauth]
Jan 22 00:47:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:10.870 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:47:10 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:10.871 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:47:10 compute-1 nova_compute[182713]: 2026-01-22 00:47:10.872 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:11 compute-1 nova_compute[182713]: 2026-01-22 00:47:11.237 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:12 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:12.873 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:13 compute-1 nova_compute[182713]: 2026-01-22 00:47:13.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:14 compute-1 nova_compute[182713]: 2026-01-22 00:47:14.832 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:14 compute-1 nova_compute[182713]: 2026-01-22 00:47:14.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:16 compute-1 nova_compute[182713]: 2026-01-22 00:47:16.270 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:16 compute-1 nova_compute[182713]: 2026-01-22 00:47:16.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:16 compute-1 nova_compute[182713]: 2026-01-22 00:47:16.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:47:18 compute-1 nova_compute[182713]: 2026-01-22 00:47:18.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:19 compute-1 sshd-session[245455]: Connection closed by invalid user admin 139.19.117.129 port 41342 [preauth]
Jan 22 00:47:19 compute-1 nova_compute[182713]: 2026-01-22 00:47:19.834 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:19 compute-1 nova_compute[182713]: 2026-01-22 00:47:19.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:19 compute-1 nova_compute[182713]: 2026-01-22 00:47:19.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:19 compute-1 nova_compute[182713]: 2026-01-22 00:47:19.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:19 compute-1 nova_compute[182713]: 2026-01-22 00:47:19.884 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:19 compute-1 nova_compute[182713]: 2026-01-22 00:47:19.884 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:47:20 compute-1 nova_compute[182713]: 2026-01-22 00:47:20.089 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:47:20 compute-1 nova_compute[182713]: 2026-01-22 00:47:20.090 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5712MB free_disk=73.17744064331055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:47:20 compute-1 nova_compute[182713]: 2026-01-22 00:47:20.091 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:20 compute-1 nova_compute[182713]: 2026-01-22 00:47:20.091 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:20 compute-1 nova_compute[182713]: 2026-01-22 00:47:20.158 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:47:20 compute-1 nova_compute[182713]: 2026-01-22 00:47:20.159 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:47:20 compute-1 nova_compute[182713]: 2026-01-22 00:47:20.187 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:47:20 compute-1 nova_compute[182713]: 2026-01-22 00:47:20.203 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:47:20 compute-1 nova_compute[182713]: 2026-01-22 00:47:20.206 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:47:20 compute-1 nova_compute[182713]: 2026-01-22 00:47:20.206 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:21 compute-1 nova_compute[182713]: 2026-01-22 00:47:21.207 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:21 compute-1 nova_compute[182713]: 2026-01-22 00:47:21.303 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:21 compute-1 nova_compute[182713]: 2026-01-22 00:47:21.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:22 compute-1 podman[245497]: 2026-01-22 00:47:22.575528467 +0000 UTC m=+0.069557187 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:47:22 compute-1 podman[245498]: 2026-01-22 00:47:22.612819192 +0000 UTC m=+0.088818814 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal)
Jan 22 00:47:24 compute-1 nova_compute[182713]: 2026-01-22 00:47:24.837 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:25 compute-1 nova_compute[182713]: 2026-01-22 00:47:25.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:25 compute-1 nova_compute[182713]: 2026-01-22 00:47:25.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:47:25 compute-1 nova_compute[182713]: 2026-01-22 00:47:25.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:47:25 compute-1 nova_compute[182713]: 2026-01-22 00:47:25.879 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:47:26 compute-1 nova_compute[182713]: 2026-01-22 00:47:26.305 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:26 compute-1 nova_compute[182713]: 2026-01-22 00:47:26.810 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "9c4331d1-5216-4185-af70-efe3dea4e9ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:26 compute-1 nova_compute[182713]: 2026-01-22 00:47:26.810 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:26 compute-1 nova_compute[182713]: 2026-01-22 00:47:26.829 182717 DEBUG nova.compute.manager [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:47:26 compute-1 nova_compute[182713]: 2026-01-22 00:47:26.949 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:26 compute-1 nova_compute[182713]: 2026-01-22 00:47:26.950 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:26 compute-1 nova_compute[182713]: 2026-01-22 00:47:26.956 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:47:26 compute-1 nova_compute[182713]: 2026-01-22 00:47:26.956 182717 INFO nova.compute.claims [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Claim successful on node compute-1.ctlplane.example.com
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.087 182717 DEBUG nova.compute.provider_tree [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.109 182717 DEBUG nova.scheduler.client.report [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.138 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.139 182717 DEBUG nova.compute.manager [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.207 182717 DEBUG nova.compute.manager [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.207 182717 DEBUG nova.network.neutron [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.226 182717 INFO nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.243 182717 DEBUG nova.compute.manager [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.353 182717 DEBUG nova.compute.manager [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.355 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.356 182717 INFO nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Creating image(s)
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.357 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "/var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.358 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "/var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.359 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "/var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.388 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.461 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.462 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.463 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.484 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.538 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.539 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.591 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.593 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.594 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.651 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.652 182717 DEBUG nova.virt.disk.api [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Checking if we can resize image /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.653 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.712 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.713 182717 DEBUG nova.virt.disk.api [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Cannot resize image /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.714 182717 DEBUG nova.objects.instance [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lazy-loading 'migration_context' on Instance uuid 9c4331d1-5216-4185-af70-efe3dea4e9ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.893 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.894 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Ensure instance console log exists: /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.894 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.895 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:27 compute-1 nova_compute[182713]: 2026-01-22 00:47:27.895 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:28 compute-1 nova_compute[182713]: 2026-01-22 00:47:28.673 182717 DEBUG nova.network.neutron [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Successfully created port: 99067540-44f4-4846-b4d9-7da27554294f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:47:29 compute-1 nova_compute[182713]: 2026-01-22 00:47:29.635 182717 DEBUG nova.network.neutron [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Successfully updated port: 99067540-44f4-4846-b4d9-7da27554294f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:47:29 compute-1 nova_compute[182713]: 2026-01-22 00:47:29.648 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "refresh_cache-9c4331d1-5216-4185-af70-efe3dea4e9ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:47:29 compute-1 nova_compute[182713]: 2026-01-22 00:47:29.649 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquired lock "refresh_cache-9c4331d1-5216-4185-af70-efe3dea4e9ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:47:29 compute-1 nova_compute[182713]: 2026-01-22 00:47:29.649 182717 DEBUG nova.network.neutron [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:47:29 compute-1 nova_compute[182713]: 2026-01-22 00:47:29.799 182717 DEBUG nova.compute.manager [req-8404f050-f6cc-4820-8e4d-5066a4e0a558 req-75504d6c-4c6c-46e6-921a-cb08a4ff961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Received event network-changed-99067540-44f4-4846-b4d9-7da27554294f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:47:29 compute-1 nova_compute[182713]: 2026-01-22 00:47:29.800 182717 DEBUG nova.compute.manager [req-8404f050-f6cc-4820-8e4d-5066a4e0a558 req-75504d6c-4c6c-46e6-921a-cb08a4ff961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Refreshing instance network info cache due to event network-changed-99067540-44f4-4846-b4d9-7da27554294f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:47:29 compute-1 nova_compute[182713]: 2026-01-22 00:47:29.800 182717 DEBUG oslo_concurrency.lockutils [req-8404f050-f6cc-4820-8e4d-5066a4e0a558 req-75504d6c-4c6c-46e6-921a-cb08a4ff961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-9c4331d1-5216-4185-af70-efe3dea4e9ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:47:29 compute-1 nova_compute[182713]: 2026-01-22 00:47:29.838 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:29 compute-1 nova_compute[182713]: 2026-01-22 00:47:29.877 182717 DEBUG nova.network.neutron [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.253 182717 DEBUG nova.network.neutron [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Updating instance_info_cache with network_info: [{"id": "99067540-44f4-4846-b4d9-7da27554294f", "address": "fa:16:3e:ed:97:34", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99067540-44", "ovs_interfaceid": "99067540-44f4-4846-b4d9-7da27554294f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.272 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Releasing lock "refresh_cache-9c4331d1-5216-4185-af70-efe3dea4e9ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.273 182717 DEBUG nova.compute.manager [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Instance network_info: |[{"id": "99067540-44f4-4846-b4d9-7da27554294f", "address": "fa:16:3e:ed:97:34", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99067540-44", "ovs_interfaceid": "99067540-44f4-4846-b4d9-7da27554294f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.273 182717 DEBUG oslo_concurrency.lockutils [req-8404f050-f6cc-4820-8e4d-5066a4e0a558 req-75504d6c-4c6c-46e6-921a-cb08a4ff961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-9c4331d1-5216-4185-af70-efe3dea4e9ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.273 182717 DEBUG nova.network.neutron [req-8404f050-f6cc-4820-8e4d-5066a4e0a558 req-75504d6c-4c6c-46e6-921a-cb08a4ff961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Refreshing network info cache for port 99067540-44f4-4846-b4d9-7da27554294f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.276 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Start _get_guest_xml network_info=[{"id": "99067540-44f4-4846-b4d9-7da27554294f", "address": "fa:16:3e:ed:97:34", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99067540-44", "ovs_interfaceid": "99067540-44f4-4846-b4d9-7da27554294f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.281 182717 WARNING nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.293 182717 DEBUG nova.virt.libvirt.host [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.294 182717 DEBUG nova.virt.libvirt.host [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.297 182717 DEBUG nova.virt.libvirt.host [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.298 182717 DEBUG nova.virt.libvirt.host [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.300 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.300 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.301 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.301 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.302 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.302 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.302 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.302 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.303 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.303 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.303 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.304 182717 DEBUG nova.virt.hardware [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.310 182717 DEBUG nova.virt.libvirt.vif [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:47:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-81047553',display_name='tempest-TestServerMultinode-server-81047553',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-81047553',id=185,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ae0051f15c46809f70ec5299cfb2c6',ramdisk_id='',reservation_id='r-tomag5a8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-385846676',owner_user_name='tempest-TestServerMultinode-385846676-
project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:47:27Z,user_data=None,user_id='8fb6fa8c5dd241fb975d0e13ddb107f4',uuid=9c4331d1-5216-4185-af70-efe3dea4e9ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99067540-44f4-4846-b4d9-7da27554294f", "address": "fa:16:3e:ed:97:34", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99067540-44", "ovs_interfaceid": "99067540-44f4-4846-b4d9-7da27554294f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.310 182717 DEBUG nova.network.os_vif_util [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converting VIF {"id": "99067540-44f4-4846-b4d9-7da27554294f", "address": "fa:16:3e:ed:97:34", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99067540-44", "ovs_interfaceid": "99067540-44f4-4846-b4d9-7da27554294f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.312 182717 DEBUG nova.network.os_vif_util [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:97:34,bridge_name='br-int',has_traffic_filtering=True,id=99067540-44f4-4846-b4d9-7da27554294f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99067540-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.313 182717 DEBUG nova.objects.instance [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9c4331d1-5216-4185-af70-efe3dea4e9ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.315 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.334 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:47:31 compute-1 nova_compute[182713]:   <uuid>9c4331d1-5216-4185-af70-efe3dea4e9ab</uuid>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   <name>instance-000000b9</name>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   <memory>131072</memory>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   <vcpu>1</vcpu>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   <metadata>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <nova:name>tempest-TestServerMultinode-server-81047553</nova:name>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <nova:creationTime>2026-01-22 00:47:31</nova:creationTime>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <nova:flavor name="m1.nano">
Jan 22 00:47:31 compute-1 nova_compute[182713]:         <nova:memory>128</nova:memory>
Jan 22 00:47:31 compute-1 nova_compute[182713]:         <nova:disk>1</nova:disk>
Jan 22 00:47:31 compute-1 nova_compute[182713]:         <nova:swap>0</nova:swap>
Jan 22 00:47:31 compute-1 nova_compute[182713]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:47:31 compute-1 nova_compute[182713]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       </nova:flavor>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <nova:owner>
Jan 22 00:47:31 compute-1 nova_compute[182713]:         <nova:user uuid="8fb6fa8c5dd241fb975d0e13ddb107f4">tempest-TestServerMultinode-385846676-project-admin</nova:user>
Jan 22 00:47:31 compute-1 nova_compute[182713]:         <nova:project uuid="38ae0051f15c46809f70ec5299cfb2c6">tempest-TestServerMultinode-385846676</nova:project>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       </nova:owner>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <nova:ports>
Jan 22 00:47:31 compute-1 nova_compute[182713]:         <nova:port uuid="99067540-44f4-4846-b4d9-7da27554294f">
Jan 22 00:47:31 compute-1 nova_compute[182713]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:         </nova:port>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       </nova:ports>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     </nova:instance>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   </metadata>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   <sysinfo type="smbios">
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <system>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <entry name="serial">9c4331d1-5216-4185-af70-efe3dea4e9ab</entry>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <entry name="uuid">9c4331d1-5216-4185-af70-efe3dea4e9ab</entry>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     </system>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   </sysinfo>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   <os>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <boot dev="hd"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <smbios mode="sysinfo"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   </os>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   <features>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <acpi/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <apic/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <vmcoreinfo/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   </features>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   <clock offset="utc">
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <timer name="hpet" present="no"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   </clock>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   <cpu mode="custom" match="exact">
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <model>Nehalem</model>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   </cpu>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   <devices>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <disk type="file" device="disk">
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <target dev="vda" bus="virtio"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <disk type="file" device="cdrom">
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <source file="/var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk.config"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <target dev="sda" bus="sata"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     </disk>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <interface type="ethernet">
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <mac address="fa:16:3e:ed:97:34"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <mtu size="1442"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <target dev="tap99067540-44"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     </interface>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <serial type="pty">
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <log file="/var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/console.log" append="off"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     </serial>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <video>
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <model type="virtio"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     </video>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <input type="tablet" bus="usb"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <rng model="virtio">
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     </rng>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <controller type="usb" index="0"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     <memballoon model="virtio">
Jan 22 00:47:31 compute-1 nova_compute[182713]:       <stats period="10"/>
Jan 22 00:47:31 compute-1 nova_compute[182713]:     </memballoon>
Jan 22 00:47:31 compute-1 nova_compute[182713]:   </devices>
Jan 22 00:47:31 compute-1 nova_compute[182713]: </domain>
Jan 22 00:47:31 compute-1 nova_compute[182713]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.336 182717 DEBUG nova.compute.manager [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Preparing to wait for external event network-vif-plugged-99067540-44f4-4846-b4d9-7da27554294f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.336 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.337 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.337 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.337 182717 DEBUG nova.virt.libvirt.vif [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:47:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-81047553',display_name='tempest-TestServerMultinode-server-81047553',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-81047553',id=185,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ae0051f15c46809f70ec5299cfb2c6',ramdisk_id='',reservation_id='r-tomag5a8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-385846676',owner_user_name='tempest-TestServerMultinode-
385846676-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:47:27Z,user_data=None,user_id='8fb6fa8c5dd241fb975d0e13ddb107f4',uuid=9c4331d1-5216-4185-af70-efe3dea4e9ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99067540-44f4-4846-b4d9-7da27554294f", "address": "fa:16:3e:ed:97:34", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99067540-44", "ovs_interfaceid": "99067540-44f4-4846-b4d9-7da27554294f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.338 182717 DEBUG nova.network.os_vif_util [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converting VIF {"id": "99067540-44f4-4846-b4d9-7da27554294f", "address": "fa:16:3e:ed:97:34", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99067540-44", "ovs_interfaceid": "99067540-44f4-4846-b4d9-7da27554294f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.338 182717 DEBUG nova.network.os_vif_util [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:97:34,bridge_name='br-int',has_traffic_filtering=True,id=99067540-44f4-4846-b4d9-7da27554294f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99067540-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.338 182717 DEBUG os_vif [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:97:34,bridge_name='br-int',has_traffic_filtering=True,id=99067540-44f4-4846-b4d9-7da27554294f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99067540-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.339 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.339 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.340 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.344 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.344 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99067540-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.344 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap99067540-44, col_values=(('external_ids', {'iface-id': '99067540-44f4-4846-b4d9-7da27554294f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ed:97:34', 'vm-uuid': '9c4331d1-5216-4185-af70-efe3dea4e9ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.376 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:31 compute-1 NetworkManager[54952]: <info>  [1769042851.3782] manager: (tap99067540-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/374)
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.379 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.386 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.389 182717 INFO os_vif [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:97:34,bridge_name='br-int',has_traffic_filtering=True,id=99067540-44f4-4846-b4d9-7da27554294f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99067540-44')
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.442 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.443 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.443 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] No VIF found with MAC fa:16:3e:ed:97:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:47:31 compute-1 nova_compute[182713]: 2026-01-22 00:47:31.444 182717 INFO nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Using config drive
Jan 22 00:47:32 compute-1 nova_compute[182713]: 2026-01-22 00:47:32.653 182717 INFO nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Creating config drive at /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk.config
Jan 22 00:47:32 compute-1 nova_compute[182713]: 2026-01-22 00:47:32.659 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph336biob execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:32 compute-1 nova_compute[182713]: 2026-01-22 00:47:32.806 182717 DEBUG oslo_concurrency.processutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph336biob" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:32 compute-1 kernel: tap99067540-44: entered promiscuous mode
Jan 22 00:47:32 compute-1 NetworkManager[54952]: <info>  [1769042852.8940] manager: (tap99067540-44): new Tun device (/org/freedesktop/NetworkManager/Devices/375)
Jan 22 00:47:32 compute-1 ovn_controller[94841]: 2026-01-22T00:47:32Z|00769|binding|INFO|Claiming lport 99067540-44f4-4846-b4d9-7da27554294f for this chassis.
Jan 22 00:47:32 compute-1 nova_compute[182713]: 2026-01-22 00:47:32.941 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:32 compute-1 ovn_controller[94841]: 2026-01-22T00:47:32Z|00770|binding|INFO|99067540-44f4-4846-b4d9-7da27554294f: Claiming fa:16:3e:ed:97:34 10.100.0.10
Jan 22 00:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:32.955 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:97:34 10.100.0.10'], port_security=['fa:16:3e:ed:97:34 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9c4331d1-5216-4185-af70-efe3dea4e9ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae0051f15c46809f70ec5299cfb2c6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '09d74f28-816b-485a-851f-3d27c0c9555a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b227cd0-f221-40a4-86c3-ce27482fa492, chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=99067540-44f4-4846-b4d9-7da27554294f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:32.957 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 99067540-44f4-4846-b4d9-7da27554294f in datapath c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 bound to our chassis
Jan 22 00:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:32.960 104184 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2
Jan 22 00:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:32.975 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4af2fb50-e76f-4ec7-8312-7ba407f1761e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:32.977 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc27f16e8-e1 in ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:32.981 211733 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc27f16e8-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:32.981 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[c38e4bda-6c52-4a94-a918-3dca83a978c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:32 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:32.982 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7afdca08-e55d-4410-a3d9-44f0ee0530fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:32 compute-1 systemd-udevd[245577]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:47:32 compute-1 systemd-machined[153970]: New machine qemu-80-instance-000000b9.
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:32.999 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[a59a9fd3-d1bd-4c15-bcee-282bc7e81d1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 NetworkManager[54952]: <info>  [1769042853.0180] device (tap99067540-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:47:33 compute-1 NetworkManager[54952]: <info>  [1769042853.0194] device (tap99067540-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.030 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.031 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[15e8b11a-8d64-40a9-a42c-5fc73577fd26]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 systemd[1]: Started Virtual Machine qemu-80-instance-000000b9.
Jan 22 00:47:33 compute-1 ovn_controller[94841]: 2026-01-22T00:47:33Z|00771|binding|INFO|Setting lport 99067540-44f4-4846-b4d9-7da27554294f ovn-installed in OVS
Jan 22 00:47:33 compute-1 ovn_controller[94841]: 2026-01-22T00:47:33Z|00772|binding|INFO|Setting lport 99067540-44f4-4846-b4d9-7da27554294f up in Southbound
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.036 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.063 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[041dc232-bf28-486f-bb2f-d4fcd6b071ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.069 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[308ea676-4102-456c-aab5-d676ba3ba9f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 NetworkManager[54952]: <info>  [1769042853.0702] manager: (tapc27f16e8-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/376)
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.102 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[a7bab111-10b7-4687-84bd-a43f54193d58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.106 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[e938423e-4c74-4414-9620-0850cef67e0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 NetworkManager[54952]: <info>  [1769042853.1318] device (tapc27f16e8-e0): carrier: link connected
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.138 211754 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b831df-1bf2-42c8-9ad9-1ea7edcd0690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.157 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4124afc4-b25d-4767-95d1-b9cd8ac1a4de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc27f16e8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:14:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 742578, 'reachable_time': 27359, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245608, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.177 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[20271756-e8bc-4a45-bd78-5c1564c156eb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:144f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 742578, 'tstamp': 742578}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245609, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.198 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[7f9575d9-aa6c-4963-896d-8df255667798]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc27f16e8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:14:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 742578, 'reachable_time': 27359, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245610, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.245 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d8b73f-d566-41c6-ae1e-7b06ba196e5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.321 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[04afb04e-629c-41dc-ba5d-6d3f0998e819]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.322 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc27f16e8-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.323 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.323 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc27f16e8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:33 compute-1 NetworkManager[54952]: <info>  [1769042853.3276] manager: (tapc27f16e8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/377)
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.326 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:33 compute-1 kernel: tapc27f16e8-e0: entered promiscuous mode
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.331 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc27f16e8-e0, col_values=(('external_ids', {'iface-id': '8c4d0320-cbc0-4761-8fbc-cd4251890b14'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:33 compute-1 ovn_controller[94841]: 2026-01-22T00:47:33Z|00773|binding|INFO|Releasing lport 8c4d0320-cbc0-4761-8fbc-cd4251890b14 from this chassis (sb_readonly=0)
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.351 104184 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.349 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.351 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[8f8edfdb-d793-4d20-a3b2-fc3d7376910f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.353 104184 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: global
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     log         /dev/log local0 debug
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     log-tag     haproxy-metadata-proxy-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     user        root
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     group       root
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     maxconn     1024
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     pidfile     /var/lib/neutron/external/pids/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.pid.haproxy
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     daemon
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: defaults
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     log global
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     mode http
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     option httplog
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     option dontlognull
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     option http-server-close
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     option forwardfor
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     retries                 3
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     timeout http-request    30s
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     timeout connect         30s
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     timeout client          32s
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     timeout server          32s
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     timeout http-keep-alive 30s
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: listen listener
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     bind 169.254.169.254:80
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:     http-request add-header X-OVN-Network-ID c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:47:33 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:33.353 104184 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'env', 'PROCESS_TAG=haproxy-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.453 182717 DEBUG nova.compute.manager [req-2adc898b-7099-42cf-a6fc-c065ebceee8f req-1f3ee59c-6ea5-4f59-b171-b0120c8fe83f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Received event network-vif-plugged-99067540-44f4-4846-b4d9-7da27554294f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.459 182717 DEBUG oslo_concurrency.lockutils [req-2adc898b-7099-42cf-a6fc-c065ebceee8f req-1f3ee59c-6ea5-4f59-b171-b0120c8fe83f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.461 182717 DEBUG oslo_concurrency.lockutils [req-2adc898b-7099-42cf-a6fc-c065ebceee8f req-1f3ee59c-6ea5-4f59-b171-b0120c8fe83f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.461 182717 DEBUG oslo_concurrency.lockutils [req-2adc898b-7099-42cf-a6fc-c065ebceee8f req-1f3ee59c-6ea5-4f59-b171-b0120c8fe83f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.462 182717 DEBUG nova.compute.manager [req-2adc898b-7099-42cf-a6fc-c065ebceee8f req-1f3ee59c-6ea5-4f59-b171-b0120c8fe83f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Processing event network-vif-plugged-99067540-44f4-4846-b4d9-7da27554294f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.495 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042853.494325, 9c4331d1-5216-4185-af70-efe3dea4e9ab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.496 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] VM Started (Lifecycle Event)
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.499 182717 DEBUG nova.compute.manager [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.504 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.509 182717 INFO nova.virt.libvirt.driver [-] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Instance spawned successfully.
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.510 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.528 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.538 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.546 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.547 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.548 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.549 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.550 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.551 182717 DEBUG nova.virt.libvirt.driver [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.561 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.561 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042853.4946187, 9c4331d1-5216-4185-af70-efe3dea4e9ab => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.561 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] VM Paused (Lifecycle Event)
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.590 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.592 182717 DEBUG nova.network.neutron [req-8404f050-f6cc-4820-8e4d-5066a4e0a558 req-75504d6c-4c6c-46e6-921a-cb08a4ff961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Updated VIF entry in instance network info cache for port 99067540-44f4-4846-b4d9-7da27554294f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.593 182717 DEBUG nova.network.neutron [req-8404f050-f6cc-4820-8e4d-5066a4e0a558 req-75504d6c-4c6c-46e6-921a-cb08a4ff961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Updating instance_info_cache with network_info: [{"id": "99067540-44f4-4846-b4d9-7da27554294f", "address": "fa:16:3e:ed:97:34", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99067540-44", "ovs_interfaceid": "99067540-44f4-4846-b4d9-7da27554294f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.599 182717 DEBUG nova.virt.driver [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] Emitting event <LifecycleEvent: 1769042853.5030932, 9c4331d1-5216-4185-af70-efe3dea4e9ab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.599 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] VM Resumed (Lifecycle Event)
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.616 182717 DEBUG oslo_concurrency.lockutils [req-8404f050-f6cc-4820-8e4d-5066a4e0a558 req-75504d6c-4c6c-46e6-921a-cb08a4ff961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-9c4331d1-5216-4185-af70-efe3dea4e9ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.619 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.622 182717 DEBUG nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.649 182717 INFO nova.compute.manager [None req-c072f6d6-30f3-4493-9c09-d32fd671dc5b - - - - - -] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.651 182717 INFO nova.compute.manager [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Took 6.30 seconds to spawn the instance on the hypervisor.
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.651 182717 DEBUG nova.compute.manager [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.746 182717 INFO nova.compute.manager [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Took 6.83 seconds to build instance.
Jan 22 00:47:33 compute-1 nova_compute[182713]: 2026-01-22 00:47:33.763 182717 DEBUG oslo_concurrency.lockutils [None req-fb66fe15-a997-43ab-84c3-d6d8bc25af3a 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.953s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:33 compute-1 podman[245649]: 2026-01-22 00:47:33.853226909 +0000 UTC m=+0.076597894 container create bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:47:33 compute-1 podman[245649]: 2026-01-22 00:47:33.812126086 +0000 UTC m=+0.035497151 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:47:33 compute-1 systemd[1]: Started libpod-conmon-bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34.scope.
Jan 22 00:47:33 compute-1 systemd[1]: Started libcrun container.
Jan 22 00:47:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c23ac8b696592243e0abf38b0e02989532458e3c90dfaa67fe89c3b2aab0413d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:47:33 compute-1 podman[245649]: 2026-01-22 00:47:33.961166354 +0000 UTC m=+0.184537339 container init bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:47:33 compute-1 podman[245662]: 2026-01-22 00:47:33.961494035 +0000 UTC m=+0.065497151 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:47:33 compute-1 podman[245649]: 2026-01-22 00:47:33.966933063 +0000 UTC m=+0.190304028 container start bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 00:47:33 compute-1 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245683]: [NOTICE]   (245710) : New worker (245716) forked
Jan 22 00:47:33 compute-1 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245683]: [NOTICE]   (245710) : Loading success.
Jan 22 00:47:33 compute-1 podman[245661]: 2026-01-22 00:47:33.998924364 +0000 UTC m=+0.110997940 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:47:35 compute-1 nova_compute[182713]: 2026-01-22 00:47:35.566 182717 DEBUG nova.compute.manager [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Received event network-vif-plugged-99067540-44f4-4846-b4d9-7da27554294f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:47:35 compute-1 nova_compute[182713]: 2026-01-22 00:47:35.567 182717 DEBUG oslo_concurrency.lockutils [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:35 compute-1 nova_compute[182713]: 2026-01-22 00:47:35.567 182717 DEBUG oslo_concurrency.lockutils [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:35 compute-1 nova_compute[182713]: 2026-01-22 00:47:35.567 182717 DEBUG oslo_concurrency.lockutils [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:35 compute-1 nova_compute[182713]: 2026-01-22 00:47:35.568 182717 DEBUG nova.compute.manager [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] No waiting events found dispatching network-vif-plugged-99067540-44f4-4846-b4d9-7da27554294f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:47:35 compute-1 nova_compute[182713]: 2026-01-22 00:47:35.568 182717 WARNING nova.compute.manager [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Received unexpected event network-vif-plugged-99067540-44f4-4846-b4d9-7da27554294f for instance with vm_state active and task_state None.
Jan 22 00:47:36 compute-1 nova_compute[182713]: 2026-01-22 00:47:36.312 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:36 compute-1 nova_compute[182713]: 2026-01-22 00:47:36.376 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:37 compute-1 sshd-session[245729]: Invalid user node from 92.118.39.95 port 41078
Jan 22 00:47:37 compute-1 sshd-session[245729]: Connection closed by invalid user node 92.118.39.95 port 41078 [preauth]
Jan 22 00:47:40 compute-1 podman[245732]: 2026-01-22 00:47:40.581970468 +0000 UTC m=+0.065351526 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:47:40 compute-1 podman[245731]: 2026-01-22 00:47:40.585306921 +0000 UTC m=+0.068336708 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:47:41 compute-1 nova_compute[182713]: 2026-01-22 00:47:41.318 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-1 nova_compute[182713]: 2026-01-22 00:47:41.378 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:45 compute-1 ovn_controller[94841]: 2026-01-22T00:47:45Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ed:97:34 10.100.0.10
Jan 22 00:47:45 compute-1 ovn_controller[94841]: 2026-01-22T00:47:45Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ed:97:34 10.100.0.10
Jan 22 00:47:46 compute-1 nova_compute[182713]: 2026-01-22 00:47:46.321 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:46 compute-1 nova_compute[182713]: 2026-01-22 00:47:46.379 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:51 compute-1 nova_compute[182713]: 2026-01-22 00:47:51.371 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:51 compute-1 nova_compute[182713]: 2026-01-22 00:47:51.380 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:53 compute-1 podman[245790]: 2026-01-22 00:47:53.59820203 +0000 UTC m=+0.083133356 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 00:47:53 compute-1 podman[245791]: 2026-01-22 00:47:53.598944464 +0000 UTC m=+0.078283347 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 22 00:47:54 compute-1 nova_compute[182713]: 2026-01-22 00:47:54.758 182717 DEBUG oslo_concurrency.lockutils [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "9c4331d1-5216-4185-af70-efe3dea4e9ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:54 compute-1 nova_compute[182713]: 2026-01-22 00:47:54.758 182717 DEBUG oslo_concurrency.lockutils [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:54 compute-1 nova_compute[182713]: 2026-01-22 00:47:54.759 182717 DEBUG oslo_concurrency.lockutils [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:54 compute-1 nova_compute[182713]: 2026-01-22 00:47:54.759 182717 DEBUG oslo_concurrency.lockutils [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:54 compute-1 nova_compute[182713]: 2026-01-22 00:47:54.760 182717 DEBUG oslo_concurrency.lockutils [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:54 compute-1 nova_compute[182713]: 2026-01-22 00:47:54.848 182717 INFO nova.compute.manager [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Terminating instance
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.664 182717 DEBUG nova.compute.manager [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:47:55 compute-1 kernel: tap99067540-44 (unregistering): left promiscuous mode
Jan 22 00:47:55 compute-1 NetworkManager[54952]: <info>  [1769042875.6885] device (tap99067540-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.699 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:55 compute-1 ovn_controller[94841]: 2026-01-22T00:47:55Z|00774|binding|INFO|Releasing lport 99067540-44f4-4846-b4d9-7da27554294f from this chassis (sb_readonly=0)
Jan 22 00:47:55 compute-1 ovn_controller[94841]: 2026-01-22T00:47:55Z|00775|binding|INFO|Setting lport 99067540-44f4-4846-b4d9-7da27554294f down in Southbound
Jan 22 00:47:55 compute-1 ovn_controller[94841]: 2026-01-22T00:47:55Z|00776|binding|INFO|Removing iface tap99067540-44 ovn-installed in OVS
Jan 22 00:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:55.708 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:97:34 10.100.0.10'], port_security=['fa:16:3e:ed:97:34 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9c4331d1-5216-4185-af70-efe3dea4e9ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae0051f15c46809f70ec5299cfb2c6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '09d74f28-816b-485a-851f-3d27c0c9555a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b227cd0-f221-40a4-86c3-ce27482fa492, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>], logical_port=99067540-44f4-4846-b4d9-7da27554294f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efce36eba30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:55.709 104184 INFO neutron.agent.ovn.metadata.agent [-] Port 99067540-44f4-4846-b4d9-7da27554294f in datapath c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 unbound from our chassis
Jan 22 00:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:55.710 104184 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:55.713 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3f337c-44c9-4211-961e-28eaa664687c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:55 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:55.714 104184 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 namespace which is not needed anymore
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.734 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:55 compute-1 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000b9.scope: Deactivated successfully.
Jan 22 00:47:55 compute-1 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000b9.scope: Consumed 13.224s CPU time.
Jan 22 00:47:55 compute-1 systemd-machined[153970]: Machine qemu-80-instance-000000b9 terminated.
Jan 22 00:47:55 compute-1 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245683]: [NOTICE]   (245710) : haproxy version is 2.8.14-c23fe91
Jan 22 00:47:55 compute-1 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245683]: [NOTICE]   (245710) : path to executable is /usr/sbin/haproxy
Jan 22 00:47:55 compute-1 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245683]: [WARNING]  (245710) : Exiting Master process...
Jan 22 00:47:55 compute-1 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245683]: [ALERT]    (245710) : Current worker (245716) exited with code 143 (Terminated)
Jan 22 00:47:55 compute-1 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245683]: [WARNING]  (245710) : All workers exited. Exiting... (0)
Jan 22 00:47:55 compute-1 systemd[1]: libpod-bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34.scope: Deactivated successfully.
Jan 22 00:47:55 compute-1 podman[245852]: 2026-01-22 00:47:55.855399312 +0000 UTC m=+0.050632370 container died bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:47:55 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34-userdata-shm.mount: Deactivated successfully.
Jan 22 00:47:55 compute-1 systemd[1]: var-lib-containers-storage-overlay-c23ac8b696592243e0abf38b0e02989532458e3c90dfaa67fe89c3b2aab0413d-merged.mount: Deactivated successfully.
Jan 22 00:47:55 compute-1 podman[245852]: 2026-01-22 00:47:55.913635166 +0000 UTC m=+0.108868204 container cleanup bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:47:55 compute-1 systemd[1]: libpod-conmon-bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34.scope: Deactivated successfully.
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.942 182717 INFO nova.virt.libvirt.driver [-] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Instance destroyed successfully.
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.943 182717 DEBUG nova.objects.instance [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lazy-loading 'resources' on Instance uuid 9c4331d1-5216-4185-af70-efe3dea4e9ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.967 182717 DEBUG nova.virt.libvirt.vif [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:47:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-81047553',display_name='tempest-TestServerMultinode-server-81047553',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-81047553',id=185,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:47:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='38ae0051f15c46809f70ec5299cfb2c6',ramdisk_id='',reservation_id='r-tomag5a8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-385846676',owner_user_name='tempest-TestServerMultinode-385846676-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:47:33Z,user_data=None,user_id='8fb6fa8c5dd241fb975d0e13ddb107f4',uuid=9c4331d1-5216-4185-af70-efe3dea4e9ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "99067540-44f4-4846-b4d9-7da27554294f", "address": "fa:16:3e:ed:97:34", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99067540-44", "ovs_interfaceid": "99067540-44f4-4846-b4d9-7da27554294f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.967 182717 DEBUG nova.network.os_vif_util [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converting VIF {"id": "99067540-44f4-4846-b4d9-7da27554294f", "address": "fa:16:3e:ed:97:34", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99067540-44", "ovs_interfaceid": "99067540-44f4-4846-b4d9-7da27554294f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.968 182717 DEBUG nova.network.os_vif_util [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:97:34,bridge_name='br-int',has_traffic_filtering=True,id=99067540-44f4-4846-b4d9-7da27554294f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99067540-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.968 182717 DEBUG os_vif [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:97:34,bridge_name='br-int',has_traffic_filtering=True,id=99067540-44f4-4846-b4d9-7da27554294f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99067540-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.970 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.970 182717 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99067540-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.972 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.974 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.977 182717 INFO os_vif [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:97:34,bridge_name='br-int',has_traffic_filtering=True,id=99067540-44f4-4846-b4d9-7da27554294f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99067540-44')
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.977 182717 INFO nova.virt.libvirt.driver [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Deleting instance files /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab_del
Jan 22 00:47:55 compute-1 nova_compute[182713]: 2026-01-22 00:47:55.978 182717 INFO nova.virt.libvirt.driver [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Deletion of /var/lib/nova/instances/9c4331d1-5216-4185-af70-efe3dea4e9ab_del complete
Jan 22 00:47:56 compute-1 podman[245897]: 2026-01-22 00:47:56.067429502 +0000 UTC m=+0.125688346 container remove bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 22 00:47:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:56.073 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[fdc39f5d-7ed5-4ba0-909c-69982b62c20a]: (4, ('Thu Jan 22 12:47:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 (bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34)\nbb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34\nThu Jan 22 12:47:55 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 (bb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34)\nbb354b05da4f97c968a465554c268da25b781ffd710222c90a1339fb0e1e3f34\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:56.075 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[01cef99a-49bb-4313-a2b1-0d6257c973e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:56.076 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc27f16e8-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:56 compute-1 nova_compute[182713]: 2026-01-22 00:47:56.078 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:56 compute-1 kernel: tapc27f16e8-e0: left promiscuous mode
Jan 22 00:47:56 compute-1 nova_compute[182713]: 2026-01-22 00:47:56.094 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:56.098 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[a7d74bd5-1ca1-416f-b32e-0b5d8aa44040]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:56.120 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[84fba6cb-8534-43b2-a64f-66ae9aa63ce1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:56.121 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[4b8e0c07-5c15-4c0f-bee9-c9f945943700]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:56.139 211733 DEBUG oslo.privsep.daemon [-] privsep: reply[f4700532-8c6c-492c-984f-1370189fb125]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 742571, 'reachable_time': 24372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245915, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:56.142 104576 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:47:56 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:47:56.143 104576 DEBUG oslo.privsep.daemon [-] privsep: reply[fb79a4c1-33f0-4d6f-88ae-51ae0afda4cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:56 compute-1 systemd[1]: run-netns-ovnmeta\x2dc27f16e8\x2de7ea\x2d4ce6\x2d8fc8\x2d52a4d97170f2.mount: Deactivated successfully.
Jan 22 00:47:56 compute-1 nova_compute[182713]: 2026-01-22 00:47:56.375 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:58 compute-1 nova_compute[182713]: 2026-01-22 00:47:58.609 182717 DEBUG nova.compute.manager [req-76831e1e-75c0-4417-a919-0d017419d251 req-16c431f1-13a7-4602-a90b-7397bbe26c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Received event network-vif-unplugged-99067540-44f4-4846-b4d9-7da27554294f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:47:58 compute-1 nova_compute[182713]: 2026-01-22 00:47:58.609 182717 DEBUG oslo_concurrency.lockutils [req-76831e1e-75c0-4417-a919-0d017419d251 req-16c431f1-13a7-4602-a90b-7397bbe26c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:58 compute-1 nova_compute[182713]: 2026-01-22 00:47:58.610 182717 DEBUG oslo_concurrency.lockutils [req-76831e1e-75c0-4417-a919-0d017419d251 req-16c431f1-13a7-4602-a90b-7397bbe26c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:58 compute-1 nova_compute[182713]: 2026-01-22 00:47:58.610 182717 DEBUG oslo_concurrency.lockutils [req-76831e1e-75c0-4417-a919-0d017419d251 req-16c431f1-13a7-4602-a90b-7397bbe26c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:58 compute-1 nova_compute[182713]: 2026-01-22 00:47:58.611 182717 DEBUG nova.compute.manager [req-76831e1e-75c0-4417-a919-0d017419d251 req-16c431f1-13a7-4602-a90b-7397bbe26c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] No waiting events found dispatching network-vif-unplugged-99067540-44f4-4846-b4d9-7da27554294f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:47:58 compute-1 nova_compute[182713]: 2026-01-22 00:47:58.611 182717 DEBUG nova.compute.manager [req-76831e1e-75c0-4417-a919-0d017419d251 req-16c431f1-13a7-4602-a90b-7397bbe26c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Received event network-vif-unplugged-99067540-44f4-4846-b4d9-7da27554294f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:47:58 compute-1 nova_compute[182713]: 2026-01-22 00:47:58.624 182717 INFO nova.compute.manager [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Took 2.96 seconds to destroy the instance on the hypervisor.
Jan 22 00:47:58 compute-1 nova_compute[182713]: 2026-01-22 00:47:58.624 182717 DEBUG oslo.service.loopingcall [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:47:58 compute-1 nova_compute[182713]: 2026-01-22 00:47:58.625 182717 DEBUG nova.compute.manager [-] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:47:58 compute-1 nova_compute[182713]: 2026-01-22 00:47:58.625 182717 DEBUG nova.network.neutron [-] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:48:00 compute-1 nova_compute[182713]: 2026-01-22 00:48:00.974 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:01 compute-1 nova_compute[182713]: 2026-01-22 00:48:01.422 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:01 compute-1 nova_compute[182713]: 2026-01-22 00:48:01.830 182717 DEBUG nova.compute.manager [req-186bd36f-0118-44ae-95d9-fefed57932ef req-6f0df382-175e-434a-a265-60d374cb74bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Received event network-vif-plugged-99067540-44f4-4846-b4d9-7da27554294f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:48:01 compute-1 nova_compute[182713]: 2026-01-22 00:48:01.830 182717 DEBUG oslo_concurrency.lockutils [req-186bd36f-0118-44ae-95d9-fefed57932ef req-6f0df382-175e-434a-a265-60d374cb74bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:01 compute-1 nova_compute[182713]: 2026-01-22 00:48:01.830 182717 DEBUG oslo_concurrency.lockutils [req-186bd36f-0118-44ae-95d9-fefed57932ef req-6f0df382-175e-434a-a265-60d374cb74bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:01 compute-1 nova_compute[182713]: 2026-01-22 00:48:01.831 182717 DEBUG oslo_concurrency.lockutils [req-186bd36f-0118-44ae-95d9-fefed57932ef req-6f0df382-175e-434a-a265-60d374cb74bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:01 compute-1 nova_compute[182713]: 2026-01-22 00:48:01.831 182717 DEBUG nova.compute.manager [req-186bd36f-0118-44ae-95d9-fefed57932ef req-6f0df382-175e-434a-a265-60d374cb74bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] No waiting events found dispatching network-vif-plugged-99067540-44f4-4846-b4d9-7da27554294f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:48:01 compute-1 nova_compute[182713]: 2026-01-22 00:48:01.831 182717 WARNING nova.compute.manager [req-186bd36f-0118-44ae-95d9-fefed57932ef req-6f0df382-175e-434a-a265-60d374cb74bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Received unexpected event network-vif-plugged-99067540-44f4-4846-b4d9-7da27554294f for instance with vm_state active and task_state deleting.
Jan 22 00:48:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:48:02.389 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:48:02 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:48:02.390 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:48:02 compute-1 nova_compute[182713]: 2026-01-22 00:48:02.390 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:02 compute-1 nova_compute[182713]: 2026-01-22 00:48:02.423 182717 DEBUG nova.network.neutron [-] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:48:02 compute-1 nova_compute[182713]: 2026-01-22 00:48:02.442 182717 INFO nova.compute.manager [-] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Took 3.82 seconds to deallocate network for instance.
Jan 22 00:48:02 compute-1 nova_compute[182713]: 2026-01-22 00:48:02.534 182717 DEBUG oslo_concurrency.lockutils [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:02 compute-1 nova_compute[182713]: 2026-01-22 00:48:02.535 182717 DEBUG oslo_concurrency.lockutils [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:02 compute-1 nova_compute[182713]: 2026-01-22 00:48:02.557 182717 DEBUG nova.compute.manager [req-285c1b7c-67cd-47da-94c2-26faf99326cb req-d2ee015a-5cbd-4456-a16a-d791fb1dcc04 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Received event network-vif-deleted-99067540-44f4-4846-b4d9-7da27554294f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:48:02 compute-1 nova_compute[182713]: 2026-01-22 00:48:02.602 182717 DEBUG nova.compute.provider_tree [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:48:02 compute-1 nova_compute[182713]: 2026-01-22 00:48:02.632 182717 DEBUG nova.scheduler.client.report [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:48:02 compute-1 nova_compute[182713]: 2026-01-22 00:48:02.851 182717 DEBUG oslo_concurrency.lockutils [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:02 compute-1 nova_compute[182713]: 2026-01-22 00:48:02.895 182717 INFO nova.scheduler.client.report [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Deleted allocations for instance 9c4331d1-5216-4185-af70-efe3dea4e9ab
Jan 22 00:48:03 compute-1 nova_compute[182713]: 2026-01-22 00:48:03.018 182717 DEBUG oslo_concurrency.lockutils [None req-7b4ee34e-1d3b-4ed0-bafd-2ef93f0be17d 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "9c4331d1-5216-4185-af70-efe3dea4e9ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:48:03.065 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:48:03.065 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:48:03.066 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:04 compute-1 podman[245917]: 2026-01-22 00:48:04.597226567 +0000 UTC m=+0.078409221 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:48:04 compute-1 podman[245916]: 2026-01-22 00:48:04.64379056 +0000 UTC m=+0.131412964 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:48:06 compute-1 nova_compute[182713]: 2026-01-22 00:48:06.015 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:06 compute-1 nova_compute[182713]: 2026-01-22 00:48:06.058 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:06 compute-1 nova_compute[182713]: 2026-01-22 00:48:06.425 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:08 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:48:08.393 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:48:09 compute-1 nova_compute[182713]: 2026-01-22 00:48:09.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:10 compute-1 nova_compute[182713]: 2026-01-22 00:48:10.940 182717 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042875.9389486, 9c4331d1-5216-4185-af70-efe3dea4e9ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:48:10 compute-1 nova_compute[182713]: 2026-01-22 00:48:10.941 182717 INFO nova.compute.manager [-] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] VM Stopped (Lifecycle Event)
Jan 22 00:48:10 compute-1 nova_compute[182713]: 2026-01-22 00:48:10.965 182717 DEBUG nova.compute.manager [None req-7c9ad99a-6a5b-454c-9060-760bd3640290 - - - - - -] [instance: 9c4331d1-5216-4185-af70-efe3dea4e9ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:48:11 compute-1 nova_compute[182713]: 2026-01-22 00:48:11.018 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:11 compute-1 nova_compute[182713]: 2026-01-22 00:48:11.428 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:11 compute-1 podman[245966]: 2026-01-22 00:48:11.576138478 +0000 UTC m=+0.059947979 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:48:11 compute-1 podman[245965]: 2026-01-22 00:48:11.593093974 +0000 UTC m=+0.077863244 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 00:48:14 compute-1 nova_compute[182713]: 2026-01-22 00:48:14.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:16 compute-1 nova_compute[182713]: 2026-01-22 00:48:16.022 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:16 compute-1 nova_compute[182713]: 2026-01-22 00:48:16.462 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:16 compute-1 nova_compute[182713]: 2026-01-22 00:48:16.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:16 compute-1 nova_compute[182713]: 2026-01-22 00:48:16.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:16 compute-1 nova_compute[182713]: 2026-01-22 00:48:16.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:48:16 compute-1 nova_compute[182713]: 2026-01-22 00:48:16.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:19 compute-1 nova_compute[182713]: 2026-01-22 00:48:19.870 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:19 compute-1 nova_compute[182713]: 2026-01-22 00:48:19.871 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:48:19 compute-1 nova_compute[182713]: 2026-01-22 00:48:19.895 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:48:20 compute-1 nova_compute[182713]: 2026-01-22 00:48:20.881 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:21 compute-1 nova_compute[182713]: 2026-01-22 00:48:21.070 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:21 compute-1 nova_compute[182713]: 2026-01-22 00:48:21.464 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:21 compute-1 nova_compute[182713]: 2026-01-22 00:48:21.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:21 compute-1 nova_compute[182713]: 2026-01-22 00:48:21.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:21 compute-1 nova_compute[182713]: 2026-01-22 00:48:21.923 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:21 compute-1 nova_compute[182713]: 2026-01-22 00:48:21.923 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:21 compute-1 nova_compute[182713]: 2026-01-22 00:48:21.923 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:21 compute-1 nova_compute[182713]: 2026-01-22 00:48:21.923 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.103 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.104 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5713MB free_disk=73.17743301391602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.105 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.105 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.245 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.246 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.359 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.398 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.431 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.431 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.875 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.875 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:22 compute-1 nova_compute[182713]: 2026-01-22 00:48:22.875 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.897 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.898 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.898 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:48:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:24 compute-1 podman[246011]: 2026-01-22 00:48:24.605162828 +0000 UTC m=+0.086271134 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible)
Jan 22 00:48:24 compute-1 podman[246010]: 2026-01-22 00:48:24.6291141 +0000 UTC m=+0.116132009 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 00:48:25 compute-1 nova_compute[182713]: 2026-01-22 00:48:25.881 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:25 compute-1 nova_compute[182713]: 2026-01-22 00:48:25.881 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:48:25 compute-1 nova_compute[182713]: 2026-01-22 00:48:25.881 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:48:25 compute-1 nova_compute[182713]: 2026-01-22 00:48:25.906 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:48:26 compute-1 nova_compute[182713]: 2026-01-22 00:48:26.081 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:26 compute-1 nova_compute[182713]: 2026-01-22 00:48:26.479 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:31 compute-1 nova_compute[182713]: 2026-01-22 00:48:31.094 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:31 compute-1 nova_compute[182713]: 2026-01-22 00:48:31.481 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:35 compute-1 podman[246051]: 2026-01-22 00:48:35.563662461 +0000 UTC m=+0.051276491 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:48:35 compute-1 podman[246050]: 2026-01-22 00:48:35.637745336 +0000 UTC m=+0.128416640 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 22 00:48:36 compute-1 nova_compute[182713]: 2026-01-22 00:48:36.097 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:36 compute-1 nova_compute[182713]: 2026-01-22 00:48:36.484 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:37 compute-1 sshd-session[246101]: Invalid user ubuntu from 203.83.238.251 port 50428
Jan 22 00:48:37 compute-1 sshd-session[246101]: Received disconnect from 203.83.238.251 port 50428:11:  [preauth]
Jan 22 00:48:37 compute-1 sshd-session[246101]: Disconnected from invalid user ubuntu 203.83.238.251 port 50428 [preauth]
Jan 22 00:48:41 compute-1 nova_compute[182713]: 2026-01-22 00:48:41.100 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:41 compute-1 ovn_controller[94841]: 2026-01-22T00:48:41Z|00777|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 22 00:48:41 compute-1 nova_compute[182713]: 2026-01-22 00:48:41.486 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:42 compute-1 podman[246103]: 2026-01-22 00:48:42.601232558 +0000 UTC m=+0.084138328 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 22 00:48:42 compute-1 podman[246104]: 2026-01-22 00:48:42.601385483 +0000 UTC m=+0.078718950 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:48:46 compute-1 nova_compute[182713]: 2026-01-22 00:48:46.104 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:46 compute-1 nova_compute[182713]: 2026-01-22 00:48:46.488 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:51 compute-1 nova_compute[182713]: 2026-01-22 00:48:51.107 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:51 compute-1 nova_compute[182713]: 2026-01-22 00:48:51.490 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:55 compute-1 podman[246145]: 2026-01-22 00:48:55.572662172 +0000 UTC m=+0.069972929 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:48:55 compute-1 podman[246146]: 2026-01-22 00:48:55.58773173 +0000 UTC m=+0.072908471 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7)
Jan 22 00:48:56 compute-1 nova_compute[182713]: 2026-01-22 00:48:56.110 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:56 compute-1 nova_compute[182713]: 2026-01-22 00:48:56.491 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:01 compute-1 nova_compute[182713]: 2026-01-22 00:49:01.112 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:01 compute-1 nova_compute[182713]: 2026-01-22 00:49:01.233 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:01 compute-1 nova_compute[182713]: 2026-01-22 00:49:01.493 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:49:03.066 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:49:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:49:03.066 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:49:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:49:03.066 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:06 compute-1 nova_compute[182713]: 2026-01-22 00:49:06.115 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:06 compute-1 nova_compute[182713]: 2026-01-22 00:49:06.495 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:06 compute-1 podman[246188]: 2026-01-22 00:49:06.556328593 +0000 UTC m=+0.052640932 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:49:06 compute-1 podman[246187]: 2026-01-22 00:49:06.580030318 +0000 UTC m=+0.077669999 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:49:09 compute-1 nova_compute[182713]: 2026-01-22 00:49:09.886 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:11 compute-1 nova_compute[182713]: 2026-01-22 00:49:11.117 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:11 compute-1 nova_compute[182713]: 2026-01-22 00:49:11.497 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:13 compute-1 podman[246236]: 2026-01-22 00:49:13.597521755 +0000 UTC m=+0.091391053 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:49:13 compute-1 podman[246237]: 2026-01-22 00:49:13.605908134 +0000 UTC m=+0.086930134 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:49:14 compute-1 nova_compute[182713]: 2026-01-22 00:49:14.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:16 compute-1 nova_compute[182713]: 2026-01-22 00:49:16.120 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:16 compute-1 nova_compute[182713]: 2026-01-22 00:49:16.500 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:16 compute-1 nova_compute[182713]: 2026-01-22 00:49:16.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:16 compute-1 nova_compute[182713]: 2026-01-22 00:49:16.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:49:18 compute-1 nova_compute[182713]: 2026-01-22 00:49:18.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:21 compute-1 nova_compute[182713]: 2026-01-22 00:49:21.123 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:21 compute-1 nova_compute[182713]: 2026-01-22 00:49:21.503 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:21 compute-1 nova_compute[182713]: 2026-01-22 00:49:21.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:21 compute-1 nova_compute[182713]: 2026-01-22 00:49:21.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:22 compute-1 nova_compute[182713]: 2026-01-22 00:49:22.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:22 compute-1 nova_compute[182713]: 2026-01-22 00:49:22.896 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:49:22 compute-1 nova_compute[182713]: 2026-01-22 00:49:22.897 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:49:22 compute-1 nova_compute[182713]: 2026-01-22 00:49:22.898 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:22 compute-1 nova_compute[182713]: 2026-01-22 00:49:22.898 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:49:23 compute-1 nova_compute[182713]: 2026-01-22 00:49:23.117 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:49:23 compute-1 nova_compute[182713]: 2026-01-22 00:49:23.119 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5718MB free_disk=73.17724227905273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:49:23 compute-1 nova_compute[182713]: 2026-01-22 00:49:23.120 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:49:23 compute-1 nova_compute[182713]: 2026-01-22 00:49:23.120 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:49:23 compute-1 nova_compute[182713]: 2026-01-22 00:49:23.200 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:49:23 compute-1 nova_compute[182713]: 2026-01-22 00:49:23.201 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:49:23 compute-1 nova_compute[182713]: 2026-01-22 00:49:23.225 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:49:23 compute-1 nova_compute[182713]: 2026-01-22 00:49:23.241 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:49:23 compute-1 nova_compute[182713]: 2026-01-22 00:49:23.243 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:49:23 compute-1 nova_compute[182713]: 2026-01-22 00:49:23.244 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:24 compute-1 nova_compute[182713]: 2026-01-22 00:49:24.244 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:26 compute-1 nova_compute[182713]: 2026-01-22 00:49:26.126 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:26 compute-1 nova_compute[182713]: 2026-01-22 00:49:26.509 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:26 compute-1 podman[246278]: 2026-01-22 00:49:26.575430692 +0000 UTC m=+0.063019163 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute)
Jan 22 00:49:26 compute-1 podman[246279]: 2026-01-22 00:49:26.608840807 +0000 UTC m=+0.099516624 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, config_id=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350)
Jan 22 00:49:26 compute-1 nova_compute[182713]: 2026-01-22 00:49:26.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:26 compute-1 nova_compute[182713]: 2026-01-22 00:49:26.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:49:26 compute-1 nova_compute[182713]: 2026-01-22 00:49:26.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:49:26 compute-1 nova_compute[182713]: 2026-01-22 00:49:26.878 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:49:31 compute-1 nova_compute[182713]: 2026-01-22 00:49:31.182 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:31 compute-1 nova_compute[182713]: 2026-01-22 00:49:31.512 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:36 compute-1 nova_compute[182713]: 2026-01-22 00:49:36.185 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:36 compute-1 nova_compute[182713]: 2026-01-22 00:49:36.514 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:37 compute-1 podman[246319]: 2026-01-22 00:49:37.573587573 +0000 UTC m=+0.060091084 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:49:37 compute-1 podman[246318]: 2026-01-22 00:49:37.599011411 +0000 UTC m=+0.093615923 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 00:49:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:49:40.047 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:49:40 compute-1 nova_compute[182713]: 2026-01-22 00:49:40.048 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:49:40.051 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:49:40 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:49:40.053 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:49:41 compute-1 nova_compute[182713]: 2026-01-22 00:49:41.189 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:41 compute-1 nova_compute[182713]: 2026-01-22 00:49:41.563 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:44 compute-1 podman[246371]: 2026-01-22 00:49:44.588537229 +0000 UTC m=+0.068521776 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:49:44 compute-1 podman[246370]: 2026-01-22 00:49:44.6331047 +0000 UTC m=+0.101653902 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 00:49:46 compute-1 nova_compute[182713]: 2026-01-22 00:49:46.192 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:46 compute-1 nova_compute[182713]: 2026-01-22 00:49:46.565 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:47 compute-1 sshd-session[246411]: Invalid user minima from 92.118.39.95 port 40042
Jan 22 00:49:48 compute-1 sshd-session[246411]: Connection closed by invalid user minima 92.118.39.95 port 40042 [preauth]
Jan 22 00:49:51 compute-1 nova_compute[182713]: 2026-01-22 00:49:51.196 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:51 compute-1 nova_compute[182713]: 2026-01-22 00:49:51.567 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:56 compute-1 nova_compute[182713]: 2026-01-22 00:49:56.200 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:56 compute-1 nova_compute[182713]: 2026-01-22 00:49:56.568 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:57 compute-1 podman[246413]: 2026-01-22 00:49:57.558677364 +0000 UTC m=+0.052196248 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:49:57 compute-1 podman[246414]: 2026-01-22 00:49:57.597321292 +0000 UTC m=+0.073553380 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Jan 22 00:50:01 compute-1 nova_compute[182713]: 2026-01-22 00:50:01.205 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:01 compute-1 nova_compute[182713]: 2026-01-22 00:50:01.572 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:50:03.067 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:50:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:50:03.067 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:50:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:50:03.067 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:50:06 compute-1 nova_compute[182713]: 2026-01-22 00:50:06.207 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:06 compute-1 nova_compute[182713]: 2026-01-22 00:50:06.572 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:08 compute-1 podman[246454]: 2026-01-22 00:50:08.59833391 +0000 UTC m=+0.073696255 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:50:08 compute-1 podman[246453]: 2026-01-22 00:50:08.641047924 +0000 UTC m=+0.120308410 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:50:09 compute-1 nova_compute[182713]: 2026-01-22 00:50:09.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:11 compute-1 nova_compute[182713]: 2026-01-22 00:50:11.209 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:11 compute-1 nova_compute[182713]: 2026-01-22 00:50:11.574 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:15 compute-1 podman[246505]: 2026-01-22 00:50:15.560334906 +0000 UTC m=+0.054930403 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 22 00:50:15 compute-1 podman[246506]: 2026-01-22 00:50:15.594231866 +0000 UTC m=+0.085631455 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:50:15 compute-1 nova_compute[182713]: 2026-01-22 00:50:15.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:16 compute-1 nova_compute[182713]: 2026-01-22 00:50:16.212 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:16 compute-1 nova_compute[182713]: 2026-01-22 00:50:16.574 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:18 compute-1 nova_compute[182713]: 2026-01-22 00:50:18.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:18 compute-1 nova_compute[182713]: 2026-01-22 00:50:18.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:18 compute-1 nova_compute[182713]: 2026-01-22 00:50:18.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:50:21 compute-1 nova_compute[182713]: 2026-01-22 00:50:21.215 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:21 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:50:21.244 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:50:21 compute-1 nova_compute[182713]: 2026-01-22 00:50:21.245 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:21 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:50:21.246 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:50:21 compute-1 nova_compute[182713]: 2026-01-22 00:50:21.576 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:22 compute-1 nova_compute[182713]: 2026-01-22 00:50:22.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:22 compute-1 nova_compute[182713]: 2026-01-22 00:50:22.873 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.901 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.901 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.901 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.901 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.901 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.901 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.901 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.901 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.901 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:50:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-1 nova_compute[182713]: 2026-01-22 00:50:23.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:23 compute-1 nova_compute[182713]: 2026-01-22 00:50:23.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:23 compute-1 nova_compute[182713]: 2026-01-22 00:50:23.890 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:50:23 compute-1 nova_compute[182713]: 2026-01-22 00:50:23.891 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:50:23 compute-1 nova_compute[182713]: 2026-01-22 00:50:23.891 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:50:23 compute-1 nova_compute[182713]: 2026-01-22 00:50:23.891 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.093 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.095 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5715MB free_disk=73.17724227905273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.095 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.096 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.429 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.429 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.520 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.705 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.706 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.723 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.794 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.818 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.839 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.842 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:50:24 compute-1 nova_compute[182713]: 2026-01-22 00:50:24.843 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:50:26 compute-1 nova_compute[182713]: 2026-01-22 00:50:26.219 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:26 compute-1 nova_compute[182713]: 2026-01-22 00:50:26.578 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:26 compute-1 nova_compute[182713]: 2026-01-22 00:50:26.843 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:26 compute-1 nova_compute[182713]: 2026-01-22 00:50:26.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:26 compute-1 nova_compute[182713]: 2026-01-22 00:50:26.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:50:26 compute-1 nova_compute[182713]: 2026-01-22 00:50:26.855 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:50:26 compute-1 nova_compute[182713]: 2026-01-22 00:50:26.878 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:50:27 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:50:27.249 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:50:28 compute-1 podman[246547]: 2026-01-22 00:50:28.601653086 +0000 UTC m=+0.089821254 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 22 00:50:28 compute-1 podman[246548]: 2026-01-22 00:50:28.623603776 +0000 UTC m=+0.096140970 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, version=9.6, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350)
Jan 22 00:50:31 compute-1 nova_compute[182713]: 2026-01-22 00:50:31.222 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:31 compute-1 nova_compute[182713]: 2026-01-22 00:50:31.581 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:36 compute-1 nova_compute[182713]: 2026-01-22 00:50:36.224 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:36 compute-1 nova_compute[182713]: 2026-01-22 00:50:36.581 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:39 compute-1 podman[246589]: 2026-01-22 00:50:39.580586941 +0000 UTC m=+0.062435276 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:50:39 compute-1 podman[246588]: 2026-01-22 00:50:39.613515401 +0000 UTC m=+0.095170900 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:50:41 compute-1 nova_compute[182713]: 2026-01-22 00:50:41.226 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:41 compute-1 nova_compute[182713]: 2026-01-22 00:50:41.582 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:46 compute-1 nova_compute[182713]: 2026-01-22 00:50:46.228 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:46 compute-1 podman[246638]: 2026-01-22 00:50:46.554922629 +0000 UTC m=+0.052144928 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:50:46 compute-1 nova_compute[182713]: 2026-01-22 00:50:46.585 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:46 compute-1 podman[246639]: 2026-01-22 00:50:46.598053805 +0000 UTC m=+0.082717614 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:50:51 compute-1 nova_compute[182713]: 2026-01-22 00:50:51.231 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:51 compute-1 nova_compute[182713]: 2026-01-22 00:50:51.587 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:56 compute-1 nova_compute[182713]: 2026-01-22 00:50:56.233 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:56 compute-1 nova_compute[182713]: 2026-01-22 00:50:56.589 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:59 compute-1 podman[246682]: 2026-01-22 00:50:59.592823713 +0000 UTC m=+0.080933109 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Jan 22 00:50:59 compute-1 podman[246681]: 2026-01-22 00:50:59.604610817 +0000 UTC m=+0.100769914 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:51:01 compute-1 nova_compute[182713]: 2026-01-22 00:51:01.236 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:01 compute-1 nova_compute[182713]: 2026-01-22 00:51:01.591 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:51:03.068 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:51:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:51:03.069 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:51:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:51:03.070 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:51:06 compute-1 nova_compute[182713]: 2026-01-22 00:51:06.239 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:06 compute-1 nova_compute[182713]: 2026-01-22 00:51:06.594 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:09 compute-1 nova_compute[182713]: 2026-01-22 00:51:09.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:10 compute-1 podman[246721]: 2026-01-22 00:51:10.574024277 +0000 UTC m=+0.061227338 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:51:10 compute-1 podman[246720]: 2026-01-22 00:51:10.626742921 +0000 UTC m=+0.112800106 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:51:11 compute-1 nova_compute[182713]: 2026-01-22 00:51:11.242 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:11 compute-1 nova_compute[182713]: 2026-01-22 00:51:11.595 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:15 compute-1 nova_compute[182713]: 2026-01-22 00:51:15.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:16 compute-1 nova_compute[182713]: 2026-01-22 00:51:16.245 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:16 compute-1 nova_compute[182713]: 2026-01-22 00:51:16.598 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:17 compute-1 podman[246771]: 2026-01-22 00:51:17.601023387 +0000 UTC m=+0.090064272 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:51:17 compute-1 podman[246772]: 2026-01-22 00:51:17.601279895 +0000 UTC m=+0.075706806 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:51:20 compute-1 nova_compute[182713]: 2026-01-22 00:51:20.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:20 compute-1 nova_compute[182713]: 2026-01-22 00:51:20.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:20 compute-1 nova_compute[182713]: 2026-01-22 00:51:20.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:51:21 compute-1 nova_compute[182713]: 2026-01-22 00:51:21.276 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:21 compute-1 nova_compute[182713]: 2026-01-22 00:51:21.600 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:23 compute-1 nova_compute[182713]: 2026-01-22 00:51:23.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:23 compute-1 nova_compute[182713]: 2026-01-22 00:51:23.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:23 compute-1 nova_compute[182713]: 2026-01-22 00:51:23.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:25 compute-1 sshd-session[246814]: Accepted publickey for zuul from 192.168.122.10 port 53054 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 22 00:51:25 compute-1 systemd-logind[796]: New session 59 of user zuul.
Jan 22 00:51:25 compute-1 systemd[1]: Started Session 59 of User zuul.
Jan 22 00:51:25 compute-1 sshd-session[246814]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 00:51:25 compute-1 sudo[246818]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 22 00:51:25 compute-1 sudo[246818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.343 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.346 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.347 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.347 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.558 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.560 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5712MB free_disk=73.17724227905273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.560 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.560 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.644 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.645 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.688 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.705 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.707 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:51:25 compute-1 nova_compute[182713]: 2026-01-22 00:51:25.707 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:51:26 compute-1 nova_compute[182713]: 2026-01-22 00:51:26.277 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:26 compute-1 nova_compute[182713]: 2026-01-22 00:51:26.601 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:28 compute-1 nova_compute[182713]: 2026-01-22 00:51:28.707 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:28 compute-1 nova_compute[182713]: 2026-01-22 00:51:28.708 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:51:28 compute-1 nova_compute[182713]: 2026-01-22 00:51:28.708 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:51:28 compute-1 nova_compute[182713]: 2026-01-22 00:51:28.735 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:51:28 compute-1 nova_compute[182713]: 2026-01-22 00:51:28.735 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:30 compute-1 ovs-vsctl[246990]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 22 00:51:30 compute-1 podman[247018]: 2026-01-22 00:51:30.598269083 +0000 UTC m=+0.073747537 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 00:51:30 compute-1 podman[247024]: 2026-01-22 00:51:30.618112997 +0000 UTC m=+0.093013303 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:51:30 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 246842 (sos)
Jan 22 00:51:30 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 22 00:51:30 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 22 00:51:31 compute-1 nova_compute[182713]: 2026-01-22 00:51:31.322 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:31 compute-1 virtqemud[182235]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 22 00:51:31 compute-1 virtqemud[182235]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 22 00:51:31 compute-1 virtqemud[182235]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 00:51:31 compute-1 nova_compute[182713]: 2026-01-22 00:51:31.602 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:32 compute-1 crontab[247443]: (root) LIST (root)
Jan 22 00:51:34 compute-1 systemd[1]: Starting Hostname Service...
Jan 22 00:51:34 compute-1 systemd[1]: Started Hostname Service.
Jan 22 00:51:36 compute-1 nova_compute[182713]: 2026-01-22 00:51:36.356 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:36 compute-1 nova_compute[182713]: 2026-01-22 00:51:36.604 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:41 compute-1 nova_compute[182713]: 2026-01-22 00:51:41.365 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:41 compute-1 podman[248344]: 2026-01-22 00:51:41.580886532 +0000 UTC m=+0.058639038 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:51:41 compute-1 nova_compute[182713]: 2026-01-22 00:51:41.607 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:41 compute-1 podman[248338]: 2026-01-22 00:51:41.646301468 +0000 UTC m=+0.123351023 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:51:43 compute-1 ovs-appctl[248825]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 00:51:43 compute-1 ovs-appctl[248831]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 00:51:43 compute-1 ovs-appctl[248838]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 00:51:46 compute-1 nova_compute[182713]: 2026-01-22 00:51:46.367 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:46 compute-1 nova_compute[182713]: 2026-01-22 00:51:46.609 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:48 compute-1 podman[249815]: 2026-01-22 00:51:48.479122372 +0000 UTC m=+0.080436094 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent)
Jan 22 00:51:48 compute-1 podman[249816]: 2026-01-22 00:51:48.485140818 +0000 UTC m=+0.088820903 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:51:50 compute-1 virtqemud[182235]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 00:51:51 compute-1 nova_compute[182713]: 2026-01-22 00:51:51.418 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:51 compute-1 nova_compute[182713]: 2026-01-22 00:51:51.610 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:52 compute-1 systemd[1]: Starting Time & Date Service...
Jan 22 00:51:52 compute-1 systemd[1]: Started Time & Date Service.
Jan 22 00:51:55 compute-1 sshd-session[250340]: Invalid user mina from 92.118.39.95 port 39018
Jan 22 00:51:55 compute-1 sshd-session[250340]: Connection closed by invalid user mina 92.118.39.95 port 39018 [preauth]
Jan 22 00:51:56 compute-1 nova_compute[182713]: 2026-01-22 00:51:56.453 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:56 compute-1 nova_compute[182713]: 2026-01-22 00:51:56.614 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:01 compute-1 nova_compute[182713]: 2026-01-22 00:52:01.458 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:01 compute-1 podman[250344]: 2026-01-22 00:52:01.597738487 +0000 UTC m=+0.085111049 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Jan 22 00:52:01 compute-1 nova_compute[182713]: 2026-01-22 00:52:01.617 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:01 compute-1 podman[250345]: 2026-01-22 00:52:01.627145488 +0000 UTC m=+0.115995885 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:52:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:52:03.070 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:52:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:52:03.071 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:52:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:52:03.071 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:52:06 compute-1 nova_compute[182713]: 2026-01-22 00:52:06.493 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:06 compute-1 nova_compute[182713]: 2026-01-22 00:52:06.618 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:11 compute-1 nova_compute[182713]: 2026-01-22 00:52:11.521 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:11 compute-1 nova_compute[182713]: 2026-01-22 00:52:11.620 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:11 compute-1 nova_compute[182713]: 2026-01-22 00:52:11.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:12 compute-1 podman[250386]: 2026-01-22 00:52:12.434457896 +0000 UTC m=+0.085900874 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:52:12 compute-1 podman[250385]: 2026-01-22 00:52:12.47850417 +0000 UTC m=+0.140398041 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, tcib_managed=true)
Jan 22 00:52:16 compute-1 nova_compute[182713]: 2026-01-22 00:52:16.572 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:16 compute-1 nova_compute[182713]: 2026-01-22 00:52:16.622 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:16 compute-1 nova_compute[182713]: 2026-01-22 00:52:16.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:17 compute-1 sudo[246818]: pam_unix(sudo:session): session closed for user root
Jan 22 00:52:17 compute-1 sshd-session[246817]: Received disconnect from 192.168.122.10 port 53054:11: disconnected by user
Jan 22 00:52:17 compute-1 sshd-session[246817]: Disconnected from user zuul 192.168.122.10 port 53054
Jan 22 00:52:17 compute-1 sshd-session[246814]: pam_unix(sshd:session): session closed for user zuul
Jan 22 00:52:17 compute-1 systemd-logind[796]: Session 59 logged out. Waiting for processes to exit.
Jan 22 00:52:17 compute-1 systemd[1]: session-59.scope: Deactivated successfully.
Jan 22 00:52:17 compute-1 systemd[1]: session-59.scope: Consumed 1min 27.791s CPU time, 656.2M memory peak, read 160.5M from disk, written 16.7M to disk.
Jan 22 00:52:17 compute-1 systemd-logind[796]: Removed session 59.
Jan 22 00:52:18 compute-1 sshd-session[250436]: Accepted publickey for zuul from 192.168.122.10 port 51952 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 22 00:52:18 compute-1 systemd-logind[796]: New session 60 of user zuul.
Jan 22 00:52:18 compute-1 systemd[1]: Started Session 60 of User zuul.
Jan 22 00:52:18 compute-1 sshd-session[250436]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 00:52:18 compute-1 sudo[250440]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-1-2026-01-22-vujoisi.tar.xz
Jan 22 00:52:18 compute-1 sudo[250440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 00:52:18 compute-1 sudo[250440]: pam_unix(sudo:session): session closed for user root
Jan 22 00:52:18 compute-1 sshd-session[250439]: Received disconnect from 192.168.122.10 port 51952:11: disconnected by user
Jan 22 00:52:18 compute-1 sshd-session[250439]: Disconnected from user zuul 192.168.122.10 port 51952
Jan 22 00:52:18 compute-1 sshd-session[250436]: pam_unix(sshd:session): session closed for user zuul
Jan 22 00:52:18 compute-1 systemd[1]: session-60.scope: Deactivated successfully.
Jan 22 00:52:18 compute-1 systemd-logind[796]: Session 60 logged out. Waiting for processes to exit.
Jan 22 00:52:18 compute-1 systemd-logind[796]: Removed session 60.
Jan 22 00:52:18 compute-1 sshd-session[250465]: Accepted publickey for zuul from 192.168.122.10 port 51960 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 22 00:52:18 compute-1 systemd-logind[796]: New session 61 of user zuul.
Jan 22 00:52:18 compute-1 systemd[1]: Started Session 61 of User zuul.
Jan 22 00:52:18 compute-1 sshd-session[250465]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 00:52:18 compute-1 podman[250467]: 2026-01-22 00:52:18.674911223 +0000 UTC m=+0.074532151 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:52:18 compute-1 podman[250469]: 2026-01-22 00:52:18.678136763 +0000 UTC m=+0.077528954 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:52:18 compute-1 sudo[250493]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Jan 22 00:52:18 compute-1 sudo[250493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 00:52:18 compute-1 sudo[250493]: pam_unix(sudo:session): session closed for user root
Jan 22 00:52:18 compute-1 sshd-session[250470]: Received disconnect from 192.168.122.10 port 51960:11: disconnected by user
Jan 22 00:52:18 compute-1 sshd-session[250470]: Disconnected from user zuul 192.168.122.10 port 51960
Jan 22 00:52:18 compute-1 sshd-session[250465]: pam_unix(sshd:session): session closed for user zuul
Jan 22 00:52:18 compute-1 systemd[1]: session-61.scope: Deactivated successfully.
Jan 22 00:52:18 compute-1 systemd-logind[796]: Session 61 logged out. Waiting for processes to exit.
Jan 22 00:52:18 compute-1 systemd-logind[796]: Removed session 61.
Jan 22 00:52:20 compute-1 nova_compute[182713]: 2026-01-22 00:52:20.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:21 compute-1 nova_compute[182713]: 2026-01-22 00:52:21.575 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:21 compute-1 nova_compute[182713]: 2026-01-22 00:52:21.623 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:21 compute-1 nova_compute[182713]: 2026-01-22 00:52:21.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:21 compute-1 nova_compute[182713]: 2026-01-22 00:52:21.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:52:22 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 00:52:22 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:52:22.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-1 nova_compute[182713]: 2026-01-22 00:52:23.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:23 compute-1 nova_compute[182713]: 2026-01-22 00:52:23.922 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:52:23 compute-1 nova_compute[182713]: 2026-01-22 00:52:23.923 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:52:23 compute-1 nova_compute[182713]: 2026-01-22 00:52:23.923 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:52:23 compute-1 nova_compute[182713]: 2026-01-22 00:52:23.923 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:52:24 compute-1 nova_compute[182713]: 2026-01-22 00:52:24.127 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:52:24 compute-1 nova_compute[182713]: 2026-01-22 00:52:24.128 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5601MB free_disk=73.1769905090332GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:52:24 compute-1 nova_compute[182713]: 2026-01-22 00:52:24.128 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:52:24 compute-1 nova_compute[182713]: 2026-01-22 00:52:24.128 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:52:24 compute-1 nova_compute[182713]: 2026-01-22 00:52:24.194 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:52:24 compute-1 nova_compute[182713]: 2026-01-22 00:52:24.194 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:52:24 compute-1 nova_compute[182713]: 2026-01-22 00:52:24.212 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:52:24 compute-1 nova_compute[182713]: 2026-01-22 00:52:24.229 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:52:24 compute-1 nova_compute[182713]: 2026-01-22 00:52:24.232 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:52:24 compute-1 nova_compute[182713]: 2026-01-22 00:52:24.232 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:52:25 compute-1 nova_compute[182713]: 2026-01-22 00:52:25.233 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:25 compute-1 nova_compute[182713]: 2026-01-22 00:52:25.234 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:25 compute-1 nova_compute[182713]: 2026-01-22 00:52:25.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:26 compute-1 nova_compute[182713]: 2026-01-22 00:52:26.604 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:26 compute-1 nova_compute[182713]: 2026-01-22 00:52:26.625 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:27 compute-1 nova_compute[182713]: 2026-01-22 00:52:27.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:29 compute-1 nova_compute[182713]: 2026-01-22 00:52:29.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:29 compute-1 nova_compute[182713]: 2026-01-22 00:52:29.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:52:29 compute-1 nova_compute[182713]: 2026-01-22 00:52:29.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:52:29 compute-1 nova_compute[182713]: 2026-01-22 00:52:29.888 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:52:31 compute-1 nova_compute[182713]: 2026-01-22 00:52:31.607 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:31 compute-1 nova_compute[182713]: 2026-01-22 00:52:31.627 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:32 compute-1 podman[250541]: 2026-01-22 00:52:32.602403973 +0000 UTC m=+0.074307944 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Jan 22 00:52:32 compute-1 podman[250540]: 2026-01-22 00:52:32.609653488 +0000 UTC m=+0.080989682 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Jan 22 00:52:36 compute-1 nova_compute[182713]: 2026-01-22 00:52:36.628 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:52:36 compute-1 nova_compute[182713]: 2026-01-22 00:52:36.630 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:52:36 compute-1 nova_compute[182713]: 2026-01-22 00:52:36.631 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:52:36 compute-1 nova_compute[182713]: 2026-01-22 00:52:36.631 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:52:36 compute-1 nova_compute[182713]: 2026-01-22 00:52:36.658 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:36 compute-1 nova_compute[182713]: 2026-01-22 00:52:36.659 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:52:41 compute-1 nova_compute[182713]: 2026-01-22 00:52:41.660 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:52:41 compute-1 nova_compute[182713]: 2026-01-22 00:52:41.662 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:52:41 compute-1 nova_compute[182713]: 2026-01-22 00:52:41.663 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:52:41 compute-1 nova_compute[182713]: 2026-01-22 00:52:41.663 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:52:41 compute-1 nova_compute[182713]: 2026-01-22 00:52:41.701 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:41 compute-1 nova_compute[182713]: 2026-01-22 00:52:41.702 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:52:42 compute-1 podman[250581]: 2026-01-22 00:52:42.576534813 +0000 UTC m=+0.063087805 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:52:42 compute-1 podman[250582]: 2026-01-22 00:52:42.652601581 +0000 UTC m=+0.127565485 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 00:52:46 compute-1 nova_compute[182713]: 2026-01-22 00:52:46.703 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:49 compute-1 podman[250632]: 2026-01-22 00:52:49.610502429 +0000 UTC m=+0.092791286 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 00:52:49 compute-1 podman[250633]: 2026-01-22 00:52:49.639705014 +0000 UTC m=+0.114484419 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:52:51 compute-1 nova_compute[182713]: 2026-01-22 00:52:51.704 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:56 compute-1 nova_compute[182713]: 2026-01-22 00:52:56.706 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:01 compute-1 nova_compute[182713]: 2026-01-22 00:53:01.709 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:53:03.071 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:53:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:53:03.072 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:53:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:53:03.072 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:53:03 compute-1 podman[250674]: 2026-01-22 00:53:03.61320626 +0000 UTC m=+0.088845024 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container)
Jan 22 00:53:03 compute-1 podman[250673]: 2026-01-22 00:53:03.616975367 +0000 UTC m=+0.091677313 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:53:06 compute-1 nova_compute[182713]: 2026-01-22 00:53:06.711 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:11 compute-1 nova_compute[182713]: 2026-01-22 00:53:11.716 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:11 compute-1 nova_compute[182713]: 2026-01-22 00:53:11.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:13 compute-1 podman[250712]: 2026-01-22 00:53:13.627028 +0000 UTC m=+0.093965103 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:53:13 compute-1 podman[250711]: 2026-01-22 00:53:13.67963075 +0000 UTC m=+0.149625617 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 22 00:53:16 compute-1 nova_compute[182713]: 2026-01-22 00:53:16.719 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:53:16 compute-1 nova_compute[182713]: 2026-01-22 00:53:16.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:20 compute-1 podman[250762]: 2026-01-22 00:53:20.579210221 +0000 UTC m=+0.062275941 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:53:20 compute-1 podman[250761]: 2026-01-22 00:53:20.594224126 +0000 UTC m=+0.079253287 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:53:21 compute-1 nova_compute[182713]: 2026-01-22 00:53:21.722 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:53:21 compute-1 nova_compute[182713]: 2026-01-22 00:53:21.724 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:21 compute-1 nova_compute[182713]: 2026-01-22 00:53:21.724 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:53:21 compute-1 nova_compute[182713]: 2026-01-22 00:53:21.724 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:53:21 compute-1 nova_compute[182713]: 2026-01-22 00:53:21.725 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:53:21 compute-1 nova_compute[182713]: 2026-01-22 00:53:21.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:23 compute-1 nova_compute[182713]: 2026-01-22 00:53:23.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:23 compute-1 nova_compute[182713]: 2026-01-22 00:53:23.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:53:23 compute-1 nova_compute[182713]: 2026-01-22 00:53:23.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:23 compute-1 nova_compute[182713]: 2026-01-22 00:53:23.939 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:53:23 compute-1 nova_compute[182713]: 2026-01-22 00:53:23.940 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:53:23 compute-1 nova_compute[182713]: 2026-01-22 00:53:23.940 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:53:23 compute-1 nova_compute[182713]: 2026-01-22 00:53:23.941 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.169 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.171 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5663MB free_disk=73.1769905090332GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.171 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.172 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.312 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.313 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.377 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.404 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.407 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.408 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.409 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:24 compute-1 nova_compute[182713]: 2026-01-22 00:53:24.409 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:53:26 compute-1 nova_compute[182713]: 2026-01-22 00:53:26.428 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:26 compute-1 nova_compute[182713]: 2026-01-22 00:53:26.727 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:53:26 compute-1 nova_compute[182713]: 2026-01-22 00:53:26.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:28 compute-1 nova_compute[182713]: 2026-01-22 00:53:28.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:28 compute-1 nova_compute[182713]: 2026-01-22 00:53:28.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:53:28 compute-1 nova_compute[182713]: 2026-01-22 00:53:28.881 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:53:29 compute-1 nova_compute[182713]: 2026-01-22 00:53:29.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:29 compute-1 nova_compute[182713]: 2026-01-22 00:53:29.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:30 compute-1 nova_compute[182713]: 2026-01-22 00:53:30.899 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:30 compute-1 nova_compute[182713]: 2026-01-22 00:53:30.903 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:53:30 compute-1 nova_compute[182713]: 2026-01-22 00:53:30.904 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:53:30 compute-1 nova_compute[182713]: 2026-01-22 00:53:30.925 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:53:31 compute-1 nova_compute[182713]: 2026-01-22 00:53:31.730 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:53:34 compute-1 podman[250802]: 2026-01-22 00:53:34.58377342 +0000 UTC m=+0.069841896 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:53:34 compute-1 podman[250803]: 2026-01-22 00:53:34.603973176 +0000 UTC m=+0.071181817 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:53:36 compute-1 nova_compute[182713]: 2026-01-22 00:53:36.734 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:53:36 compute-1 nova_compute[182713]: 2026-01-22 00:53:36.737 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:53:36 compute-1 nova_compute[182713]: 2026-01-22 00:53:36.737 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:53:36 compute-1 nova_compute[182713]: 2026-01-22 00:53:36.738 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:53:36 compute-1 nova_compute[182713]: 2026-01-22 00:53:36.758 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:36 compute-1 nova_compute[182713]: 2026-01-22 00:53:36.759 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:53:40 compute-1 sshd-session[250843]: Received disconnect from 38.67.240.124 port 48474:11:  [preauth]
Jan 22 00:53:40 compute-1 sshd-session[250843]: Disconnected from authenticating user root 38.67.240.124 port 48474 [preauth]
Jan 22 00:53:41 compute-1 nova_compute[182713]: 2026-01-22 00:53:41.760 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:53:41 compute-1 nova_compute[182713]: 2026-01-22 00:53:41.763 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:53:41 compute-1 nova_compute[182713]: 2026-01-22 00:53:41.764 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:53:41 compute-1 nova_compute[182713]: 2026-01-22 00:53:41.764 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:53:41 compute-1 nova_compute[182713]: 2026-01-22 00:53:41.804 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:41 compute-1 nova_compute[182713]: 2026-01-22 00:53:41.806 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:53:44 compute-1 podman[250846]: 2026-01-22 00:53:44.599160179 +0000 UTC m=+0.078553576 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:53:44 compute-1 podman[250845]: 2026-01-22 00:53:44.64340972 +0000 UTC m=+0.112600701 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:53:46 compute-1 nova_compute[182713]: 2026-01-22 00:53:46.806 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:51 compute-1 podman[250894]: 2026-01-22 00:53:51.612841965 +0000 UTC m=+0.084351964 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:53:51 compute-1 podman[250895]: 2026-01-22 00:53:51.626609753 +0000 UTC m=+0.085942965 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:53:51 compute-1 nova_compute[182713]: 2026-01-22 00:53:51.809 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:53:56 compute-1 nova_compute[182713]: 2026-01-22 00:53:56.811 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:58 compute-1 sshd-session[250938]: Invalid user ethereum from 92.118.39.95 port 38002
Jan 22 00:53:58 compute-1 sshd-session[250938]: Connection closed by invalid user ethereum 92.118.39.95 port 38002 [preauth]
Jan 22 00:54:01 compute-1 nova_compute[182713]: 2026-01-22 00:54:01.813 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:54:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:54:03.072 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:54:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:54:03.074 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:54:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:54:03.074 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:54:05 compute-1 podman[250941]: 2026-01-22 00:54:05.582050608 +0000 UTC m=+0.067580016 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:54:05 compute-1 podman[250940]: 2026-01-22 00:54:05.600068326 +0000 UTC m=+0.085111748 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, managed_by=edpm_ansible)
Jan 22 00:54:06 compute-1 nova_compute[182713]: 2026-01-22 00:54:06.816 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:54:11 compute-1 nova_compute[182713]: 2026-01-22 00:54:11.819 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:11 compute-1 nova_compute[182713]: 2026-01-22 00:54:11.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:15 compute-1 podman[250982]: 2026-01-22 00:54:15.608342444 +0000 UTC m=+0.086126869 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:54:15 compute-1 podman[250981]: 2026-01-22 00:54:15.633545565 +0000 UTC m=+0.116138619 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:54:16 compute-1 nova_compute[182713]: 2026-01-22 00:54:16.821 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:17 compute-1 nova_compute[182713]: 2026-01-22 00:54:17.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:21 compute-1 nova_compute[182713]: 2026-01-22 00:54:21.824 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:22 compute-1 podman[251032]: 2026-01-22 00:54:22.608522444 +0000 UTC m=+0.089323109 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:54:22 compute-1 podman[251033]: 2026-01-22 00:54:22.642341142 +0000 UTC m=+0.117522973 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:54:22 compute-1 nova_compute[182713]: 2026-01-22 00:54:22.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:54:22.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:24 compute-1 nova_compute[182713]: 2026-01-22 00:54:24.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:24 compute-1 nova_compute[182713]: 2026-01-22 00:54:24.905 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:54:24 compute-1 nova_compute[182713]: 2026-01-22 00:54:24.907 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:54:24 compute-1 nova_compute[182713]: 2026-01-22 00:54:24.907 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:54:24 compute-1 nova_compute[182713]: 2026-01-22 00:54:24.908 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:54:25 compute-1 nova_compute[182713]: 2026-01-22 00:54:25.135 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:54:25 compute-1 nova_compute[182713]: 2026-01-22 00:54:25.137 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5690MB free_disk=73.17702102661133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:54:25 compute-1 nova_compute[182713]: 2026-01-22 00:54:25.137 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:54:25 compute-1 nova_compute[182713]: 2026-01-22 00:54:25.138 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:54:25 compute-1 nova_compute[182713]: 2026-01-22 00:54:25.211 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:54:25 compute-1 nova_compute[182713]: 2026-01-22 00:54:25.212 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:54:25 compute-1 nova_compute[182713]: 2026-01-22 00:54:25.267 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:54:25 compute-1 nova_compute[182713]: 2026-01-22 00:54:25.301 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:54:25 compute-1 nova_compute[182713]: 2026-01-22 00:54:25.304 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:54:25 compute-1 nova_compute[182713]: 2026-01-22 00:54:25.304 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:54:26 compute-1 nova_compute[182713]: 2026-01-22 00:54:26.305 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:26 compute-1 nova_compute[182713]: 2026-01-22 00:54:26.306 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:54:26 compute-1 nova_compute[182713]: 2026-01-22 00:54:26.827 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:54:26 compute-1 nova_compute[182713]: 2026-01-22 00:54:26.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:27 compute-1 nova_compute[182713]: 2026-01-22 00:54:27.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:30 compute-1 nova_compute[182713]: 2026-01-22 00:54:30.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:31 compute-1 nova_compute[182713]: 2026-01-22 00:54:31.830 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:54:31 compute-1 nova_compute[182713]: 2026-01-22 00:54:31.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:32 compute-1 nova_compute[182713]: 2026-01-22 00:54:32.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:32 compute-1 nova_compute[182713]: 2026-01-22 00:54:32.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:54:32 compute-1 nova_compute[182713]: 2026-01-22 00:54:32.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:54:32 compute-1 nova_compute[182713]: 2026-01-22 00:54:32.924 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:54:36 compute-1 podman[251076]: 2026-01-22 00:54:36.574531268 +0000 UTC m=+0.063969074 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:54:36 compute-1 podman[251075]: 2026-01-22 00:54:36.598295594 +0000 UTC m=+0.074392907 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 00:54:36 compute-1 nova_compute[182713]: 2026-01-22 00:54:36.834 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:54:36 compute-1 nova_compute[182713]: 2026-01-22 00:54:36.836 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:54:36 compute-1 nova_compute[182713]: 2026-01-22 00:54:36.836 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:54:36 compute-1 nova_compute[182713]: 2026-01-22 00:54:36.836 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:54:36 compute-1 nova_compute[182713]: 2026-01-22 00:54:36.870 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:36 compute-1 nova_compute[182713]: 2026-01-22 00:54:36.871 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:54:41 compute-1 nova_compute[182713]: 2026-01-22 00:54:41.871 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:46 compute-1 podman[251116]: 2026-01-22 00:54:46.589322786 +0000 UTC m=+0.065007143 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:54:46 compute-1 podman[251115]: 2026-01-22 00:54:46.607784317 +0000 UTC m=+0.093949569 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:54:46 compute-1 nova_compute[182713]: 2026-01-22 00:54:46.873 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:51 compute-1 nova_compute[182713]: 2026-01-22 00:54:51.875 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:51 compute-1 nova_compute[182713]: 2026-01-22 00:54:51.878 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:53 compute-1 podman[251165]: 2026-01-22 00:54:53.563626988 +0000 UTC m=+0.050787373 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:54:53 compute-1 podman[251166]: 2026-01-22 00:54:53.578043985 +0000 UTC m=+0.054769377 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:54:56 compute-1 nova_compute[182713]: 2026-01-22 00:54:56.880 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:54:56 compute-1 nova_compute[182713]: 2026-01-22 00:54:56.882 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:54:56 compute-1 nova_compute[182713]: 2026-01-22 00:54:56.882 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:54:56 compute-1 nova_compute[182713]: 2026-01-22 00:54:56.882 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:54:56 compute-1 nova_compute[182713]: 2026-01-22 00:54:56.912 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:56 compute-1 nova_compute[182713]: 2026-01-22 00:54:56.913 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:55:01 compute-1 nova_compute[182713]: 2026-01-22 00:55:01.913 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:55:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:55:03.075 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:55:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:55:03.075 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:55:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:55:03.075 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:55:06 compute-1 nova_compute[182713]: 2026-01-22 00:55:06.916 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:55:06 compute-1 nova_compute[182713]: 2026-01-22 00:55:06.918 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:55:06 compute-1 nova_compute[182713]: 2026-01-22 00:55:06.919 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:55:06 compute-1 nova_compute[182713]: 2026-01-22 00:55:06.919 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:55:06 compute-1 nova_compute[182713]: 2026-01-22 00:55:06.943 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:06 compute-1 nova_compute[182713]: 2026-01-22 00:55:06.944 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:55:07 compute-1 podman[251207]: 2026-01-22 00:55:07.581729686 +0000 UTC m=+0.077963995 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 00:55:07 compute-1 podman[251208]: 2026-01-22 00:55:07.588593537 +0000 UTC m=+0.075502777 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal)
Jan 22 00:55:11 compute-1 nova_compute[182713]: 2026-01-22 00:55:11.945 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:55:11 compute-1 nova_compute[182713]: 2026-01-22 00:55:11.948 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:55:11 compute-1 nova_compute[182713]: 2026-01-22 00:55:11.948 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:55:11 compute-1 nova_compute[182713]: 2026-01-22 00:55:11.949 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:55:11 compute-1 nova_compute[182713]: 2026-01-22 00:55:11.982 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:11 compute-1 nova_compute[182713]: 2026-01-22 00:55:11.983 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:55:12 compute-1 nova_compute[182713]: 2026-01-22 00:55:12.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:16 compute-1 nova_compute[182713]: 2026-01-22 00:55:16.984 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:17 compute-1 podman[251249]: 2026-01-22 00:55:17.619990494 +0000 UTC m=+0.113657448 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:55:17 compute-1 podman[251250]: 2026-01-22 00:55:17.621161241 +0000 UTC m=+0.100460882 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:55:19 compute-1 nova_compute[182713]: 2026-01-22 00:55:19.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:21 compute-1 nova_compute[182713]: 2026-01-22 00:55:21.986 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:24 compute-1 podman[251302]: 2026-01-22 00:55:24.577174797 +0000 UTC m=+0.059357959 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:55:24 compute-1 podman[251301]: 2026-01-22 00:55:24.579902951 +0000 UTC m=+0.067668136 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 00:55:24 compute-1 nova_compute[182713]: 2026-01-22 00:55:24.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:25 compute-1 nova_compute[182713]: 2026-01-22 00:55:25.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:25 compute-1 nova_compute[182713]: 2026-01-22 00:55:25.895 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:55:25 compute-1 nova_compute[182713]: 2026-01-22 00:55:25.896 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:55:25 compute-1 nova_compute[182713]: 2026-01-22 00:55:25.896 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:55:25 compute-1 nova_compute[182713]: 2026-01-22 00:55:25.897 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.101 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.103 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5689MB free_disk=73.17708587646484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.103 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.104 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.359 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.360 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.648 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.882 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.883 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.906 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.930 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.959 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.975 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.976 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.977 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.873s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:55:26 compute-1 nova_compute[182713]: 2026-01-22 00:55:26.989 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:28 compute-1 nova_compute[182713]: 2026-01-22 00:55:28.978 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:28 compute-1 nova_compute[182713]: 2026-01-22 00:55:28.980 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:28 compute-1 nova_compute[182713]: 2026-01-22 00:55:28.980 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:28 compute-1 nova_compute[182713]: 2026-01-22 00:55:28.981 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:55:31 compute-1 nova_compute[182713]: 2026-01-22 00:55:31.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:31 compute-1 nova_compute[182713]: 2026-01-22 00:55:31.991 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:32 compute-1 nova_compute[182713]: 2026-01-22 00:55:32.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:32 compute-1 nova_compute[182713]: 2026-01-22 00:55:32.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:55:32 compute-1 nova_compute[182713]: 2026-01-22 00:55:32.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:55:32 compute-1 nova_compute[182713]: 2026-01-22 00:55:32.883 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:55:36 compute-1 nova_compute[182713]: 2026-01-22 00:55:36.994 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:38 compute-1 podman[251344]: 2026-01-22 00:55:38.596419586 +0000 UTC m=+0.086640943 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 00:55:38 compute-1 podman[251345]: 2026-01-22 00:55:38.645464015 +0000 UTC m=+0.123088261 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9)
Jan 22 00:55:41 compute-1 nova_compute[182713]: 2026-01-22 00:55:41.994 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:41 compute-1 nova_compute[182713]: 2026-01-22 00:55:41.996 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:46 compute-1 nova_compute[182713]: 2026-01-22 00:55:46.994 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:46 compute-1 nova_compute[182713]: 2026-01-22 00:55:46.997 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:48 compute-1 podman[251384]: 2026-01-22 00:55:48.577371013 +0000 UTC m=+0.066599352 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:55:48 compute-1 podman[251383]: 2026-01-22 00:55:48.606663501 +0000 UTC m=+0.103045152 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:55:51 compute-1 nova_compute[182713]: 2026-01-22 00:55:51.996 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:55 compute-1 podman[251436]: 2026-01-22 00:55:55.580331112 +0000 UTC m=+0.065620133 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:55:55 compute-1 podman[251437]: 2026-01-22 00:55:55.612723804 +0000 UTC m=+0.082428883 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:55:56 compute-1 nova_compute[182713]: 2026-01-22 00:55:56.998 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:55:57 compute-1 nova_compute[182713]: 2026-01-22 00:55:57.000 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:55:57 compute-1 nova_compute[182713]: 2026-01-22 00:55:57.000 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:55:57 compute-1 nova_compute[182713]: 2026-01-22 00:55:57.000 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:55:57 compute-1 nova_compute[182713]: 2026-01-22 00:55:57.027 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:57 compute-1 nova_compute[182713]: 2026-01-22 00:55:57.028 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:02 compute-1 nova_compute[182713]: 2026-01-22 00:56:02.029 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:02 compute-1 nova_compute[182713]: 2026-01-22 00:56:02.030 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:02 compute-1 nova_compute[182713]: 2026-01-22 00:56:02.031 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:56:02 compute-1 nova_compute[182713]: 2026-01-22 00:56:02.031 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:02 compute-1 nova_compute[182713]: 2026-01-22 00:56:02.031 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:56:03.076 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:56:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:56:03.077 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:56:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:56:03.077 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:56:05 compute-1 sshd-session[251477]: Invalid user eth from 92.118.39.95 port 36968
Jan 22 00:56:05 compute-1 sshd-session[251477]: Connection closed by invalid user eth 92.118.39.95 port 36968 [preauth]
Jan 22 00:56:07 compute-1 nova_compute[182713]: 2026-01-22 00:56:07.033 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:07 compute-1 nova_compute[182713]: 2026-01-22 00:56:07.035 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:07 compute-1 nova_compute[182713]: 2026-01-22 00:56:07.035 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:56:07 compute-1 nova_compute[182713]: 2026-01-22 00:56:07.036 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:07 compute-1 nova_compute[182713]: 2026-01-22 00:56:07.075 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:07 compute-1 nova_compute[182713]: 2026-01-22 00:56:07.075 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:09 compute-1 podman[251480]: 2026-01-22 00:56:09.590238344 +0000 UTC m=+0.081642718 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:56:09 compute-1 podman[251479]: 2026-01-22 00:56:09.593988531 +0000 UTC m=+0.083231247 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:56:12 compute-1 nova_compute[182713]: 2026-01-22 00:56:12.076 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:12 compute-1 nova_compute[182713]: 2026-01-22 00:56:12.079 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:12 compute-1 nova_compute[182713]: 2026-01-22 00:56:12.079 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:56:12 compute-1 nova_compute[182713]: 2026-01-22 00:56:12.079 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:12 compute-1 nova_compute[182713]: 2026-01-22 00:56:12.103 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:12 compute-1 nova_compute[182713]: 2026-01-22 00:56:12.103 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:13 compute-1 nova_compute[182713]: 2026-01-22 00:56:13.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:17 compute-1 nova_compute[182713]: 2026-01-22 00:56:17.104 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:17 compute-1 nova_compute[182713]: 2026-01-22 00:56:17.106 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:17 compute-1 nova_compute[182713]: 2026-01-22 00:56:17.106 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:56:17 compute-1 nova_compute[182713]: 2026-01-22 00:56:17.106 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:17 compute-1 nova_compute[182713]: 2026-01-22 00:56:17.146 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:17 compute-1 nova_compute[182713]: 2026-01-22 00:56:17.146 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:19 compute-1 podman[251520]: 2026-01-22 00:56:19.605476951 +0000 UTC m=+0.087597703 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:56:19 compute-1 podman[251519]: 2026-01-22 00:56:19.649922936 +0000 UTC m=+0.136418403 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:56:20 compute-1 nova_compute[182713]: 2026-01-22 00:56:20.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:22 compute-1 nova_compute[182713]: 2026-01-22 00:56:22.148 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:22 compute-1 nova_compute[182713]: 2026-01-22 00:56:22.150 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:22 compute-1 nova_compute[182713]: 2026-01-22 00:56:22.150 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:56:22 compute-1 nova_compute[182713]: 2026-01-22 00:56:22.150 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:22 compute-1 nova_compute[182713]: 2026-01-22 00:56:22.179 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:22 compute-1 nova_compute[182713]: 2026-01-22 00:56:22.179 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:56:22.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:24 compute-1 nova_compute[182713]: 2026-01-22 00:56:24.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:26 compute-1 podman[251566]: 2026-01-22 00:56:26.582530138 +0000 UTC m=+0.057799390 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:56:26 compute-1 podman[251565]: 2026-01-22 00:56:26.606986505 +0000 UTC m=+0.086891041 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 22 00:56:26 compute-1 nova_compute[182713]: 2026-01-22 00:56:26.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:26 compute-1 nova_compute[182713]: 2026-01-22 00:56:26.896 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:56:26 compute-1 nova_compute[182713]: 2026-01-22 00:56:26.896 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:56:26 compute-1 nova_compute[182713]: 2026-01-22 00:56:26.896 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:56:26 compute-1 nova_compute[182713]: 2026-01-22 00:56:26.896 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.060 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.061 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5693MB free_disk=73.17708587646484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.061 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.061 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.180 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.182 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.182 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.182 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.221 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.221 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.230 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.231 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.253 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.272 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.274 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:56:27 compute-1 nova_compute[182713]: 2026-01-22 00:56:27.275 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:56:28 compute-1 nova_compute[182713]: 2026-01-22 00:56:28.274 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:28 compute-1 nova_compute[182713]: 2026-01-22 00:56:28.275 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:56:29 compute-1 nova_compute[182713]: 2026-01-22 00:56:29.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:29 compute-1 nova_compute[182713]: 2026-01-22 00:56:29.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:30 compute-1 nova_compute[182713]: 2026-01-22 00:56:30.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:32 compute-1 nova_compute[182713]: 2026-01-22 00:56:32.232 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:32 compute-1 nova_compute[182713]: 2026-01-22 00:56:32.235 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:32 compute-1 nova_compute[182713]: 2026-01-22 00:56:32.235 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:56:32 compute-1 nova_compute[182713]: 2026-01-22 00:56:32.235 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:32 compute-1 nova_compute[182713]: 2026-01-22 00:56:32.272 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:32 compute-1 nova_compute[182713]: 2026-01-22 00:56:32.273 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:32 compute-1 nova_compute[182713]: 2026-01-22 00:56:32.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:32 compute-1 nova_compute[182713]: 2026-01-22 00:56:32.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:56:32 compute-1 nova_compute[182713]: 2026-01-22 00:56:32.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:56:33 compute-1 nova_compute[182713]: 2026-01-22 00:56:33.394 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:56:33 compute-1 nova_compute[182713]: 2026-01-22 00:56:33.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:37 compute-1 nova_compute[182713]: 2026-01-22 00:56:37.273 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:37 compute-1 nova_compute[182713]: 2026-01-22 00:56:37.276 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:37 compute-1 nova_compute[182713]: 2026-01-22 00:56:37.276 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:56:37 compute-1 nova_compute[182713]: 2026-01-22 00:56:37.277 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:37 compute-1 nova_compute[182713]: 2026-01-22 00:56:37.318 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:37 compute-1 nova_compute[182713]: 2026-01-22 00:56:37.320 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:40 compute-1 podman[251608]: 2026-01-22 00:56:40.592083076 +0000 UTC m=+0.076391655 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Jan 22 00:56:40 compute-1 podman[251609]: 2026-01-22 00:56:40.607921547 +0000 UTC m=+0.092295788 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:56:42 compute-1 nova_compute[182713]: 2026-01-22 00:56:42.319 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:42 compute-1 nova_compute[182713]: 2026-01-22 00:56:42.321 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:47 compute-1 nova_compute[182713]: 2026-01-22 00:56:47.320 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:47 compute-1 nova_compute[182713]: 2026-01-22 00:56:47.322 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:47 compute-1 nova_compute[182713]: 2026-01-22 00:56:47.322 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:56:47 compute-1 nova_compute[182713]: 2026-01-22 00:56:47.322 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:47 compute-1 nova_compute[182713]: 2026-01-22 00:56:47.353 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:47 compute-1 nova_compute[182713]: 2026-01-22 00:56:47.354 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:50 compute-1 podman[251647]: 2026-01-22 00:56:50.592016821 +0000 UTC m=+0.069440731 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:56:50 compute-1 podman[251646]: 2026-01-22 00:56:50.642304447 +0000 UTC m=+0.129326354 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:56:52 compute-1 nova_compute[182713]: 2026-01-22 00:56:52.355 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:52 compute-1 nova_compute[182713]: 2026-01-22 00:56:52.357 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:56:52 compute-1 nova_compute[182713]: 2026-01-22 00:56:52.357 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:56:52 compute-1 nova_compute[182713]: 2026-01-22 00:56:52.357 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:52 compute-1 nova_compute[182713]: 2026-01-22 00:56:52.394 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:52 compute-1 nova_compute[182713]: 2026-01-22 00:56:52.395 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:56:57 compute-1 nova_compute[182713]: 2026-01-22 00:56:57.395 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:57 compute-1 podman[251697]: 2026-01-22 00:56:57.559308736 +0000 UTC m=+0.051244827 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:56:57 compute-1 podman[251698]: 2026-01-22 00:56:57.562625249 +0000 UTC m=+0.051912638 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:57:02 compute-1 nova_compute[182713]: 2026-01-22 00:57:02.398 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:57:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:57:03.078 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:57:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:57:03.079 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:57:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:57:03.079 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:57:07 compute-1 nova_compute[182713]: 2026-01-22 00:57:07.400 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:57:07 compute-1 nova_compute[182713]: 2026-01-22 00:57:07.402 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:57:07 compute-1 nova_compute[182713]: 2026-01-22 00:57:07.402 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:57:07 compute-1 nova_compute[182713]: 2026-01-22 00:57:07.402 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:57:07 compute-1 nova_compute[182713]: 2026-01-22 00:57:07.433 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:07 compute-1 nova_compute[182713]: 2026-01-22 00:57:07.434 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:57:11 compute-1 podman[251739]: 2026-01-22 00:57:11.626170331 +0000 UTC m=+0.109714397 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 00:57:11 compute-1 podman[251740]: 2026-01-22 00:57:11.62742642 +0000 UTC m=+0.105988641 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9)
Jan 22 00:57:12 compute-1 nova_compute[182713]: 2026-01-22 00:57:12.435 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:12 compute-1 nova_compute[182713]: 2026-01-22 00:57:12.437 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:57:15 compute-1 nova_compute[182713]: 2026-01-22 00:57:15.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:17 compute-1 nova_compute[182713]: 2026-01-22 00:57:17.437 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:20 compute-1 nova_compute[182713]: 2026-01-22 00:57:20.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:21 compute-1 podman[251781]: 2026-01-22 00:57:21.596182618 +0000 UTC m=+0.069073618 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:57:21 compute-1 podman[251780]: 2026-01-22 00:57:21.641613275 +0000 UTC m=+0.113189915 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:57:22 compute-1 nova_compute[182713]: 2026-01-22 00:57:22.439 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:24 compute-1 nova_compute[182713]: 2026-01-22 00:57:24.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:26 compute-1 nova_compute[182713]: 2026-01-22 00:57:26.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:26 compute-1 nova_compute[182713]: 2026-01-22 00:57:26.881 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:57:26 compute-1 nova_compute[182713]: 2026-01-22 00:57:26.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:57:26 compute-1 nova_compute[182713]: 2026-01-22 00:57:26.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:57:26 compute-1 nova_compute[182713]: 2026-01-22 00:57:26.882 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:57:27 compute-1 nova_compute[182713]: 2026-01-22 00:57:27.052 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:57:27 compute-1 nova_compute[182713]: 2026-01-22 00:57:27.053 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5690MB free_disk=73.17708587646484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:57:27 compute-1 nova_compute[182713]: 2026-01-22 00:57:27.053 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:57:27 compute-1 nova_compute[182713]: 2026-01-22 00:57:27.054 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:57:27 compute-1 nova_compute[182713]: 2026-01-22 00:57:27.151 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:57:27 compute-1 nova_compute[182713]: 2026-01-22 00:57:27.151 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:57:27 compute-1 nova_compute[182713]: 2026-01-22 00:57:27.174 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:57:27 compute-1 nova_compute[182713]: 2026-01-22 00:57:27.195 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:57:27 compute-1 nova_compute[182713]: 2026-01-22 00:57:27.197 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:57:27 compute-1 nova_compute[182713]: 2026-01-22 00:57:27.197 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:57:27 compute-1 nova_compute[182713]: 2026-01-22 00:57:27.441 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:57:28 compute-1 podman[251831]: 2026-01-22 00:57:28.564759873 +0000 UTC m=+0.052343452 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:57:28 compute-1 podman[251830]: 2026-01-22 00:57:28.573803993 +0000 UTC m=+0.066150879 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 00:57:30 compute-1 nova_compute[182713]: 2026-01-22 00:57:30.198 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:30 compute-1 nova_compute[182713]: 2026-01-22 00:57:30.198 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:57:30 compute-1 nova_compute[182713]: 2026-01-22 00:57:30.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:30 compute-1 nova_compute[182713]: 2026-01-22 00:57:30.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:32 compute-1 nova_compute[182713]: 2026-01-22 00:57:32.445 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:33 compute-1 nova_compute[182713]: 2026-01-22 00:57:33.859 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:34 compute-1 nova_compute[182713]: 2026-01-22 00:57:34.859 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:34 compute-1 nova_compute[182713]: 2026-01-22 00:57:34.860 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:57:34 compute-1 nova_compute[182713]: 2026-01-22 00:57:34.860 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:57:34 compute-1 nova_compute[182713]: 2026-01-22 00:57:34.875 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:57:37 compute-1 nova_compute[182713]: 2026-01-22 00:57:37.447 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:42 compute-1 nova_compute[182713]: 2026-01-22 00:57:42.450 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:42 compute-1 podman[251872]: 2026-01-22 00:57:42.586715297 +0000 UTC m=+0.081152902 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:57:42 compute-1 podman[251873]: 2026-01-22 00:57:42.597339486 +0000 UTC m=+0.082582537 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Jan 22 00:57:47 compute-1 nova_compute[182713]: 2026-01-22 00:57:47.453 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:57:47 compute-1 nova_compute[182713]: 2026-01-22 00:57:47.454 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:57:47 compute-1 nova_compute[182713]: 2026-01-22 00:57:47.454 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:57:47 compute-1 nova_compute[182713]: 2026-01-22 00:57:47.454 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:57:47 compute-1 nova_compute[182713]: 2026-01-22 00:57:47.501 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:47 compute-1 nova_compute[182713]: 2026-01-22 00:57:47.502 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:57:52 compute-1 nova_compute[182713]: 2026-01-22 00:57:52.503 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:57:52 compute-1 nova_compute[182713]: 2026-01-22 00:57:52.505 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:57:52 compute-1 nova_compute[182713]: 2026-01-22 00:57:52.505 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:57:52 compute-1 nova_compute[182713]: 2026-01-22 00:57:52.505 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:57:52 compute-1 nova_compute[182713]: 2026-01-22 00:57:52.548 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:52 compute-1 nova_compute[182713]: 2026-01-22 00:57:52.550 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:57:52 compute-1 podman[251914]: 2026-01-22 00:57:52.613641337 +0000 UTC m=+0.094155317 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:57:52 compute-1 podman[251913]: 2026-01-22 00:57:52.639965561 +0000 UTC m=+0.122544394 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:57:57 compute-1 nova_compute[182713]: 2026-01-22 00:57:57.550 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:57 compute-1 nova_compute[182713]: 2026-01-22 00:57:57.553 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:57:59 compute-1 podman[251964]: 2026-01-22 00:57:59.595076499 +0000 UTC m=+0.084120425 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:57:59 compute-1 podman[251963]: 2026-01-22 00:57:59.609196356 +0000 UTC m=+0.104812946 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:58:02 compute-1 nova_compute[182713]: 2026-01-22 00:58:02.553 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:02 compute-1 nova_compute[182713]: 2026-01-22 00:58:02.556 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:02 compute-1 nova_compute[182713]: 2026-01-22 00:58:02.556 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:58:02 compute-1 nova_compute[182713]: 2026-01-22 00:58:02.557 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:58:02 compute-1 nova_compute[182713]: 2026-01-22 00:58:02.608 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:02 compute-1 nova_compute[182713]: 2026-01-22 00:58:02.609 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:58:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:58:03.080 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:58:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:58:03.081 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:58:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:58:03.081 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:58:07 compute-1 nova_compute[182713]: 2026-01-22 00:58:07.611 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:11 compute-1 sshd-session[252006]: Invalid user jito from 92.118.39.95 port 35934
Jan 22 00:58:11 compute-1 sshd-session[252006]: Connection closed by invalid user jito 92.118.39.95 port 35934 [preauth]
Jan 22 00:58:12 compute-1 nova_compute[182713]: 2026-01-22 00:58:12.613 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:13 compute-1 podman[252009]: 2026-01-22 00:58:13.627256749 +0000 UTC m=+0.097672865 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Jan 22 00:58:13 compute-1 podman[252008]: 2026-01-22 00:58:13.633750829 +0000 UTC m=+0.110741608 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:58:15 compute-1 nova_compute[182713]: 2026-01-22 00:58:15.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.616 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.857 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.859 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.861 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.861 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.862 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.863 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.893 182717 DEBUG nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.894 182717 WARNING nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.895 182717 WARNING nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.896 182717 WARNING nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Unknown base file: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.896 182717 WARNING nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Unknown base file: /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.896 182717 INFO nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Removable base files: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.897 182717 INFO nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.897 182717 INFO nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3224e12fdef635d4ff8f1066ef61cd1ea8e72c9b
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.898 182717 INFO nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.898 182717 INFO nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/4546bcf384626c54ce60a485a9f0fede193badcf
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.899 182717 DEBUG nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.899 182717 DEBUG nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 22 00:58:17 compute-1 nova_compute[182713]: 2026-01-22 00:58:17.899 182717 DEBUG nova.virt.libvirt.imagecache [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 22 00:58:22 compute-1 nova_compute[182713]: 2026-01-22 00:58:22.619 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:22 compute-1 nova_compute[182713]: 2026-01-22 00:58:22.621 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:22 compute-1 nova_compute[182713]: 2026-01-22 00:58:22.621 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:58:22 compute-1 nova_compute[182713]: 2026-01-22 00:58:22.622 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:58:22 compute-1 nova_compute[182713]: 2026-01-22 00:58:22.646 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:22 compute-1 nova_compute[182713]: 2026-01-22 00:58:22.647 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:58:22 compute-1 nova_compute[182713]: 2026-01-22 00:58:22.894 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 00:58:22.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-1 podman[252049]: 2026-01-22 00:58:23.57675795 +0000 UTC m=+0.059079619 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:58:23 compute-1 podman[252048]: 2026-01-22 00:58:23.612230598 +0000 UTC m=+0.099329306 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:58:25 compute-1 nova_compute[182713]: 2026-01-22 00:58:25.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:27 compute-1 nova_compute[182713]: 2026-01-22 00:58:27.648 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:27 compute-1 nova_compute[182713]: 2026-01-22 00:58:27.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:27 compute-1 nova_compute[182713]: 2026-01-22 00:58:27.885 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:58:27 compute-1 nova_compute[182713]: 2026-01-22 00:58:27.885 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:58:27 compute-1 nova_compute[182713]: 2026-01-22 00:58:27.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:58:27 compute-1 nova_compute[182713]: 2026-01-22 00:58:27.886 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:58:28 compute-1 nova_compute[182713]: 2026-01-22 00:58:28.111 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:58:28 compute-1 nova_compute[182713]: 2026-01-22 00:58:28.112 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5685MB free_disk=73.17708587646484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:58:28 compute-1 nova_compute[182713]: 2026-01-22 00:58:28.113 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:58:28 compute-1 nova_compute[182713]: 2026-01-22 00:58:28.113 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:58:28 compute-1 nova_compute[182713]: 2026-01-22 00:58:28.189 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:58:28 compute-1 nova_compute[182713]: 2026-01-22 00:58:28.189 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:58:28 compute-1 nova_compute[182713]: 2026-01-22 00:58:28.212 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:58:28 compute-1 nova_compute[182713]: 2026-01-22 00:58:28.234 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:58:28 compute-1 nova_compute[182713]: 2026-01-22 00:58:28.237 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:58:28 compute-1 nova_compute[182713]: 2026-01-22 00:58:28.237 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:58:29 compute-1 nova_compute[182713]: 2026-01-22 00:58:29.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:29 compute-1 nova_compute[182713]: 2026-01-22 00:58:29.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:58:29 compute-1 nova_compute[182713]: 2026-01-22 00:58:29.874 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:58:30 compute-1 nova_compute[182713]: 2026-01-22 00:58:30.870 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:30 compute-1 podman[252100]: 2026-01-22 00:58:30.889688681 +0000 UTC m=+0.075786675 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:58:30 compute-1 nova_compute[182713]: 2026-01-22 00:58:30.892 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:30 compute-1 podman[252099]: 2026-01-22 00:58:30.893562102 +0000 UTC m=+0.082418663 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:58:31 compute-1 nova_compute[182713]: 2026-01-22 00:58:31.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:31 compute-1 nova_compute[182713]: 2026-01-22 00:58:31.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:58:32 compute-1 nova_compute[182713]: 2026-01-22 00:58:32.652 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:32 compute-1 nova_compute[182713]: 2026-01-22 00:58:32.681 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:32 compute-1 nova_compute[182713]: 2026-01-22 00:58:32.682 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:58:32 compute-1 nova_compute[182713]: 2026-01-22 00:58:32.682 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:58:32 compute-1 nova_compute[182713]: 2026-01-22 00:58:32.683 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:58:32 compute-1 nova_compute[182713]: 2026-01-22 00:58:32.685 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:32 compute-1 nova_compute[182713]: 2026-01-22 00:58:32.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:32 compute-1 nova_compute[182713]: 2026-01-22 00:58:32.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:33 compute-1 nova_compute[182713]: 2026-01-22 00:58:33.874 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:33 compute-1 nova_compute[182713]: 2026-01-22 00:58:33.875 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:58:34 compute-1 nova_compute[182713]: 2026-01-22 00:58:34.877 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:34 compute-1 nova_compute[182713]: 2026-01-22 00:58:34.878 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:58:34 compute-1 nova_compute[182713]: 2026-01-22 00:58:34.879 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:58:34 compute-1 nova_compute[182713]: 2026-01-22 00:58:34.898 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:58:35 compute-1 nova_compute[182713]: 2026-01-22 00:58:35.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:37 compute-1 nova_compute[182713]: 2026-01-22 00:58:37.687 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:37 compute-1 nova_compute[182713]: 2026-01-22 00:58:37.689 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:37 compute-1 nova_compute[182713]: 2026-01-22 00:58:37.689 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:58:37 compute-1 nova_compute[182713]: 2026-01-22 00:58:37.690 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:58:37 compute-1 nova_compute[182713]: 2026-01-22 00:58:37.732 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:37 compute-1 nova_compute[182713]: 2026-01-22 00:58:37.734 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:58:42 compute-1 nova_compute[182713]: 2026-01-22 00:58:42.734 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:44 compute-1 podman[252138]: 2026-01-22 00:58:44.609229948 +0000 UTC m=+0.093989181 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:58:44 compute-1 podman[252139]: 2026-01-22 00:58:44.6341821 +0000 UTC m=+0.102685539 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Jan 22 00:58:47 compute-1 nova_compute[182713]: 2026-01-22 00:58:47.736 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:52 compute-1 nova_compute[182713]: 2026-01-22 00:58:52.738 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:54 compute-1 podman[252184]: 2026-01-22 00:58:54.609787538 +0000 UTC m=+0.083656010 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:58:54 compute-1 podman[252183]: 2026-01-22 00:58:54.619302612 +0000 UTC m=+0.111624766 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 00:58:57 compute-1 nova_compute[182713]: 2026-01-22 00:58:57.740 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:57 compute-1 nova_compute[182713]: 2026-01-22 00:58:57.742 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:58:57 compute-1 nova_compute[182713]: 2026-01-22 00:58:57.743 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:58:57 compute-1 nova_compute[182713]: 2026-01-22 00:58:57.743 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:58:57 compute-1 nova_compute[182713]: 2026-01-22 00:58:57.785 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:57 compute-1 nova_compute[182713]: 2026-01-22 00:58:57.786 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:01 compute-1 podman[252234]: 2026-01-22 00:59:01.590216501 +0000 UTC m=+0.077153499 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:59:01 compute-1 podman[252235]: 2026-01-22 00:59:01.599942452 +0000 UTC m=+0.077700266 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:59:02 compute-1 nova_compute[182713]: 2026-01-22 00:59:02.787 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:59:03.081 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:59:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:59:03.082 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:59:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 00:59:03.082 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:59:07 compute-1 nova_compute[182713]: 2026-01-22 00:59:07.789 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:11 compute-1 nova_compute[182713]: 2026-01-22 00:59:11.234 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:12 compute-1 nova_compute[182713]: 2026-01-22 00:59:12.790 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:15 compute-1 podman[252277]: 2026-01-22 00:59:15.60520534 +0000 UTC m=+0.077359975 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:59:15 compute-1 podman[252278]: 2026-01-22 00:59:15.61553706 +0000 UTC m=+0.085402284 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Jan 22 00:59:15 compute-1 nova_compute[182713]: 2026-01-22 00:59:15.875 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:17 compute-1 nova_compute[182713]: 2026-01-22 00:59:17.794 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:22 compute-1 nova_compute[182713]: 2026-01-22 00:59:22.796 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:22 compute-1 nova_compute[182713]: 2026-01-22 00:59:22.797 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:22 compute-1 nova_compute[182713]: 2026-01-22 00:59:22.798 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:59:22 compute-1 nova_compute[182713]: 2026-01-22 00:59:22.798 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:22 compute-1 nova_compute[182713]: 2026-01-22 00:59:22.799 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:22 compute-1 nova_compute[182713]: 2026-01-22 00:59:22.801 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:23 compute-1 nova_compute[182713]: 2026-01-22 00:59:23.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:25 compute-1 podman[252314]: 2026-01-22 00:59:25.585269168 +0000 UTC m=+0.068293995 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:59:25 compute-1 podman[252313]: 2026-01-22 00:59:25.623194412 +0000 UTC m=+0.117003772 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:59:26 compute-1 nova_compute[182713]: 2026-01-22 00:59:26.859 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:27 compute-1 nova_compute[182713]: 2026-01-22 00:59:27.832 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:27 compute-1 nova_compute[182713]: 2026-01-22 00:59:27.834 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:27 compute-1 nova_compute[182713]: 2026-01-22 00:59:27.834 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:59:27 compute-1 nova_compute[182713]: 2026-01-22 00:59:27.834 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:27 compute-1 nova_compute[182713]: 2026-01-22 00:59:27.835 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:27 compute-1 nova_compute[182713]: 2026-01-22 00:59:27.835 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:29 compute-1 nova_compute[182713]: 2026-01-22 00:59:29.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:29 compute-1 nova_compute[182713]: 2026-01-22 00:59:29.895 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:59:29 compute-1 nova_compute[182713]: 2026-01-22 00:59:29.896 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:59:29 compute-1 nova_compute[182713]: 2026-01-22 00:59:29.896 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:59:29 compute-1 nova_compute[182713]: 2026-01-22 00:59:29.897 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:59:30 compute-1 nova_compute[182713]: 2026-01-22 00:59:30.123 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:59:30 compute-1 nova_compute[182713]: 2026-01-22 00:59:30.125 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5691MB free_disk=73.17708587646484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:59:30 compute-1 nova_compute[182713]: 2026-01-22 00:59:30.125 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:59:30 compute-1 nova_compute[182713]: 2026-01-22 00:59:30.125 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:59:30 compute-1 nova_compute[182713]: 2026-01-22 00:59:30.205 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:59:30 compute-1 nova_compute[182713]: 2026-01-22 00:59:30.205 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:59:30 compute-1 nova_compute[182713]: 2026-01-22 00:59:30.236 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:59:30 compute-1 nova_compute[182713]: 2026-01-22 00:59:30.256 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:59:30 compute-1 nova_compute[182713]: 2026-01-22 00:59:30.259 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:59:30 compute-1 nova_compute[182713]: 2026-01-22 00:59:30.260 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:59:32 compute-1 nova_compute[182713]: 2026-01-22 00:59:32.260 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:32 compute-1 nova_compute[182713]: 2026-01-22 00:59:32.261 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:32 compute-1 nova_compute[182713]: 2026-01-22 00:59:32.261 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:59:32 compute-1 podman[252366]: 2026-01-22 00:59:32.596535053 +0000 UTC m=+0.077433227 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:59:32 compute-1 podman[252365]: 2026-01-22 00:59:32.604846851 +0000 UTC m=+0.089384128 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:59:32 compute-1 nova_compute[182713]: 2026-01-22 00:59:32.836 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:33 compute-1 nova_compute[182713]: 2026-01-22 00:59:33.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:35 compute-1 nova_compute[182713]: 2026-01-22 00:59:35.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:35 compute-1 nova_compute[182713]: 2026-01-22 00:59:35.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:59:35 compute-1 nova_compute[182713]: 2026-01-22 00:59:35.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:59:35 compute-1 nova_compute[182713]: 2026-01-22 00:59:35.881 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:59:35 compute-1 nova_compute[182713]: 2026-01-22 00:59:35.882 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:37 compute-1 nova_compute[182713]: 2026-01-22 00:59:37.839 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:42 compute-1 nova_compute[182713]: 2026-01-22 00:59:42.841 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:42 compute-1 nova_compute[182713]: 2026-01-22 00:59:42.843 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:42 compute-1 nova_compute[182713]: 2026-01-22 00:59:42.844 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:59:42 compute-1 nova_compute[182713]: 2026-01-22 00:59:42.844 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:42 compute-1 nova_compute[182713]: 2026-01-22 00:59:42.865 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:42 compute-1 nova_compute[182713]: 2026-01-22 00:59:42.865 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:46 compute-1 podman[252409]: 2026-01-22 00:59:46.598876161 +0000 UTC m=+0.082749052 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Jan 22 00:59:46 compute-1 podman[252408]: 2026-01-22 00:59:46.610827691 +0000 UTC m=+0.098640445 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:59:47 compute-1 nova_compute[182713]: 2026-01-22 00:59:47.866 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:47 compute-1 nova_compute[182713]: 2026-01-22 00:59:47.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:47 compute-1 nova_compute[182713]: 2026-01-22 00:59:47.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:59:47 compute-1 nova_compute[182713]: 2026-01-22 00:59:47.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:47 compute-1 nova_compute[182713]: 2026-01-22 00:59:47.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:47 compute-1 nova_compute[182713]: 2026-01-22 00:59:47.869 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:52 compute-1 nova_compute[182713]: 2026-01-22 00:59:52.870 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:52 compute-1 nova_compute[182713]: 2026-01-22 00:59:52.870 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:52 compute-1 nova_compute[182713]: 2026-01-22 00:59:52.871 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:59:52 compute-1 nova_compute[182713]: 2026-01-22 00:59:52.871 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:52 compute-1 nova_compute[182713]: 2026-01-22 00:59:52.871 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:52 compute-1 nova_compute[182713]: 2026-01-22 00:59:52.872 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:56 compute-1 podman[252448]: 2026-01-22 00:59:56.618179345 +0000 UTC m=+0.090467581 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:59:56 compute-1 podman[252447]: 2026-01-22 00:59:56.711592256 +0000 UTC m=+0.192901862 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:59:57 compute-1 nova_compute[182713]: 2026-01-22 00:59:57.873 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:57 compute-1 nova_compute[182713]: 2026-01-22 00:59:57.875 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:59:57 compute-1 nova_compute[182713]: 2026-01-22 00:59:57.875 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:59:57 compute-1 nova_compute[182713]: 2026-01-22 00:59:57.875 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:59:57 compute-1 nova_compute[182713]: 2026-01-22 00:59:57.928 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:57 compute-1 nova_compute[182713]: 2026-01-22 00:59:57.929 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:00:02 compute-1 nova_compute[182713]: 2026-01-22 01:00:02.930 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:00:02 compute-1 nova_compute[182713]: 2026-01-22 01:00:02.932 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:00:03.082 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:00:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:00:03.083 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:00:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:00:03.083 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:00:03 compute-1 podman[252497]: 2026-01-22 01:00:03.55872424 +0000 UTC m=+0.053211128 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:00:03 compute-1 podman[252498]: 2026-01-22 01:00:03.559002689 +0000 UTC m=+0.048884494 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 01:00:07 compute-1 nova_compute[182713]: 2026-01-22 01:00:07.933 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:00:07 compute-1 nova_compute[182713]: 2026-01-22 01:00:07.935 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:00:07 compute-1 nova_compute[182713]: 2026-01-22 01:00:07.935 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:00:07 compute-1 nova_compute[182713]: 2026-01-22 01:00:07.935 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:00:07 compute-1 nova_compute[182713]: 2026-01-22 01:00:07.981 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:07 compute-1 nova_compute[182713]: 2026-01-22 01:00:07.981 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:00:12 compute-1 nova_compute[182713]: 2026-01-22 01:00:12.982 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:00:12 compute-1 nova_compute[182713]: 2026-01-22 01:00:12.984 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:00:13 compute-1 sshd-session[252541]: Invalid user jito from 92.118.39.95 port 34896
Jan 22 01:00:13 compute-1 sshd-session[252541]: Connection closed by invalid user jito 92.118.39.95 port 34896 [preauth]
Jan 22 01:00:17 compute-1 podman[252543]: 2026-01-22 01:00:17.593764051 +0000 UTC m=+0.080008227 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Jan 22 01:00:17 compute-1 podman[252544]: 2026-01-22 01:00:17.629114585 +0000 UTC m=+0.101625246 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Jan 22 01:00:17 compute-1 nova_compute[182713]: 2026-01-22 01:00:17.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:17 compute-1 nova_compute[182713]: 2026-01-22 01:00:17.985 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:00:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:22 compute-1 nova_compute[182713]: 2026-01-22 01:00:22.987 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:00:24 compute-1 nova_compute[182713]: 2026-01-22 01:00:24.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:27 compute-1 podman[252585]: 2026-01-22 01:00:27.635152696 +0000 UTC m=+0.110383067 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 01:00:27 compute-1 podman[252584]: 2026-01-22 01:00:27.655539178 +0000 UTC m=+0.137538179 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 01:00:27 compute-1 nova_compute[182713]: 2026-01-22 01:00:27.989 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:00:27 compute-1 nova_compute[182713]: 2026-01-22 01:00:27.991 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:27 compute-1 nova_compute[182713]: 2026-01-22 01:00:27.991 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:00:27 compute-1 nova_compute[182713]: 2026-01-22 01:00:27.992 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:00:27 compute-1 nova_compute[182713]: 2026-01-22 01:00:27.993 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:00:27 compute-1 nova_compute[182713]: 2026-01-22 01:00:27.995 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:00:28 compute-1 nova_compute[182713]: 2026-01-22 01:00:28.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:30 compute-1 nova_compute[182713]: 2026-01-22 01:00:30.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:30 compute-1 nova_compute[182713]: 2026-01-22 01:00:30.913 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:00:30 compute-1 nova_compute[182713]: 2026-01-22 01:00:30.913 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:00:30 compute-1 nova_compute[182713]: 2026-01-22 01:00:30.914 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:00:30 compute-1 nova_compute[182713]: 2026-01-22 01:00:30.914 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:00:31 compute-1 nova_compute[182713]: 2026-01-22 01:00:31.106 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:00:31 compute-1 nova_compute[182713]: 2026-01-22 01:00:31.108 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5700MB free_disk=73.17708587646484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:00:31 compute-1 nova_compute[182713]: 2026-01-22 01:00:31.108 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:00:31 compute-1 nova_compute[182713]: 2026-01-22 01:00:31.108 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.016 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.017 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.272 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.491 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.492 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.512 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.541 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.588 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.627 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.629 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.630 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.991 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:32 compute-1 nova_compute[182713]: 2026-01-22 01:00:32.994 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:34 compute-1 podman[252637]: 2026-01-22 01:00:34.574156218 +0000 UTC m=+0.053496227 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 01:00:34 compute-1 podman[252636]: 2026-01-22 01:00:34.600716309 +0000 UTC m=+0.096037763 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 01:00:34 compute-1 nova_compute[182713]: 2026-01-22 01:00:34.631 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:34 compute-1 nova_compute[182713]: 2026-01-22 01:00:34.632 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:34 compute-1 nova_compute[182713]: 2026-01-22 01:00:34.633 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:00:34 compute-1 nova_compute[182713]: 2026-01-22 01:00:34.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:35 compute-1 nova_compute[182713]: 2026-01-22 01:00:35.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:35 compute-1 nova_compute[182713]: 2026-01-22 01:00:35.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:00:35 compute-1 nova_compute[182713]: 2026-01-22 01:00:35.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:00:35 compute-1 nova_compute[182713]: 2026-01-22 01:00:35.974 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:00:35 compute-1 nova_compute[182713]: 2026-01-22 01:00:35.975 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:36 compute-1 nova_compute[182713]: 2026-01-22 01:00:36.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:37 compute-1 nova_compute[182713]: 2026-01-22 01:00:37.995 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:42 compute-1 nova_compute[182713]: 2026-01-22 01:00:42.998 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:48 compute-1 nova_compute[182713]: 2026-01-22 01:00:47.999 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:48 compute-1 nova_compute[182713]: 2026-01-22 01:00:48.001 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:48 compute-1 nova_compute[182713]: 2026-01-22 01:00:48.366 182717 DEBUG oslo_concurrency.processutils [None req-3383b229-0645-4d84-a161-f817197299df c798bde61dce4297a27213eac66acb7f 43b70c4e837343859ac97b6b2397ba1b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 01:00:48 compute-1 nova_compute[182713]: 2026-01-22 01:00:48.404 182717 DEBUG oslo_concurrency.processutils [None req-3383b229-0645-4d84-a161-f817197299df c798bde61dce4297a27213eac66acb7f 43b70c4e837343859ac97b6b2397ba1b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 01:00:48 compute-1 podman[252681]: 2026-01-22 01:00:48.600049365 +0000 UTC m=+0.078133589 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal 
Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container)
Jan 22 01:00:48 compute-1 podman[252680]: 2026-01-22 01:00:48.61219398 +0000 UTC m=+0.090763990 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 01:00:53 compute-1 nova_compute[182713]: 2026-01-22 01:00:53.002 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:58 compute-1 nova_compute[182713]: 2026-01-22 01:00:58.003 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:58 compute-1 nova_compute[182713]: 2026-01-22 01:00:58.005 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:58 compute-1 podman[252723]: 2026-01-22 01:00:58.570247168 +0000 UTC m=+0.058070639 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 01:00:58 compute-1 podman[252722]: 2026-01-22 01:00:58.653885516 +0000 UTC m=+0.149232239 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 01:00:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:00:59.124 104184 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 01:00:59 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:00:59.125 104184 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 01:00:59 compute-1 nova_compute[182713]: 2026-01-22 01:00:59.125 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:01 compute-1 CROND[252774]: (root) CMD (run-parts /etc/cron.hourly)
Jan 22 01:01:01 compute-1 run-parts[252777]: (/etc/cron.hourly) starting 0anacron
Jan 22 01:01:01 compute-1 anacron[252785]: Anacron started on 2026-01-22
Jan 22 01:01:01 compute-1 anacron[252785]: Normal exit (0 jobs run)
Jan 22 01:01:01 compute-1 run-parts[252787]: (/etc/cron.hourly) finished 0anacron
Jan 22 01:01:01 compute-1 CROND[252773]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 22 01:01:03 compute-1 nova_compute[182713]: 2026-01-22 01:01:03.005 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:01:03.083 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:01:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:01:03.084 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:01:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:01:03.084 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:01:05 compute-1 podman[252789]: 2026-01-22 01:01:05.572084362 +0000 UTC m=+0.059516674 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:01:05 compute-1 podman[252788]: 2026-01-22 01:01:05.599243173 +0000 UTC m=+0.078718438 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 22 01:01:07 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:01:07.127 104184 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=74526b6d-b1ca-423f-9094-b845f8b97526, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 01:01:08 compute-1 nova_compute[182713]: 2026-01-22 01:01:08.007 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:13 compute-1 nova_compute[182713]: 2026-01-22 01:01:13.010 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:01:18 compute-1 nova_compute[182713]: 2026-01-22 01:01:18.013 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:19 compute-1 podman[252831]: 2026-01-22 01:01:19.589758774 +0000 UTC m=+0.075144867 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 01:01:19 compute-1 podman[252832]: 2026-01-22 01:01:19.594307706 +0000 UTC m=+0.075108637 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 01:01:19 compute-1 nova_compute[182713]: 2026-01-22 01:01:19.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:23 compute-1 nova_compute[182713]: 2026-01-22 01:01:23.017 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:01:23 compute-1 nova_compute[182713]: 2026-01-22 01:01:23.021 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:26 compute-1 nova_compute[182713]: 2026-01-22 01:01:26.853 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:28 compute-1 nova_compute[182713]: 2026-01-22 01:01:28.020 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:28 compute-1 nova_compute[182713]: 2026-01-22 01:01:28.023 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:28 compute-1 nova_compute[182713]: 2026-01-22 01:01:28.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:29 compute-1 podman[252872]: 2026-01-22 01:01:29.612685079 +0000 UTC m=+0.090758191 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 01:01:29 compute-1 podman[252871]: 2026-01-22 01:01:29.683829131 +0000 UTC m=+0.167049412 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:01:30 compute-1 nova_compute[182713]: 2026-01-22 01:01:30.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:30 compute-1 nova_compute[182713]: 2026-01-22 01:01:30.883 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:01:30 compute-1 nova_compute[182713]: 2026-01-22 01:01:30.884 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:01:30 compute-1 nova_compute[182713]: 2026-01-22 01:01:30.884 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:01:30 compute-1 nova_compute[182713]: 2026-01-22 01:01:30.885 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:01:31 compute-1 nova_compute[182713]: 2026-01-22 01:01:31.058 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:01:31 compute-1 nova_compute[182713]: 2026-01-22 01:01:31.059 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5692MB free_disk=73.17708587646484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:01:31 compute-1 nova_compute[182713]: 2026-01-22 01:01:31.060 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:01:31 compute-1 nova_compute[182713]: 2026-01-22 01:01:31.060 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:01:31 compute-1 nova_compute[182713]: 2026-01-22 01:01:31.155 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:01:31 compute-1 nova_compute[182713]: 2026-01-22 01:01:31.155 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:01:31 compute-1 nova_compute[182713]: 2026-01-22 01:01:31.184 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:01:31 compute-1 nova_compute[182713]: 2026-01-22 01:01:31.217 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:01:31 compute-1 nova_compute[182713]: 2026-01-22 01:01:31.219 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:01:31 compute-1 nova_compute[182713]: 2026-01-22 01:01:31.219 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:01:33 compute-1 nova_compute[182713]: 2026-01-22 01:01:33.025 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:01:33 compute-1 nova_compute[182713]: 2026-01-22 01:01:33.028 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:34 compute-1 nova_compute[182713]: 2026-01-22 01:01:34.220 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:34 compute-1 nova_compute[182713]: 2026-01-22 01:01:34.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:34 compute-1 nova_compute[182713]: 2026-01-22 01:01:34.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:01:36 compute-1 podman[252921]: 2026-01-22 01:01:36.603970467 +0000 UTC m=+0.087257552 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:01:36 compute-1 podman[252922]: 2026-01-22 01:01:36.611763138 +0000 UTC m=+0.078425999 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:01:36 compute-1 nova_compute[182713]: 2026-01-22 01:01:36.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:36 compute-1 nova_compute[182713]: 2026-01-22 01:01:36.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:37 compute-1 nova_compute[182713]: 2026-01-22 01:01:37.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:37 compute-1 nova_compute[182713]: 2026-01-22 01:01:37.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:01:37 compute-1 nova_compute[182713]: 2026-01-22 01:01:37.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:01:37 compute-1 nova_compute[182713]: 2026-01-22 01:01:37.929 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:01:38 compute-1 nova_compute[182713]: 2026-01-22 01:01:38.029 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:01:43 compute-1 nova_compute[182713]: 2026-01-22 01:01:43.032 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:01:48 compute-1 nova_compute[182713]: 2026-01-22 01:01:48.034 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:01:48 compute-1 nova_compute[182713]: 2026-01-22 01:01:48.037 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:50 compute-1 podman[252965]: 2026-01-22 01:01:50.614160606 +0000 UTC m=+0.088724747 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Jan 22 01:01:50 compute-1 podman[252964]: 2026-01-22 01:01:50.632313018 +0000 UTC m=+0.113919957 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 01:01:53 compute-1 nova_compute[182713]: 2026-01-22 01:01:53.037 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:53 compute-1 nova_compute[182713]: 2026-01-22 01:01:53.039 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:58 compute-1 nova_compute[182713]: 2026-01-22 01:01:58.040 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:58 compute-1 nova_compute[182713]: 2026-01-22 01:01:58.044 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:00 compute-1 podman[253004]: 2026-01-22 01:02:00.619698551 +0000 UTC m=+0.098411056 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 01:02:00 compute-1 podman[253003]: 2026-01-22 01:02:00.643755147 +0000 UTC m=+0.119357346 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 01:02:03 compute-1 nova_compute[182713]: 2026-01-22 01:02:03.042 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:03 compute-1 nova_compute[182713]: 2026-01-22 01:02:03.045 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:02:03.084 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:02:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:02:03.085 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:02:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:02:03.085 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:02:07 compute-1 podman[253052]: 2026-01-22 01:02:07.595139379 +0000 UTC m=+0.076181700 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:02:07 compute-1 podman[253053]: 2026-01-22 01:02:07.602581569 +0000 UTC m=+0.076840419 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 01:02:08 compute-1 nova_compute[182713]: 2026-01-22 01:02:08.043 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:08 compute-1 nova_compute[182713]: 2026-01-22 01:02:08.046 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:13 compute-1 nova_compute[182713]: 2026-01-22 01:02:13.046 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:13 compute-1 nova_compute[182713]: 2026-01-22 01:02:13.048 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:18 compute-1 nova_compute[182713]: 2026-01-22 01:02:18.048 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:18 compute-1 nova_compute[182713]: 2026-01-22 01:02:18.050 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:19 compute-1 nova_compute[182713]: 2026-01-22 01:02:19.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:21 compute-1 podman[253092]: 2026-01-22 01:02:21.606963319 +0000 UTC m=+0.090030648 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.)
Jan 22 01:02:21 compute-1 podman[253091]: 2026-01-22 01:02:21.611309753 +0000 UTC m=+0.096845338 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:02:22.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-1 nova_compute[182713]: 2026-01-22 01:02:23.050 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:23 compute-1 nova_compute[182713]: 2026-01-22 01:02:23.052 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:26 compute-1 sshd-session[253130]: Connection closed by authenticating user root 92.118.39.95 port 33870 [preauth]
Jan 22 01:02:26 compute-1 nova_compute[182713]: 2026-01-22 01:02:26.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:28 compute-1 nova_compute[182713]: 2026-01-22 01:02:28.052 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:28 compute-1 nova_compute[182713]: 2026-01-22 01:02:28.054 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:29 compute-1 nova_compute[182713]: 2026-01-22 01:02:29.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:31 compute-1 podman[253133]: 2026-01-22 01:02:31.594361253 +0000 UTC m=+0.080612506 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 01:02:31 compute-1 podman[253132]: 2026-01-22 01:02:31.6097487 +0000 UTC m=+0.101143212 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 01:02:32 compute-1 nova_compute[182713]: 2026-01-22 01:02:32.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:32 compute-1 nova_compute[182713]: 2026-01-22 01:02:32.900 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:02:32 compute-1 nova_compute[182713]: 2026-01-22 01:02:32.901 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:02:32 compute-1 nova_compute[182713]: 2026-01-22 01:02:32.901 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:02:32 compute-1 nova_compute[182713]: 2026-01-22 01:02:32.901 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.053 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.056 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.088 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.090 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5689MB free_disk=73.17708587646484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.090 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.090 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.366 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.366 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.405 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.448 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.451 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:02:33 compute-1 nova_compute[182713]: 2026-01-22 01:02:33.452 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:02:35 compute-1 nova_compute[182713]: 2026-01-22 01:02:35.448 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:35 compute-1 nova_compute[182713]: 2026-01-22 01:02:35.469 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:35 compute-1 nova_compute[182713]: 2026-01-22 01:02:35.469 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:35 compute-1 nova_compute[182713]: 2026-01-22 01:02:35.469 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:02:36 compute-1 nova_compute[182713]: 2026-01-22 01:02:36.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:38 compute-1 nova_compute[182713]: 2026-01-22 01:02:38.055 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:38 compute-1 nova_compute[182713]: 2026-01-22 01:02:38.058 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:38 compute-1 podman[253182]: 2026-01-22 01:02:38.586818898 +0000 UTC m=+0.075342653 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:02:38 compute-1 podman[253183]: 2026-01-22 01:02:38.600702767 +0000 UTC m=+0.093452734 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 01:02:38 compute-1 nova_compute[182713]: 2026-01-22 01:02:38.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:38 compute-1 nova_compute[182713]: 2026-01-22 01:02:38.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:02:38 compute-1 nova_compute[182713]: 2026-01-22 01:02:38.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:02:38 compute-1 nova_compute[182713]: 2026-01-22 01:02:38.873 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:02:38 compute-1 nova_compute[182713]: 2026-01-22 01:02:38.873 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:43 compute-1 nova_compute[182713]: 2026-01-22 01:02:43.058 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:43 compute-1 nova_compute[182713]: 2026-01-22 01:02:43.061 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:48 compute-1 nova_compute[182713]: 2026-01-22 01:02:48.062 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:48 compute-1 nova_compute[182713]: 2026-01-22 01:02:48.065 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:02:52 compute-1 podman[253224]: 2026-01-22 01:02:52.615073357 +0000 UTC m=+0.096441987 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 01:02:52 compute-1 podman[253225]: 2026-01-22 01:02:52.617471091 +0000 UTC m=+0.092307718 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Jan 22 01:02:53 compute-1 nova_compute[182713]: 2026-01-22 01:02:53.064 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:53 compute-1 nova_compute[182713]: 2026-01-22 01:02:53.065 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:58 compute-1 nova_compute[182713]: 2026-01-22 01:02:58.067 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:02:58 compute-1 nova_compute[182713]: 2026-01-22 01:02:58.069 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:02:58 compute-1 nova_compute[182713]: 2026-01-22 01:02:58.070 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:02:58 compute-1 nova_compute[182713]: 2026-01-22 01:02:58.070 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:02:58 compute-1 nova_compute[182713]: 2026-01-22 01:02:58.137 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:58 compute-1 nova_compute[182713]: 2026-01-22 01:02:58.138 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:02:58 compute-1 nova_compute[182713]: 2026-01-22 01:02:58.140 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:02 compute-1 podman[253265]: 2026-01-22 01:03:02.605946369 +0000 UTC m=+0.081940038 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 01:03:02 compute-1 podman[253264]: 2026-01-22 01:03:02.666891335 +0000 UTC m=+0.144871355 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 01:03:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:03:03.085 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:03:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:03:03.086 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:03:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:03:03.086 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:03:03 compute-1 nova_compute[182713]: 2026-01-22 01:03:03.138 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:03 compute-1 nova_compute[182713]: 2026-01-22 01:03:03.140 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:08 compute-1 nova_compute[182713]: 2026-01-22 01:03:08.141 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:08 compute-1 nova_compute[182713]: 2026-01-22 01:03:08.142 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:09 compute-1 podman[253316]: 2026-01-22 01:03:09.564410292 +0000 UTC m=+0.063127224 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 01:03:09 compute-1 podman[253317]: 2026-01-22 01:03:09.576980372 +0000 UTC m=+0.067786849 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:03:13 compute-1 nova_compute[182713]: 2026-01-22 01:03:13.144 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:13 compute-1 nova_compute[182713]: 2026-01-22 01:03:13.145 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:18 compute-1 nova_compute[182713]: 2026-01-22 01:03:18.145 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:18 compute-1 nova_compute[182713]: 2026-01-22 01:03:18.146 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:20 compute-1 nova_compute[182713]: 2026-01-22 01:03:20.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:23 compute-1 nova_compute[182713]: 2026-01-22 01:03:23.147 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:23 compute-1 nova_compute[182713]: 2026-01-22 01:03:23.148 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:23 compute-1 nova_compute[182713]: 2026-01-22 01:03:23.148 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:03:23 compute-1 nova_compute[182713]: 2026-01-22 01:03:23.148 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:23 compute-1 nova_compute[182713]: 2026-01-22 01:03:23.149 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:23 compute-1 nova_compute[182713]: 2026-01-22 01:03:23.149 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:23 compute-1 podman[253360]: 2026-01-22 01:03:23.259615884 +0000 UTC m=+0.079642357 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 01:03:23 compute-1 podman[253361]: 2026-01-22 01:03:23.27276289 +0000 UTC m=+0.083725442 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, io.buildah.version=1.33.7)
Jan 22 01:03:27 compute-1 nova_compute[182713]: 2026-01-22 01:03:27.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:28 compute-1 nova_compute[182713]: 2026-01-22 01:03:28.150 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:28 compute-1 nova_compute[182713]: 2026-01-22 01:03:28.151 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:28 compute-1 nova_compute[182713]: 2026-01-22 01:03:28.151 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:03:28 compute-1 nova_compute[182713]: 2026-01-22 01:03:28.152 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:28 compute-1 nova_compute[182713]: 2026-01-22 01:03:28.152 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:28 compute-1 nova_compute[182713]: 2026-01-22 01:03:28.153 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:30 compute-1 nova_compute[182713]: 2026-01-22 01:03:30.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:32 compute-1 nova_compute[182713]: 2026-01-22 01:03:32.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:32 compute-1 nova_compute[182713]: 2026-01-22 01:03:32.911 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:03:32 compute-1 nova_compute[182713]: 2026-01-22 01:03:32.912 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:03:32 compute-1 nova_compute[182713]: 2026-01-22 01:03:32.912 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:03:32 compute-1 nova_compute[182713]: 2026-01-22 01:03:32.913 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:03:33 compute-1 nova_compute[182713]: 2026-01-22 01:03:33.146 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:03:33 compute-1 nova_compute[182713]: 2026-01-22 01:03:33.147 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5684MB free_disk=73.17708587646484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:03:33 compute-1 nova_compute[182713]: 2026-01-22 01:03:33.148 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:03:33 compute-1 nova_compute[182713]: 2026-01-22 01:03:33.148 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:03:33 compute-1 nova_compute[182713]: 2026-01-22 01:03:33.154 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:33 compute-1 nova_compute[182713]: 2026-01-22 01:03:33.230 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:03:33 compute-1 nova_compute[182713]: 2026-01-22 01:03:33.230 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:03:33 compute-1 nova_compute[182713]: 2026-01-22 01:03:33.264 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:03:33 compute-1 nova_compute[182713]: 2026-01-22 01:03:33.300 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:03:33 compute-1 nova_compute[182713]: 2026-01-22 01:03:33.303 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:03:33 compute-1 nova_compute[182713]: 2026-01-22 01:03:33.303 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:03:33 compute-1 podman[253401]: 2026-01-22 01:03:33.601659895 +0000 UTC m=+0.083506686 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 01:03:33 compute-1 podman[253400]: 2026-01-22 01:03:33.664245822 +0000 UTC m=+0.149873199 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:03:35 compute-1 nova_compute[182713]: 2026-01-22 01:03:35.304 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:35 compute-1 nova_compute[182713]: 2026-01-22 01:03:35.305 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:03:35 compute-1 nova_compute[182713]: 2026-01-22 01:03:35.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:35 compute-1 nova_compute[182713]: 2026-01-22 01:03:35.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:35 compute-1 nova_compute[182713]: 2026-01-22 01:03:35.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 01:03:38 compute-1 nova_compute[182713]: 2026-01-22 01:03:38.157 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:38 compute-1 nova_compute[182713]: 2026-01-22 01:03:38.158 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:38 compute-1 nova_compute[182713]: 2026-01-22 01:03:38.158 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:03:38 compute-1 nova_compute[182713]: 2026-01-22 01:03:38.159 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:38 compute-1 nova_compute[182713]: 2026-01-22 01:03:38.195 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:38 compute-1 nova_compute[182713]: 2026-01-22 01:03:38.196 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:38 compute-1 nova_compute[182713]: 2026-01-22 01:03:38.875 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:38 compute-1 nova_compute[182713]: 2026-01-22 01:03:38.875 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:39 compute-1 nova_compute[182713]: 2026-01-22 01:03:39.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:39 compute-1 nova_compute[182713]: 2026-01-22 01:03:39.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:03:39 compute-1 nova_compute[182713]: 2026-01-22 01:03:39.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:03:39 compute-1 nova_compute[182713]: 2026-01-22 01:03:39.874 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:03:40 compute-1 podman[253451]: 2026-01-22 01:03:40.586222504 +0000 UTC m=+0.076240681 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:03:40 compute-1 podman[253450]: 2026-01-22 01:03:40.597666488 +0000 UTC m=+0.082412292 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Jan 22 01:03:43 compute-1 nova_compute[182713]: 2026-01-22 01:03:43.197 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:43 compute-1 nova_compute[182713]: 2026-01-22 01:03:43.199 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:43 compute-1 nova_compute[182713]: 2026-01-22 01:03:43.200 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:03:43 compute-1 nova_compute[182713]: 2026-01-22 01:03:43.200 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:43 compute-1 nova_compute[182713]: 2026-01-22 01:03:43.229 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:43 compute-1 nova_compute[182713]: 2026-01-22 01:03:43.229 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:43 compute-1 nova_compute[182713]: 2026-01-22 01:03:43.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:43 compute-1 nova_compute[182713]: 2026-01-22 01:03:43.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 01:03:43 compute-1 nova_compute[182713]: 2026-01-22 01:03:43.883 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 01:03:43 compute-1 nova_compute[182713]: 2026-01-22 01:03:43.883 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:48 compute-1 nova_compute[182713]: 2026-01-22 01:03:48.231 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:48 compute-1 nova_compute[182713]: 2026-01-22 01:03:48.233 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:48 compute-1 nova_compute[182713]: 2026-01-22 01:03:48.233 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:03:48 compute-1 nova_compute[182713]: 2026-01-22 01:03:48.234 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:48 compute-1 nova_compute[182713]: 2026-01-22 01:03:48.263 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:48 compute-1 nova_compute[182713]: 2026-01-22 01:03:48.263 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:53 compute-1 nova_compute[182713]: 2026-01-22 01:03:53.264 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:53 compute-1 nova_compute[182713]: 2026-01-22 01:03:53.266 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:53 compute-1 nova_compute[182713]: 2026-01-22 01:03:53.266 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:03:53 compute-1 nova_compute[182713]: 2026-01-22 01:03:53.267 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:53 compute-1 nova_compute[182713]: 2026-01-22 01:03:53.353 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:53 compute-1 nova_compute[182713]: 2026-01-22 01:03:53.354 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:53 compute-1 podman[253493]: 2026-01-22 01:03:53.604200014 +0000 UTC m=+0.093299500 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 01:03:53 compute-1 podman[253494]: 2026-01-22 01:03:53.614306486 +0000 UTC m=+0.087539311 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_id=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 01:03:58 compute-1 nova_compute[182713]: 2026-01-22 01:03:58.354 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:58 compute-1 nova_compute[182713]: 2026-01-22 01:03:58.356 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:58 compute-1 nova_compute[182713]: 2026-01-22 01:03:58.356 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:03:58 compute-1 nova_compute[182713]: 2026-01-22 01:03:58.356 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:58 compute-1 nova_compute[182713]: 2026-01-22 01:03:58.387 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:58 compute-1 nova_compute[182713]: 2026-01-22 01:03:58.387 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:04:03.098 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:04:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:04:03.099 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:04:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:04:03.099 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:04:03 compute-1 nova_compute[182713]: 2026-01-22 01:04:03.389 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:03 compute-1 nova_compute[182713]: 2026-01-22 01:04:03.391 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:03 compute-1 nova_compute[182713]: 2026-01-22 01:04:03.391 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:04:03 compute-1 nova_compute[182713]: 2026-01-22 01:04:03.392 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:03 compute-1 nova_compute[182713]: 2026-01-22 01:04:03.430 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:03 compute-1 nova_compute[182713]: 2026-01-22 01:04:03.431 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:04 compute-1 podman[253535]: 2026-01-22 01:04:04.623895259 +0000 UTC m=+0.098447237 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 01:04:04 compute-1 podman[253534]: 2026-01-22 01:04:04.643154635 +0000 UTC m=+0.130338255 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 01:04:08 compute-1 nova_compute[182713]: 2026-01-22 01:04:08.432 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:08 compute-1 nova_compute[182713]: 2026-01-22 01:04:08.434 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:08 compute-1 nova_compute[182713]: 2026-01-22 01:04:08.434 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:04:08 compute-1 nova_compute[182713]: 2026-01-22 01:04:08.435 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:08 compute-1 nova_compute[182713]: 2026-01-22 01:04:08.460 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:08 compute-1 nova_compute[182713]: 2026-01-22 01:04:08.460 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:11 compute-1 podman[253585]: 2026-01-22 01:04:11.564989725 +0000 UTC m=+0.055563301 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:04:11 compute-1 podman[253586]: 2026-01-22 01:04:11.617618084 +0000 UTC m=+0.100100719 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:04:13 compute-1 nova_compute[182713]: 2026-01-22 01:04:13.461 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:13 compute-1 nova_compute[182713]: 2026-01-22 01:04:13.463 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:13 compute-1 nova_compute[182713]: 2026-01-22 01:04:13.463 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:04:13 compute-1 nova_compute[182713]: 2026-01-22 01:04:13.463 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:13 compute-1 nova_compute[182713]: 2026-01-22 01:04:13.510 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:13 compute-1 nova_compute[182713]: 2026-01-22 01:04:13.511 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:18 compute-1 nova_compute[182713]: 2026-01-22 01:04:18.512 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:21 compute-1 nova_compute[182713]: 2026-01-22 01:04:21.899 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:04:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-1 nova_compute[182713]: 2026-01-22 01:04:23.514 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:24 compute-1 podman[253625]: 2026-01-22 01:04:24.584813921 +0000 UTC m=+0.076704615 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 01:04:24 compute-1 podman[253626]: 2026-01-22 01:04:24.615697757 +0000 UTC m=+0.091199014 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, vcs-type=git)
Jan 22 01:04:28 compute-1 nova_compute[182713]: 2026-01-22 01:04:28.516 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:28 compute-1 nova_compute[182713]: 2026-01-22 01:04:28.517 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:28 compute-1 nova_compute[182713]: 2026-01-22 01:04:28.518 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:04:28 compute-1 nova_compute[182713]: 2026-01-22 01:04:28.518 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:28 compute-1 nova_compute[182713]: 2026-01-22 01:04:28.519 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:28 compute-1 nova_compute[182713]: 2026-01-22 01:04:28.520 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:29 compute-1 nova_compute[182713]: 2026-01-22 01:04:29.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:31 compute-1 nova_compute[182713]: 2026-01-22 01:04:31.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:32 compute-1 nova_compute[182713]: 2026-01-22 01:04:32.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:32 compute-1 nova_compute[182713]: 2026-01-22 01:04:32.892 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:04:32 compute-1 nova_compute[182713]: 2026-01-22 01:04:32.892 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:04:32 compute-1 nova_compute[182713]: 2026-01-22 01:04:32.892 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:04:32 compute-1 nova_compute[182713]: 2026-01-22 01:04:32.893 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:04:33 compute-1 nova_compute[182713]: 2026-01-22 01:04:33.076 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:04:33 compute-1 nova_compute[182713]: 2026-01-22 01:04:33.078 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5688MB free_disk=73.17708587646484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:04:33 compute-1 nova_compute[182713]: 2026-01-22 01:04:33.078 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:04:33 compute-1 nova_compute[182713]: 2026-01-22 01:04:33.078 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:04:33 compute-1 nova_compute[182713]: 2026-01-22 01:04:33.135 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:04:33 compute-1 nova_compute[182713]: 2026-01-22 01:04:33.135 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:04:33 compute-1 nova_compute[182713]: 2026-01-22 01:04:33.156 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:04:33 compute-1 nova_compute[182713]: 2026-01-22 01:04:33.168 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:04:33 compute-1 nova_compute[182713]: 2026-01-22 01:04:33.170 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:04:33 compute-1 nova_compute[182713]: 2026-01-22 01:04:33.170 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:04:33 compute-1 nova_compute[182713]: 2026-01-22 01:04:33.521 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:35 compute-1 podman[253666]: 2026-01-22 01:04:35.596240341 +0000 UTC m=+0.086053574 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 01:04:35 compute-1 podman[253665]: 2026-01-22 01:04:35.632570876 +0000 UTC m=+0.120845882 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 01:04:37 compute-1 nova_compute[182713]: 2026-01-22 01:04:37.170 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:37 compute-1 nova_compute[182713]: 2026-01-22 01:04:37.170 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:37 compute-1 nova_compute[182713]: 2026-01-22 01:04:37.171 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:04:37 compute-1 sshd-session[253715]: Connection closed by authenticating user root 92.118.39.95 port 32834 [preauth]
Jan 22 01:04:38 compute-1 nova_compute[182713]: 2026-01-22 01:04:38.523 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:38 compute-1 nova_compute[182713]: 2026-01-22 01:04:38.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:38 compute-1 nova_compute[182713]: 2026-01-22 01:04:38.874 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:40 compute-1 nova_compute[182713]: 2026-01-22 01:04:40.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:41 compute-1 nova_compute[182713]: 2026-01-22 01:04:41.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:41 compute-1 nova_compute[182713]: 2026-01-22 01:04:41.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:04:41 compute-1 nova_compute[182713]: 2026-01-22 01:04:41.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:04:41 compute-1 nova_compute[182713]: 2026-01-22 01:04:41.877 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:04:42 compute-1 podman[253718]: 2026-01-22 01:04:42.566468116 +0000 UTC m=+0.052921059 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 01:04:42 compute-1 podman[253717]: 2026-01-22 01:04:42.580719518 +0000 UTC m=+0.067374997 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 01:04:43 compute-1 nova_compute[182713]: 2026-01-22 01:04:43.525 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:48 compute-1 nova_compute[182713]: 2026-01-22 01:04:48.526 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:53 compute-1 nova_compute[182713]: 2026-01-22 01:04:53.528 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:55 compute-1 podman[253760]: 2026-01-22 01:04:55.592477963 +0000 UTC m=+0.073100882 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 01:04:55 compute-1 podman[253759]: 2026-01-22 01:04:55.61366592 +0000 UTC m=+0.092288548 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 01:04:58 compute-1 nova_compute[182713]: 2026-01-22 01:04:58.530 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:05:03.102 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:05:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:05:03.102 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:05:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:05:03.102 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:05:03 compute-1 nova_compute[182713]: 2026-01-22 01:05:03.532 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:03 compute-1 nova_compute[182713]: 2026-01-22 01:05:03.534 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:03 compute-1 nova_compute[182713]: 2026-01-22 01:05:03.535 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:03 compute-1 nova_compute[182713]: 2026-01-22 01:05:03.535 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:03 compute-1 nova_compute[182713]: 2026-01-22 01:05:03.547 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:03 compute-1 nova_compute[182713]: 2026-01-22 01:05:03.549 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:06 compute-1 podman[253798]: 2026-01-22 01:05:06.611565651 +0000 UTC m=+0.083100163 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:05:06 compute-1 podman[253797]: 2026-01-22 01:05:06.640010972 +0000 UTC m=+0.119367836 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 01:05:08 compute-1 nova_compute[182713]: 2026-01-22 01:05:08.551 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:08 compute-1 nova_compute[182713]: 2026-01-22 01:05:08.552 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:08 compute-1 nova_compute[182713]: 2026-01-22 01:05:08.553 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:08 compute-1 nova_compute[182713]: 2026-01-22 01:05:08.553 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:08 compute-1 nova_compute[182713]: 2026-01-22 01:05:08.600 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:08 compute-1 nova_compute[182713]: 2026-01-22 01:05:08.601 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:13 compute-1 podman[253850]: 2026-01-22 01:05:13.563608245 +0000 UTC m=+0.057392557 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 22 01:05:13 compute-1 podman[253851]: 2026-01-22 01:05:13.572508411 +0000 UTC m=+0.063977911 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 01:05:13 compute-1 nova_compute[182713]: 2026-01-22 01:05:13.602 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:18 compute-1 nova_compute[182713]: 2026-01-22 01:05:18.604 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:18 compute-1 nova_compute[182713]: 2026-01-22 01:05:18.606 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:18 compute-1 nova_compute[182713]: 2026-01-22 01:05:18.607 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:18 compute-1 nova_compute[182713]: 2026-01-22 01:05:18.607 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:18 compute-1 nova_compute[182713]: 2026-01-22 01:05:18.637 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:18 compute-1 nova_compute[182713]: 2026-01-22 01:05:18.638 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:21 compute-1 nova_compute[182713]: 2026-01-22 01:05:21.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:23 compute-1 nova_compute[182713]: 2026-01-22 01:05:23.639 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:23 compute-1 nova_compute[182713]: 2026-01-22 01:05:23.642 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:26 compute-1 podman[253893]: 2026-01-22 01:05:26.577748666 +0000 UTC m=+0.060645648 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 01:05:26 compute-1 podman[253894]: 2026-01-22 01:05:26.59596482 +0000 UTC m=+0.070073210 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, distribution-scope=public)
Jan 22 01:05:28 compute-1 nova_compute[182713]: 2026-01-22 01:05:28.643 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:28 compute-1 nova_compute[182713]: 2026-01-22 01:05:28.646 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:28 compute-1 nova_compute[182713]: 2026-01-22 01:05:28.646 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:28 compute-1 nova_compute[182713]: 2026-01-22 01:05:28.647 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:28 compute-1 nova_compute[182713]: 2026-01-22 01:05:28.672 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:28 compute-1 nova_compute[182713]: 2026-01-22 01:05:28.673 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:30 compute-1 nova_compute[182713]: 2026-01-22 01:05:30.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:32 compute-1 nova_compute[182713]: 2026-01-22 01:05:32.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:33 compute-1 nova_compute[182713]: 2026-01-22 01:05:33.674 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:34 compute-1 nova_compute[182713]: 2026-01-22 01:05:34.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:34 compute-1 nova_compute[182713]: 2026-01-22 01:05:34.881 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:05:34 compute-1 nova_compute[182713]: 2026-01-22 01:05:34.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:05:34 compute-1 nova_compute[182713]: 2026-01-22 01:05:34.882 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:05:34 compute-1 nova_compute[182713]: 2026-01-22 01:05:34.883 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.082 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.084 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5700MB free_disk=73.1866683959961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.084 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.084 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.342 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.343 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.484 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing inventories for resource provider 39680711-70c9-4df1-ae59-25e54fac688d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.620 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating ProviderTree inventory for provider 39680711-70c9-4df1-ae59-25e54fac688d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.621 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Updating inventory in ProviderTree for provider 39680711-70c9-4df1-ae59-25e54fac688d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.648 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing aggregate associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.687 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Refreshing trait associations for resource provider 39680711-70c9-4df1-ae59-25e54fac688d, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.728 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.753 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.756 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:05:35 compute-1 nova_compute[182713]: 2026-01-22 01:05:35.756 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:05:37 compute-1 podman[253937]: 2026-01-22 01:05:37.600144577 +0000 UTC m=+0.075009272 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 01:05:37 compute-1 podman[253936]: 2026-01-22 01:05:37.668663859 +0000 UTC m=+0.154129362 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 01:05:38 compute-1 nova_compute[182713]: 2026-01-22 01:05:38.677 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:38 compute-1 nova_compute[182713]: 2026-01-22 01:05:38.679 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:38 compute-1 nova_compute[182713]: 2026-01-22 01:05:38.680 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:38 compute-1 nova_compute[182713]: 2026-01-22 01:05:38.680 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:38 compute-1 nova_compute[182713]: 2026-01-22 01:05:38.719 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:38 compute-1 nova_compute[182713]: 2026-01-22 01:05:38.720 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:38 compute-1 nova_compute[182713]: 2026-01-22 01:05:38.757 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:38 compute-1 nova_compute[182713]: 2026-01-22 01:05:38.758 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:38 compute-1 nova_compute[182713]: 2026-01-22 01:05:38.759 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:05:40 compute-1 nova_compute[182713]: 2026-01-22 01:05:40.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:42 compute-1 nova_compute[182713]: 2026-01-22 01:05:42.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:42 compute-1 nova_compute[182713]: 2026-01-22 01:05:42.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:05:42 compute-1 nova_compute[182713]: 2026-01-22 01:05:42.856 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:05:42 compute-1 nova_compute[182713]: 2026-01-22 01:05:42.872 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:05:42 compute-1 nova_compute[182713]: 2026-01-22 01:05:42.873 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:43 compute-1 nova_compute[182713]: 2026-01-22 01:05:43.720 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:43 compute-1 nova_compute[182713]: 2026-01-22 01:05:43.721 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:44 compute-1 podman[253986]: 2026-01-22 01:05:44.597636967 +0000 UTC m=+0.092240666 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 01:05:44 compute-1 podman[253987]: 2026-01-22 01:05:44.60677268 +0000 UTC m=+0.083717602 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:05:48 compute-1 nova_compute[182713]: 2026-01-22 01:05:48.723 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:48 compute-1 nova_compute[182713]: 2026-01-22 01:05:48.725 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:53 compute-1 nova_compute[182713]: 2026-01-22 01:05:53.725 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:57 compute-1 podman[254029]: 2026-01-22 01:05:57.586035322 +0000 UTC m=+0.077645915 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute)
Jan 22 01:05:57 compute-1 podman[254030]: 2026-01-22 01:05:57.605187944 +0000 UTC m=+0.081917477 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Jan 22 01:05:58 compute-1 nova_compute[182713]: 2026-01-22 01:05:58.728 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:58 compute-1 nova_compute[182713]: 2026-01-22 01:05:58.730 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:58 compute-1 nova_compute[182713]: 2026-01-22 01:05:58.730 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:58 compute-1 nova_compute[182713]: 2026-01-22 01:05:58.731 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:58 compute-1 nova_compute[182713]: 2026-01-22 01:05:58.771 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:58 compute-1 nova_compute[182713]: 2026-01-22 01:05:58.772 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:06:03.103 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:06:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:06:03.103 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:06:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:06:03.103 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:06:03 compute-1 nova_compute[182713]: 2026-01-22 01:06:03.773 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:03 compute-1 nova_compute[182713]: 2026-01-22 01:06:03.775 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:08 compute-1 podman[254071]: 2026-01-22 01:06:08.604888673 +0000 UTC m=+0.087367086 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 01:06:08 compute-1 podman[254070]: 2026-01-22 01:06:08.619905927 +0000 UTC m=+0.110143520 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 01:06:08 compute-1 nova_compute[182713]: 2026-01-22 01:06:08.776 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:08 compute-1 nova_compute[182713]: 2026-01-22 01:06:08.777 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:13 compute-1 nova_compute[182713]: 2026-01-22 01:06:13.779 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:13 compute-1 nova_compute[182713]: 2026-01-22 01:06:13.780 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:15 compute-1 podman[254121]: 2026-01-22 01:06:15.59195939 +0000 UTC m=+0.073939670 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 01:06:15 compute-1 podman[254120]: 2026-01-22 01:06:15.596036645 +0000 UTC m=+0.081195024 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 01:06:18 compute-1 nova_compute[182713]: 2026-01-22 01:06:18.782 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:22 compute-1 nova_compute[182713]: 2026-01-22 01:06:22.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:06:22.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-1 nova_compute[182713]: 2026-01-22 01:06:23.784 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:23 compute-1 nova_compute[182713]: 2026-01-22 01:06:23.786 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:23 compute-1 nova_compute[182713]: 2026-01-22 01:06:23.786 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:06:23 compute-1 nova_compute[182713]: 2026-01-22 01:06:23.787 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:23 compute-1 nova_compute[182713]: 2026-01-22 01:06:23.787 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:23 compute-1 nova_compute[182713]: 2026-01-22 01:06:23.788 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:28 compute-1 podman[254164]: 2026-01-22 01:06:28.584508551 +0000 UTC m=+0.072082382 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 01:06:28 compute-1 podman[254165]: 2026-01-22 01:06:28.59288334 +0000 UTC m=+0.077095988 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 01:06:28 compute-1 nova_compute[182713]: 2026-01-22 01:06:28.788 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:32 compute-1 nova_compute[182713]: 2026-01-22 01:06:32.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:32 compute-1 nova_compute[182713]: 2026-01-22 01:06:32.854 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:33 compute-1 nova_compute[182713]: 2026-01-22 01:06:33.789 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:33 compute-1 nova_compute[182713]: 2026-01-22 01:06:33.790 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:36 compute-1 nova_compute[182713]: 2026-01-22 01:06:36.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:36 compute-1 nova_compute[182713]: 2026-01-22 01:06:36.892 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:06:36 compute-1 nova_compute[182713]: 2026-01-22 01:06:36.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:06:36 compute-1 nova_compute[182713]: 2026-01-22 01:06:36.893 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:06:36 compute-1 nova_compute[182713]: 2026-01-22 01:06:36.893 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:06:37 compute-1 nova_compute[182713]: 2026-01-22 01:06:37.139 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:06:37 compute-1 nova_compute[182713]: 2026-01-22 01:06:37.141 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5695MB free_disk=73.1866683959961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:06:37 compute-1 nova_compute[182713]: 2026-01-22 01:06:37.141 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:06:37 compute-1 nova_compute[182713]: 2026-01-22 01:06:37.142 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:06:37 compute-1 nova_compute[182713]: 2026-01-22 01:06:37.210 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:06:37 compute-1 nova_compute[182713]: 2026-01-22 01:06:37.211 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:06:37 compute-1 nova_compute[182713]: 2026-01-22 01:06:37.234 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:06:37 compute-1 nova_compute[182713]: 2026-01-22 01:06:37.248 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:06:37 compute-1 nova_compute[182713]: 2026-01-22 01:06:37.250 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:06:37 compute-1 nova_compute[182713]: 2026-01-22 01:06:37.250 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:06:38 compute-1 nova_compute[182713]: 2026-01-22 01:06:38.792 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:38 compute-1 nova_compute[182713]: 2026-01-22 01:06:38.794 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:38 compute-1 nova_compute[182713]: 2026-01-22 01:06:38.794 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:06:38 compute-1 nova_compute[182713]: 2026-01-22 01:06:38.794 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:38 compute-1 nova_compute[182713]: 2026-01-22 01:06:38.835 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:38 compute-1 nova_compute[182713]: 2026-01-22 01:06:38.836 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:39 compute-1 nova_compute[182713]: 2026-01-22 01:06:39.249 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:39 compute-1 nova_compute[182713]: 2026-01-22 01:06:39.250 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:39 compute-1 nova_compute[182713]: 2026-01-22 01:06:39.250 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:06:39 compute-1 podman[254202]: 2026-01-22 01:06:39.597370165 +0000 UTC m=+0.080830433 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 01:06:39 compute-1 podman[254201]: 2026-01-22 01:06:39.647035622 +0000 UTC m=+0.133859754 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 01:06:41 compute-1 nova_compute[182713]: 2026-01-22 01:06:41.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:42 compute-1 nova_compute[182713]: 2026-01-22 01:06:42.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:42 compute-1 nova_compute[182713]: 2026-01-22 01:06:42.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:43 compute-1 nova_compute[182713]: 2026-01-22 01:06:43.838 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:43 compute-1 nova_compute[182713]: 2026-01-22 01:06:43.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:43 compute-1 nova_compute[182713]: 2026-01-22 01:06:43.859 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:06:43 compute-1 nova_compute[182713]: 2026-01-22 01:06:43.860 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:06:43 compute-1 nova_compute[182713]: 2026-01-22 01:06:43.928 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:06:46 compute-1 podman[254249]: 2026-01-22 01:06:46.596781344 +0000 UTC m=+0.073942350 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 01:06:46 compute-1 podman[254248]: 2026-01-22 01:06:46.602757939 +0000 UTC m=+0.081757351 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 01:06:47 compute-1 sshd-session[254289]: Connection closed by authenticating user root 92.118.39.95 port 60044 [preauth]
Jan 22 01:06:48 compute-1 nova_compute[182713]: 2026-01-22 01:06:48.841 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:53 compute-1 nova_compute[182713]: 2026-01-22 01:06:53.844 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:53 compute-1 nova_compute[182713]: 2026-01-22 01:06:53.846 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:58 compute-1 nova_compute[182713]: 2026-01-22 01:06:58.847 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:59 compute-1 podman[254292]: 2026-01-22 01:06:59.586781547 +0000 UTC m=+0.071732331 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=openstack_network_exporter, distribution-scope=public, release=1755695350, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 01:06:59 compute-1 podman[254291]: 2026-01-22 01:06:59.59300557 +0000 UTC m=+0.078189731 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:07:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:07:03.105 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:07:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:07:03.106 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:07:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:07:03.106 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:07:03 compute-1 nova_compute[182713]: 2026-01-22 01:07:03.849 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:08 compute-1 nova_compute[182713]: 2026-01-22 01:07:08.851 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:08 compute-1 nova_compute[182713]: 2026-01-22 01:07:08.853 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:08 compute-1 nova_compute[182713]: 2026-01-22 01:07:08.853 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:07:08 compute-1 nova_compute[182713]: 2026-01-22 01:07:08.853 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:08 compute-1 nova_compute[182713]: 2026-01-22 01:07:08.854 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:08 compute-1 nova_compute[182713]: 2026-01-22 01:07:08.855 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:10 compute-1 podman[254333]: 2026-01-22 01:07:10.585821343 +0000 UTC m=+0.065729604 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:07:10 compute-1 podman[254332]: 2026-01-22 01:07:10.664654234 +0000 UTC m=+0.141439819 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 01:07:13 compute-1 nova_compute[182713]: 2026-01-22 01:07:13.856 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:17 compute-1 podman[254383]: 2026-01-22 01:07:17.594833681 +0000 UTC m=+0.075392486 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 01:07:17 compute-1 podman[254384]: 2026-01-22 01:07:17.615349865 +0000 UTC m=+0.083813215 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 01:07:18 compute-1 nova_compute[182713]: 2026-01-22 01:07:18.858 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:22 compute-1 nova_compute[182713]: 2026-01-22 01:07:22.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:23 compute-1 nova_compute[182713]: 2026-01-22 01:07:23.860 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:23 compute-1 nova_compute[182713]: 2026-01-22 01:07:23.863 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:23 compute-1 nova_compute[182713]: 2026-01-22 01:07:23.863 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:07:23 compute-1 nova_compute[182713]: 2026-01-22 01:07:23.863 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:23 compute-1 nova_compute[182713]: 2026-01-22 01:07:23.863 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:23 compute-1 nova_compute[182713]: 2026-01-22 01:07:23.864 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:28 compute-1 nova_compute[182713]: 2026-01-22 01:07:28.866 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:28 compute-1 nova_compute[182713]: 2026-01-22 01:07:28.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:28 compute-1 nova_compute[182713]: 2026-01-22 01:07:28.868 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:07:28 compute-1 nova_compute[182713]: 2026-01-22 01:07:28.869 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:28 compute-1 nova_compute[182713]: 2026-01-22 01:07:28.897 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:28 compute-1 nova_compute[182713]: 2026-01-22 01:07:28.898 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:30 compute-1 podman[254425]: 2026-01-22 01:07:30.588577199 +0000 UTC m=+0.078267323 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:07:30 compute-1 podman[254426]: 2026-01-22 01:07:30.603932865 +0000 UTC m=+0.080926086 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Jan 22 01:07:32 compute-1 nova_compute[182713]: 2026-01-22 01:07:32.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:33 compute-1 nova_compute[182713]: 2026-01-22 01:07:33.851 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:33 compute-1 nova_compute[182713]: 2026-01-22 01:07:33.899 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:33 compute-1 nova_compute[182713]: 2026-01-22 01:07:33.900 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:36 compute-1 nova_compute[182713]: 2026-01-22 01:07:36.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:36 compute-1 nova_compute[182713]: 2026-01-22 01:07:36.885 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:07:36 compute-1 nova_compute[182713]: 2026-01-22 01:07:36.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:07:36 compute-1 nova_compute[182713]: 2026-01-22 01:07:36.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:07:36 compute-1 nova_compute[182713]: 2026-01-22 01:07:36.886 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:07:37 compute-1 nova_compute[182713]: 2026-01-22 01:07:37.061 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:07:37 compute-1 nova_compute[182713]: 2026-01-22 01:07:37.062 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5709MB free_disk=73.1865463256836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:07:37 compute-1 nova_compute[182713]: 2026-01-22 01:07:37.062 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:07:37 compute-1 nova_compute[182713]: 2026-01-22 01:07:37.062 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:07:37 compute-1 nova_compute[182713]: 2026-01-22 01:07:37.146 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:07:37 compute-1 nova_compute[182713]: 2026-01-22 01:07:37.146 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:07:37 compute-1 nova_compute[182713]: 2026-01-22 01:07:37.176 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:07:37 compute-1 nova_compute[182713]: 2026-01-22 01:07:37.198 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:07:37 compute-1 nova_compute[182713]: 2026-01-22 01:07:37.200 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:07:37 compute-1 nova_compute[182713]: 2026-01-22 01:07:37.200 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:07:38 compute-1 nova_compute[182713]: 2026-01-22 01:07:38.903 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:38 compute-1 nova_compute[182713]: 2026-01-22 01:07:38.905 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:40 compute-1 nova_compute[182713]: 2026-01-22 01:07:40.201 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:40 compute-1 nova_compute[182713]: 2026-01-22 01:07:40.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:40 compute-1 nova_compute[182713]: 2026-01-22 01:07:40.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:07:41 compute-1 podman[254465]: 2026-01-22 01:07:41.57513935 +0000 UTC m=+0.062445624 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 01:07:41 compute-1 podman[254464]: 2026-01-22 01:07:41.648720738 +0000 UTC m=+0.130629065 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 01:07:43 compute-1 nova_compute[182713]: 2026-01-22 01:07:43.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:43 compute-1 nova_compute[182713]: 2026-01-22 01:07:43.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:43 compute-1 nova_compute[182713]: 2026-01-22 01:07:43.905 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:44 compute-1 nova_compute[182713]: 2026-01-22 01:07:44.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:44 compute-1 nova_compute[182713]: 2026-01-22 01:07:44.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:07:44 compute-1 nova_compute[182713]: 2026-01-22 01:07:44.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:07:44 compute-1 nova_compute[182713]: 2026-01-22 01:07:44.883 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:07:48 compute-1 podman[254516]: 2026-01-22 01:07:48.580147443 +0000 UTC m=+0.067943454 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:07:48 compute-1 podman[254515]: 2026-01-22 01:07:48.586827749 +0000 UTC m=+0.074752834 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 01:07:48 compute-1 nova_compute[182713]: 2026-01-22 01:07:48.908 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:48 compute-1 nova_compute[182713]: 2026-01-22 01:07:48.912 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:53 compute-1 nova_compute[182713]: 2026-01-22 01:07:53.910 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:53 compute-1 nova_compute[182713]: 2026-01-22 01:07:53.913 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:58 compute-1 nova_compute[182713]: 2026-01-22 01:07:58.914 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:58 compute-1 nova_compute[182713]: 2026-01-22 01:07:58.918 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:01 compute-1 podman[254558]: 2026-01-22 01:08:01.575444071 +0000 UTC m=+0.070199774 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 01:08:01 compute-1 podman[254559]: 2026-01-22 01:08:01.578388692 +0000 UTC m=+0.066643023 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git)
Jan 22 01:08:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:08:03.107 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:08:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:08:03.108 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:08:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:08:03.108 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:08:03 compute-1 nova_compute[182713]: 2026-01-22 01:08:03.917 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:03 compute-1 nova_compute[182713]: 2026-01-22 01:08:03.919 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:08 compute-1 nova_compute[182713]: 2026-01-22 01:08:08.918 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:08 compute-1 nova_compute[182713]: 2026-01-22 01:08:08.922 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:12 compute-1 podman[254597]: 2026-01-22 01:08:12.608224942 +0000 UTC m=+0.083223107 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 01:08:12 compute-1 podman[254596]: 2026-01-22 01:08:12.657217128 +0000 UTC m=+0.139313923 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 01:08:13 compute-1 nova_compute[182713]: 2026-01-22 01:08:13.921 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:13 compute-1 nova_compute[182713]: 2026-01-22 01:08:13.925 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:18 compute-1 nova_compute[182713]: 2026-01-22 01:08:18.924 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:18 compute-1 nova_compute[182713]: 2026-01-22 01:08:18.927 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:19 compute-1 podman[254646]: 2026-01-22 01:08:19.590296855 +0000 UTC m=+0.068077579 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 01:08:19 compute-1 podman[254647]: 2026-01-22 01:08:19.608415255 +0000 UTC m=+0.082229816 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:22 compute-1 ceilometer_agent_compute[192394]: 2026-01-22 01:08:22.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-1 nova_compute[182713]: 2026-01-22 01:08:23.927 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:23 compute-1 nova_compute[182713]: 2026-01-22 01:08:23.928 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:24 compute-1 nova_compute[182713]: 2026-01-22 01:08:24.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:28 compute-1 nova_compute[182713]: 2026-01-22 01:08:28.929 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:28 compute-1 nova_compute[182713]: 2026-01-22 01:08:28.933 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:32 compute-1 podman[254688]: 2026-01-22 01:08:32.572300591 +0000 UTC m=+0.064628861 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 01:08:32 compute-1 podman[254689]: 2026-01-22 01:08:32.593227828 +0000 UTC m=+0.079880673 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Jan 22 01:08:33 compute-1 nova_compute[182713]: 2026-01-22 01:08:33.933 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:33 compute-1 nova_compute[182713]: 2026-01-22 01:08:33.934 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:34 compute-1 nova_compute[182713]: 2026-01-22 01:08:34.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:35 compute-1 nova_compute[182713]: 2026-01-22 01:08:35.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:36 compute-1 nova_compute[182713]: 2026-01-22 01:08:36.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:36 compute-1 nova_compute[182713]: 2026-01-22 01:08:36.886 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:08:36 compute-1 nova_compute[182713]: 2026-01-22 01:08:36.887 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:08:36 compute-1 nova_compute[182713]: 2026-01-22 01:08:36.887 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:08:36 compute-1 nova_compute[182713]: 2026-01-22 01:08:36.888 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:08:37 compute-1 nova_compute[182713]: 2026-01-22 01:08:37.121 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:08:37 compute-1 nova_compute[182713]: 2026-01-22 01:08:37.122 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5695MB free_disk=73.1865463256836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:08:37 compute-1 nova_compute[182713]: 2026-01-22 01:08:37.122 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:08:37 compute-1 nova_compute[182713]: 2026-01-22 01:08:37.123 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:08:37 compute-1 nova_compute[182713]: 2026-01-22 01:08:37.194 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:08:37 compute-1 nova_compute[182713]: 2026-01-22 01:08:37.195 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:08:37 compute-1 nova_compute[182713]: 2026-01-22 01:08:37.223 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:08:37 compute-1 nova_compute[182713]: 2026-01-22 01:08:37.252 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:08:37 compute-1 nova_compute[182713]: 2026-01-22 01:08:37.255 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:08:37 compute-1 nova_compute[182713]: 2026-01-22 01:08:37.255 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:08:38 compute-1 nova_compute[182713]: 2026-01-22 01:08:38.935 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:38 compute-1 nova_compute[182713]: 2026-01-22 01:08:38.938 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:40 compute-1 nova_compute[182713]: 2026-01-22 01:08:40.255 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:40 compute-1 nova_compute[182713]: 2026-01-22 01:08:40.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:40 compute-1 nova_compute[182713]: 2026-01-22 01:08:40.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:08:40 compute-1 nova_compute[182713]: 2026-01-22 01:08:40.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:40 compute-1 nova_compute[182713]: 2026-01-22 01:08:40.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 01:08:41 compute-1 nova_compute[182713]: 2026-01-22 01:08:41.875 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:43 compute-1 podman[254732]: 2026-01-22 01:08:43.597518119 +0000 UTC m=+0.079508792 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:08:43 compute-1 podman[254731]: 2026-01-22 01:08:43.635408512 +0000 UTC m=+0.118730837 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:08:43 compute-1 nova_compute[182713]: 2026-01-22 01:08:43.938 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:43 compute-1 nova_compute[182713]: 2026-01-22 01:08:43.940 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:44 compute-1 nova_compute[182713]: 2026-01-22 01:08:44.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:44 compute-1 nova_compute[182713]: 2026-01-22 01:08:44.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:44 compute-1 nova_compute[182713]: 2026-01-22 01:08:44.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:45 compute-1 nova_compute[182713]: 2026-01-22 01:08:45.915 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:45 compute-1 nova_compute[182713]: 2026-01-22 01:08:45.916 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:08:45 compute-1 nova_compute[182713]: 2026-01-22 01:08:45.916 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:08:45 compute-1 nova_compute[182713]: 2026-01-22 01:08:45.937 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:08:45 compute-1 nova_compute[182713]: 2026-01-22 01:08:45.938 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:45 compute-1 nova_compute[182713]: 2026-01-22 01:08:45.938 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 01:08:45 compute-1 nova_compute[182713]: 2026-01-22 01:08:45.961 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 01:08:48 compute-1 nova_compute[182713]: 2026-01-22 01:08:48.943 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:48 compute-1 nova_compute[182713]: 2026-01-22 01:08:48.947 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:50 compute-1 podman[254781]: 2026-01-22 01:08:50.595933789 +0000 UTC m=+0.073493577 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 01:08:50 compute-1 podman[254782]: 2026-01-22 01:08:50.635497773 +0000 UTC m=+0.104753174 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 01:08:53 compute-1 nova_compute[182713]: 2026-01-22 01:08:53.945 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:58 compute-1 nova_compute[182713]: 2026-01-22 01:08:58.948 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:09:03.109 104184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:09:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:09:03.109 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:09:03 compute-1 ovn_metadata_agent[104179]: 2026-01-22 01:09:03.110 104184 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:09:03 compute-1 podman[254825]: 2026-01-22 01:09:03.585744697 +0000 UTC m=+0.070825964 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 01:09:03 compute-1 podman[254826]: 2026-01-22 01:09:03.597085278 +0000 UTC m=+0.063435795 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Jan 22 01:09:03 compute-1 sshd-session[254823]: Connection closed by authenticating user root 92.118.39.95 port 59006 [preauth]
Jan 22 01:09:03 compute-1 nova_compute[182713]: 2026-01-22 01:09:03.950 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:03 compute-1 nova_compute[182713]: 2026-01-22 01:09:03.956 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:03 compute-1 nova_compute[182713]: 2026-01-22 01:09:03.957 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:09:03 compute-1 nova_compute[182713]: 2026-01-22 01:09:03.957 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:03 compute-1 nova_compute[182713]: 2026-01-22 01:09:03.958 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:03 compute-1 nova_compute[182713]: 2026-01-22 01:09:03.959 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:08 compute-1 nova_compute[182713]: 2026-01-22 01:09:08.960 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:11 compute-1 nova_compute[182713]: 2026-01-22 01:09:11.291 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:13 compute-1 nova_compute[182713]: 2026-01-22 01:09:13.963 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:13 compute-1 nova_compute[182713]: 2026-01-22 01:09:13.964 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:13 compute-1 nova_compute[182713]: 2026-01-22 01:09:13.965 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:09:13 compute-1 nova_compute[182713]: 2026-01-22 01:09:13.966 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:13 compute-1 nova_compute[182713]: 2026-01-22 01:09:13.995 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:13 compute-1 nova_compute[182713]: 2026-01-22 01:09:13.996 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:14 compute-1 podman[254865]: 2026-01-22 01:09:14.606446364 +0000 UTC m=+0.084810115 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 01:09:14 compute-1 podman[254864]: 2026-01-22 01:09:14.632210452 +0000 UTC m=+0.129997054 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 01:09:18 compute-1 nova_compute[182713]: 2026-01-22 01:09:18.996 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:19 compute-1 nova_compute[182713]: 2026-01-22 01:09:19.238 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:21 compute-1 podman[254916]: 2026-01-22 01:09:21.58249943 +0000 UTC m=+0.066918202 container health_status af9a44923f14d865893c91712a29a99d59e05d83f1a1a0300c4f4d5af638f284 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 01:09:21 compute-1 podman[254917]: 2026-01-22 01:09:21.605041878 +0000 UTC m=+0.079385087 container health_status e305ebfcd3dbca18f8dd680606b043b108cf9efb0d0a771039cb04fe0df8441d (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 01:09:23 compute-1 nova_compute[182713]: 2026-01-22 01:09:23.998 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:24 compute-1 nova_compute[182713]: 2026-01-22 01:09:24.876 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:29 compute-1 nova_compute[182713]: 2026-01-22 01:09:29.033 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:29 compute-1 nova_compute[182713]: 2026-01-22 01:09:29.034 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:29 compute-1 nova_compute[182713]: 2026-01-22 01:09:29.034 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:09:29 compute-1 nova_compute[182713]: 2026-01-22 01:09:29.034 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:29 compute-1 nova_compute[182713]: 2026-01-22 01:09:29.035 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:31 compute-1 sshd-session[254957]: Connection closed by 27.47.24.28 port 32983
Jan 22 01:09:32 compute-1 sshd-session[254960]: Accepted publickey for zuul from 192.168.122.10 port 40832 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 22 01:09:32 compute-1 systemd-logind[796]: New session 62 of user zuul.
Jan 22 01:09:32 compute-1 systemd[1]: Started Session 62 of User zuul.
Jan 22 01:09:32 compute-1 sshd-session[254960]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 01:09:32 compute-1 sudo[254964]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 22 01:09:32 compute-1 sudo[254964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 01:09:33 compute-1 podman[254998]: 2026-01-22 01:09:33.745359575 +0000 UTC m=+0.092188305 container health_status cba261ffc859a105e0afa4436e64e27f9ba7e7387a1ea02146e0072c51844d44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 01:09:33 compute-1 podman[255002]: 2026-01-22 01:09:33.776203689 +0000 UTC m=+0.123493893 container health_status dce133a2ffa21f3006ac27dc80027060a9029e6d972f4c62fecd35dd3bbb00f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7)
Jan 22 01:09:34 compute-1 nova_compute[182713]: 2026-01-22 01:09:34.036 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:34 compute-1 nova_compute[182713]: 2026-01-22 01:09:34.039 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:34 compute-1 sshd-session[254958]: ssh_dispatch_run_fatal: Connection from 110.177.180.55 port 15739: Connection timed out [preauth]
Jan 22 01:09:36 compute-1 nova_compute[182713]: 2026-01-22 01:09:36.852 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:36 compute-1 nova_compute[182713]: 2026-01-22 01:09:36.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:37 compute-1 ovs-vsctl[255170]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 22 01:09:38 compute-1 virtqemud[182235]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 22 01:09:38 compute-1 virtqemud[182235]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 22 01:09:38 compute-1 virtqemud[182235]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 01:09:38 compute-1 nova_compute[182713]: 2026-01-22 01:09:38.855 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:38 compute-1 nova_compute[182713]: 2026-01-22 01:09:38.901 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:09:38 compute-1 nova_compute[182713]: 2026-01-22 01:09:38.902 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:09:38 compute-1 nova_compute[182713]: 2026-01-22 01:09:38.902 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:09:38 compute-1 nova_compute[182713]: 2026-01-22 01:09:38.903 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.038 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.040 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.040 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.040 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.041 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.041 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.101 182717 WARNING nova.virt.libvirt.driver [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.102 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5620MB free_disk=73.1634750366211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.102 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.103 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.262 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.263 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.326 182717 DEBUG nova.compute.provider_tree [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed in ProviderTree for provider: 39680711-70c9-4df1-ae59-25e54fac688d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.364 182717 DEBUG nova.scheduler.client.report [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Inventory has not changed for provider 39680711-70c9-4df1-ae59-25e54fac688d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.367 182717 DEBUG nova.compute.resource_tracker [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:09:39 compute-1 nova_compute[182713]: 2026-01-22 01:09:39.367 182717 DEBUG oslo_concurrency.lockutils [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:09:39 compute-1 crontab[255574]: (root) LIST (root)
Jan 22 01:09:40 compute-1 nova_compute[182713]: 2026-01-22 01:09:40.368 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:41 compute-1 systemd[1]: Starting Hostname Service...
Jan 22 01:09:41 compute-1 nova_compute[182713]: 2026-01-22 01:09:41.856 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:41 compute-1 nova_compute[182713]: 2026-01-22 01:09:41.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:09:41 compute-1 systemd[1]: Started Hostname Service.
Jan 22 01:09:44 compute-1 nova_compute[182713]: 2026-01-22 01:09:44.041 182717 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:44 compute-1 podman[255998]: 2026-01-22 01:09:44.800482359 +0000 UTC m=+0.107783247 container health_status 9069102163b9014ce8d990e36224edf2f6277e737eebbc85a8ea0ba5a3d6a468 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:09:44 compute-1 podman[255996]: 2026-01-22 01:09:44.837202705 +0000 UTC m=+0.151539221 container health_status 1b31374526468d6b98833c6ddc06c791524e3aaa8f4a92d02e3f11c449a17ca2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f-22972dd5e424afe4c71073a9c57408166d7e525ea8041e90b54471ea4a28330f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 22 01:09:44 compute-1 nova_compute[182713]: 2026-01-22 01:09:44.858 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:45 compute-1 nova_compute[182713]: 2026-01-22 01:09:45.857 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:45 compute-1 nova_compute[182713]: 2026-01-22 01:09:45.857 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:09:45 compute-1 nova_compute[182713]: 2026-01-22 01:09:45.858 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:09:45 compute-1 nova_compute[182713]: 2026-01-22 01:09:45.877 182717 DEBUG nova.compute.manager [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:09:45 compute-1 nova_compute[182713]: 2026-01-22 01:09:45.881 182717 DEBUG oslo_service.periodic_task [None req-a349ba84-f49d-4352-8094-21db0084b42c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
